English pages given preference over local language
We recently launched a redesign of our website, and for SEO purposes we decided to publish it in both English and Dutch. However, when I look at the rankings in Moz for many of our keywords, the English pages seem to be preferred over the Dutch ones. That was never the case with the old design. It mainly happens on pages that have an English keyword attached to them, but even then the Dutch page used to rank.
I'm trying to figure out why the English pages are now being preferred, and whether that could actually damage our rankings, since search engines would presumably prefer copy in the local language.
An example is this page: https://www.bluebillywig.com/nl/html5-video-player/ for the keywords "HTML5 player" and "HTML5 video player".
Possible Reasons for English Page Preference:
Technical SEO:
Hreflang tags: Double-check your hreflang implementation to make sure every language version correctly points to the Dutch page for Dutch users, and that the Dutch page points back (a quick audit sketch follows this section).
Content differences: Check whether the Dutch page's title tag, meta description, and headings are genuinely localized or still identical to the English page; even small differences here can affect which version ranks.
Mobile responsiveness: Ensure both versions are mobile-friendly and optimized for different screen sizes.
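To make the hreflang check concrete, here is a minimal sketch that fetches a page and lists the alternate annotations it declares in the HTML head. It assumes the annotations are implemented as link elements rather than via the sitemap or HTTP headers, and the English URL below is only a guess at the counterpart of the Dutch example page.

```python
# Minimal hreflang sanity check (a sketch, not a full audit).
# Assumes the annotations live in <link rel="alternate" hreflang="..."> tags.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.bluebillywig.com/nl/html5-video-player/",  # Dutch page from the question
    "https://www.bluebillywig.com/html5-video-player/",     # assumed English counterpart
]

for url in PAGES:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    print(f"\n{url} -> HTTP {response.status_code}")
    for link in soup.find_all("link"):
        rel = link.get("rel") or []
        if "alternate" in rel and link.get("hreflang"):
            # Each version should list itself and every other language version,
            # and the sets should match across all versions.
            print(f"  hreflang={link['hreflang']:<6}  href={link.get('href')}")
```

Every language version should declare the same complete set of annotations, including a self-reference, and each annotation must be reciprocated by the page it points to; if the return link is missing, Google can ignore the pair and simply serve whichever version it considers strongest.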
Content Quality:
Keyword targeting: Analyze keyword usage in both versions. Are the Dutch pages properly optimized for the Dutch keywords you want them to rank for? (A comparison sketch follows this section.)
Unique content: Mirroring content is acceptable, but adding unique value to the Dutch version can attract Dutch users and improve rankings.
User engagement: Check your analytics to see whether users engage more with the English page (e.g., higher time on page, lower bounce rate); that behavior can signal user preference to search engines.
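For the keyword-targeting point, a quick way to spot Dutch pages that still carry English titles or headings is to pull the main on-page elements from both language versions and check which target phrases they actually contain. The sketch below assumes the English URL and uses the two keywords mentioned in the question; both are illustrative placeholders.

```python
# Rough comparison of title / meta description / h1 across language versions.
# A sketch only -- the English URL and keyword list are assumptions.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

PAGES = {
    "nl": "https://www.bluebillywig.com/nl/html5-video-player/",
    "en": "https://www.bluebillywig.com/html5-video-player/",  # assumed English counterpart
}
TARGET_PHRASES = ["html5 video player", "html5 player"]  # keywords from the question

def main_elements(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "description"})
    h1 = soup.find("h1")
    return {
        "title": soup.title.get_text(strip=True) if soup.title else "",
        "description": meta.get("content", "") if meta else "",
        "h1": h1.get_text(strip=True) if h1 else "",
    }

for lang, url in PAGES.items():
    print(f"\n[{lang}] {url}")
    for name, text in main_elements(url).items():
        found = [p for p in TARGET_PHRASES if p in text.lower()]
        print(f"  {name:<11} {text!r}  (target phrases present: {found or 'none'})")
```

If the Dutch elements turn out to be identical to the English ones, or to miss the Dutch phrasing people actually search for, that alone can explain why the English URL keeps winning.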
Potential Impacts and Actions:
Ranking damage: Having English pages rank for Dutch keywords isn't necessarily damaging in itself, but it can divert traffic away from the intended audience. Ideally, the Dutch page should rank for Dutch keywords in the Netherlands.
Investigate: Use tools like Google Search Console and Moz to analyze specific keyword rankings, crawl errors, and engagement metrics for both versions (see the sketch after this list).
Optimize the Dutch pages: Make sure the technical SEO is in order, optimize the content for Dutch keywords, and consider adding unique value for Dutch users.
Monitor and refine: Track progress and adjust your approach based on ongoing analysis and results.
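For the investigation step, the Search Console API can show exactly which URL Google serves per query and per country, which is more precise than eyeballing rank-tracker data. The snippet below is a sketch under several assumptions: a service account has been created, its JSON key is saved locally, the account has been added as a user of the Search Console property, and the property URL, key file name, and date range are placeholders.

```python
# Sketch: which language version does Google serve for the target query, per country?
# Assumes a service-account JSON key and that the account has access to the property.
# Requires: pip install google-api-python-client google-auth
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "https://www.bluebillywig.com/"          # placeholder property URL
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)       # placeholder key file
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2024-01-01",                   # placeholder date range
    "endDate": "2024-03-31",
    "dimensions": ["query", "page", "country"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "query",
            "operator": "contains",
            "expression": "html5 video player",
        }]
    }],
    "rowLimit": 100,
}

rows = service.searchanalytics().query(siteUrl=SITE, body=body).execute().get("rows", [])
for row in rows:
    query, page, country = row["keys"]
    print(f"{country}  {query!r} -> {page}  "
          f"clicks={row['clicks']} impressions={row['impressions']} position={row['position']:.1f}")
```

Filtering the output for the country code nld shows at a glance whether Dutch searchers are being sent to the /nl/ URL or to the English one.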
Related Questions
Googlebot still crawling over HTTP/1.1 years after website moved to HTTP/2
The whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, Googlebot continues to access it only via the HTTP/1.1 protocol.
The robots file is correct (it simply allows everything and refers to the https://www. sitemap).
The sitemap references the https://www. pages, including the homepage.
The hosting provider has confirmed the server is correctly configured to support HTTP/2 and has provided evidence of HTTP/2 access working.
301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version.
We are not using a CDN or proxy.
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so.
We totally understand it can take time to update the index, but we are at a complete loss to understand why Googlebot continues to go only through HTTP/1.1 and not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs ... except the home page. It never makes it to page 1 (other than for the brand name), despite rating multiples higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | AKCAC
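A quick way to run the log-file check described in this question is to tally the protocol versions recorded for Googlebot requests to the homepage. The sketch below assumes a combined-format access log at a placeholder path and matches on the user-agent string only; confirming genuine Googlebot traffic would also require reverse-DNS verification, which is omitted here.

```python
# Sketch: count which HTTP protocol versions Googlebot uses for homepage requests,
# assuming a combined-format access log. The log path and homepage paths are
# placeholders; user-agent matching alone does not prove the hits are genuine Googlebot.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # placeholder path
# Combined log format request line: "GET /path HTTP/1.1"
REQUEST_RE = re.compile(r'"(?P<method>\S+) (?P<path>\S+) (?P<proto>HTTP/[\d.]+)"')

protocols = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = REQUEST_RE.search(line)
        if match and match.group("path") in ("/", "/index.html"):  # homepage only
            protocols[match.group("proto")] += 1

for proto, hits in protocols.most_common():
    print(f"{proto}: {hits} Googlebot homepage requests")
```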
SEO and Digital Marketing Training
Hi everyone, I have basic SEO and digital marketing knowledge and am looking for a course or training that will teach me SEO step by step, along with the tools I need to use, with hands-on practice. I have a website (https://gemslearninginstitute.com/) that I need to bring into Google's local packs and onto the first page of Google. I have attended a few courses, but none of them offered in-depth knowledge with hands-on training, so whatever I do is not producing results. Thanks
SEO Tactics | fslpso
Is it okay to update Page Titles and Meta descriptions over a period of time?
Some of the pages are not performing even though they have good content, videos, images, and FAQs. I am planning to update the page titles and to use long-tail keywords in them; for example, "Contact Us - Brand Name" would become "Contact Us - Brand Keyword". Is it okay to do that for all the pages?
Local SEO | Ravi_Rana
Local Site stuck on page 2 for years. Can’t penetrate page 1! Help!
Hey there Moz community! This is the first time I've ever asked a question here, so please forgive me if I slip up on any etiquette. I manage the website for a small Orlando, Florida family law and divorce law firm targeting search phrases around "Orlando divorce attorney" and its variants. The site is located at https://www.affordablefamilylawyer.com/
If you run a search for "Orlando divorce attorney" or close variants, our law firm's website has hovered at the top of the second page of Google for about the past two years but has never actually broken onto page 1. Metrics such as page authority, domain authority, trust, and other traditional signals suggest our site should be on page 1, but alas it's not happening. We have, however, been featured quite often in the three-pack of the local listings for the target search terms. Though that's valuable, our goal has always been to be featured in the top three organic search results.
To add to the confusion, we have a practice area page located at https://www.affordablefamilylawyer.com/orlando-divorce-lawyer/ dedicated to divorce, and we expected that page to rank for these divorce attorney search terms, but it will not; instead our homepage ranks for them every single time, regardless of how we swap around the optimization on the page. We have never had any manual actions. Any help you can offer is greatly appreciated, and I really appreciate your time!
Local SEO | Seanthewood123
Keyword-rich domain names -> point to sales funnel sites or to landing pages on the primary domain?
Hey everyone,
We have a tonne of old domains we have done nothing with. All of them are keyword-rich domains, things like "[City]SEOPro" or "[City]DigitalMarketing", where [city] is a city we are already targeting services in, so all of these domains would be targeted at local cities as keywords. We have been having an internal debate about whether we should host sales funnel pages on these domains, rich in keywords and content, or whether we should point these domains to landing pages on our existing domain that are basically the same as the sales funnel pages would be, but sit on our primary site (keyword-rich, with good and plentiful content).
Then, as a follow-up question: should these simply be set up as 301 redirects from these domains to our actual primary domain, so the browser sees the landing page's domain instead of the actual keyword-rich domain ([city]seopro.com)?
Thanks guys. I know for some the answer will be an obvious one; however, we have probably way overthought this and have arguments for almost every scenario. We think we have an answer but wanted to put this to the community first. I won't post what we are thinking yet, so that the answers can remain unbiased and we can have a conversation without it being swayed any one way. We understand that 301 redirects would be seen as doorway pages. We are also discussing this only in the context of organic search. If we ran the domains as their own sites, they would be about 3 pages of content only - pretty static, but good content. Think of a PAS-style sales funnel: Problem -> Acknowledgement -> Solution.
Local SEO | Transpera
Location-based landing pages best practices
Hello, I am looking for the community's thoughts on location-based landing pages; that is, writing out dozens, sometimes hundreds, of landing pages in the format domain.com/[keyword]-[location] and recycling the same content over and over to localize organic search engine results. I have done it with multiple websites and seen tremendous success; however, I am considering getting rid of these pages and having all of the spammy location-based pages 301 redirect to my main page, domain.com/[keyword]. I am considering this because the practice seems a bit black-hat / spammy, and those pages do not offer any unique or valuable content. While I have seen great results from it, I feel like Google will eventually penalize it, or may already be penalizing me without me knowing it. At the same time, I am hesitant because these pages are ranking, i.e. domain.com/[keyword-houston] is ranking but domain.com/[keyword] is not. Thoughts?
Local SEO | RyanMeighan
Local SEO Website Structure.
Hi everyone, this might be quite a long post, so please bear with me. I am currently rebuilding my website. My previous website was built by a web designer and was very basic: a 5-page HTML site consisting of home, services, gallery, testimonials, and contact pages. None of them were great - thin content, not optimised as well as they could be, no h1s, etc. To be fair, I knew nothing about websites and didn't bother much with the site. As a new business, I used it simply as a place for people to visit for more information after receiving a leaflet, and I never bothered much about driving traffic to it.
A few years down the line, I have realised I need the website to be working for me as opposed to just alongside me. I am building it myself in WordPress, as the web designer didn't want to work in WordPress. I have done my keyword research and I'm working on pages as we speak. On my previous site, around 80% of visitors landed on the homepage for my main keyword (driveway cleaning glasgow), as it was number 6 in the organic listings, with my services page appearing directly underneath at number 7 for the same keyword. I have started building a new page for that keyword which contains (driveway-cleaning-glasgow) in the URL, and I have 301'd my previous services page to this URL. Now for my questions...
My second keyword based on volume is driveway cleaning. How do I optimise for this, or will the (driveway-cleaning-glasgow) page rank for it as well, since those words are contained within the page? I plan on having the same structure for the remaining services: pressure-washing-glasgow, monoblock-cleaning-glasgow, etc. As I am building new pages for each service with the location built in, where does this leave my homepage? Should I be targeting keywords for that page? It is still my strongest page, and apart from the (driveway-cleaning-glasgow) page, which will get some help from the 301, these are all new pages, so I would expect perhaps initially to lose some traffic. But as I am not ranking well for anything other than the main two keywords mentioned above, it can only be beneficial long term when Google recognises the specific pages for each service. And when I start using AdWords, I will have a specific landing page for each service. Any advice would be appreciated. Thanks
Local SEO | sfrediktru8
How to find the best local websites?
For example, I'd like to type in a zip code and get the highest-ranking websites (by DA or whatever metric the software uses) within a 25-mile radius. Does that type of service exist? I'm looking to build up our local links, but most of the websites have extremely low authority. I'm trying to find some good ones without having to check each one manually. Thanks, Ruben
Local SEO | KempRugeLawGroup