How do I geo-target continents & avoid duplicate content?
-
Hi everyone,
We have a website which will have content tailored for a few locations:
USA: www.site.com
Europe EN: www.site.com/eu
Canada FR: www.site.com/fr-ca

The hreflang link element and the geotargeting option in Google Webmaster Tools are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices.
What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent.
Thanks!
-
Moz most definitely needs a "give a beer" feature!! Thanks for the in-depth response. We'll also work on building "local" links as you suggest.
We've since changed the structure of the site to:
USA/Canada: www.site.com
Europe EN: www.site.com/en_gb/
Europe FR: www.site.com/fr_fr/
Canada FR: www.site.com/fr/

That way we can use hreflang and avoid duplicate content (a rough sketch of the markup we're planning is below). In your experience, will Google serve www.site.com/fr_fr/ instead of www.site.com/fr/ to Belgium and Switzerland? Will the UK and Ireland see www.site.com or www.site.com/en_gb/?
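For reference, here's roughly the hreflang markup we're planning to put in the head of each version. This is just a sketch based on the folders above; the en-ca and x-default lines are our own assumptions (English-speaking Canadians and unmatched visitors both get the root):

    <!-- Placed in the <head> of every version; each page lists all alternates, including itself -->
    <link rel="alternate" hreflang="en-us" href="http://www.site.com/" />
    <link rel="alternate" hreflang="en-ca" href="http://www.site.com/" />
    <link rel="alternate" hreflang="en-gb" href="http://www.site.com/en_gb/" />
    <link rel="alternate" hreflang="fr-fr" href="http://www.site.com/fr_fr/" />
    <link rel="alternate" hreflang="fr-ca" href="http://www.site.com/fr/" />
    <link rel="alternate" hreflang="x-default" href="http://www.site.com/" />

Does that look right to you?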
Thanks a lot for the answer!
-
Hi there,
As Marcus mentioned, at the moment geographical targeting is country-based, not continent-based, so you're correct: hreflang works for languages and/or countries, and the geotargeting option in Google Webmaster Tools (when you're not using a ccTLD) is only for countries.
So there are really two alternatives: language targeting (even though each language differs from country to country) or country targeting (which is the ideal way to connect with each audience, localizing the content as much as possible and leveraging all types of local characteristics).
With language targeting you avoid content duplication issues (since there will be only one English or one Spanish version). Nonetheless, as I mentioned, it can be tricky: the Spanish spoken in Spain is different from the Spanish of Mexico and of every other Latin American country; seasonality and currency differ; people's culture, tastes, and local characteristics do too. So language-based versions give you a "generic" approach to these audiences, but don't really target them as specific markets.
On the other hand, with country targeting, if you have two English versions you can assign each one to the appropriate country with hreflang and ccTLDs (or, if you use a generic domain, with the geotargeting option in Google Webmaster Tools), and then do local link building focused on each country to grow the popularity of each version there. This would be the recommended approach. If you can't enable many countries because of resource restrictions, start with the most important ones.
Moreover, regarding targeting Europe as a whole: even if you enable a domain of the type www.yourbrand.eu for Europe, it is likely to be treated as a generic domain, as Google specifies here, and then inside this domain what you would really have (as I understand from your description) are language versions targeting Europe in general:
- www.yourbrand.eu/ in English (for the UK, Ireland, etc.)
- www.yourbrand.eu/fr/ in French (for France, Belgium, Switzerland)
- www.yourbrand.eu/es/ in Spanish
- www.yourbrand.eu/de/ in German (for Germany, Switzerland, Austria)
The issue comes when you have the same content in English for your American audience on www.yourbrand.com, or in Spanish (for Spanish speakers in the US) on www.yourbrand.com/es/, which could cause content duplication issues with www.yourbrand.eu/ and www.yourbrand.eu/es/.
If this is the scenario, the best you can do is differentiate the content, giving signals that one version targets the US audience and the other targets English speakers in Europe. But again, there's no straightforward, fully supported solution for this scenario: targeting a whole continent goes beyond what Google supports and isn't the most natural alternative from an "international audience targeting" perspective.
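One signal that can help: hreflang annotations where the US versions use language-country codes and the European versions use language-only codes as the broader fallback, so Google knows which alternate to prefer for each searcher. A rough sketch, using the same illustrative yourbrand URLs as above:

    <!-- US versions: language + country codes -->
    <link rel="alternate" hreflang="en-us" href="http://www.yourbrand.com/" />
    <link rel="alternate" hreflang="es-us" href="http://www.yourbrand.com/es/" />
    <!-- European versions: language-only codes act as the fallback for all other speakers of each language -->
    <link rel="alternate" hreflang="en" href="http://www.yourbrand.eu/" />
    <link rel="alternate" hreflang="es" href="http://www.yourbrand.eu/es/" />
    <link rel="alternate" hreflang="fr" href="http://www.yourbrand.eu/fr/" />
    <link rel="alternate" hreflang="de" href="http://www.yourbrand.eu/de/" />

This won't geotarget "Europe" as such, but it does tell Google that the versions are alternates rather than duplicates, and which one to prefer for US searchers.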
If you have any other information that you think would be relevant to give you additional recommendations please let me know.
I hope this helps!
-
Hey Axial
As far as I am aware, there is no option to target regions like Europe; to do this in Webmaster Tools you will need to create a folder for each country you are looking to target within Europe.
Obviously, there are lots of different languages across Europe, so in an ideal world you will want a version geotargeted to each country in the correct language. If you want to be really fancy, you will want both an English version and a version in the relevant country's language.
So, for Spain as an example, targeting Spanish and English, the hreflang values would be set as "es-ES" and "en-ES" (Spanish for Spain and English for Spain; note the format is language code first, then country code). Directories could be matched: /es-es and /es-en.
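To illustrate, the annotations in the head of those pages might look something like this (example.com is just a placeholder domain):

    <link rel="alternate" hreflang="es-es" href="http://www.example.com/es-es/" />
    <link rel="alternate" hreflang="en-es" href="http://www.example.com/es-en/" />

You'd repeat the same pair on both pages, and follow the same pattern for each additional country/language combination.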
Not an answer as such, but as far as I am aware Europe is not targetable as a single folder via Webmaster Tools, so you are going to have to work with what's available.
Hope that helps
Marcus