Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How to safely exclude search result pages from Google's index?
-
Hello everyone,
I'm wondering what's the best way to prevent search result pages from being indexed by Google. The way search works on my site is that the search form generates URLs like:
/index.php?blah-blah-search-results-blah
I wanted to block everything of that sort, but how do I do it without blocking /index.php itself?
Thanks in advance and have a great day everyone!
-
Hi Louise,
If you can ID the parameters, you can also look at blocking these in Webmaster Tools. This page explains more. As with any blocking of URLs, of course, proceed with caution.
-
I agree that can be effective. The reason I suggested robots.txt is that Louise mentioned "blocking and preventing" as an objective. Robots.txt is particularly useful where results from a search bar or something of that nature are involved. A NOINDEX, FOLLOW tag will not stop bots from crawling those URLs (and getting tired and dizzy in the process), whereas robots.txt can "block and prevent" bots from crawling certain parameters.
With all of that said, I think it is important to understand whether you need the bots to crawl and not index (in which case Spencer's answer is correct), or if you need to prevent bots from crawling the parameters altogether.
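To make the "prevent crawling altogether" option concrete: a minimal robots.txt sketch, assuming the /index.php?... URL pattern from the question, would block the parameterised search URLs while leaving /index.php itself crawlable:

```
User-agent: *
Disallow: /index.php?
```

Because robots.txt matching is prefix-based, /index.php on its own remains crawlable; only URLs carrying a query string after it are disallowed.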
Hope that is more clear
-
I'm not sure that robots.txt is effective when URL parameters are involved.
I would just add a meta robots tag to the head section of the search results template:
<meta name="robots" content="noindex, follow">
-
If you are able to identify a URL parameter, you may exclude those URLs using robots.txt. Here is a great resource on robots.txt - http://outdoorsrank.com/learn/seo/robotstxt
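For example, to disallow a specific parameter by name (using a hypothetical s= search parameter - substitute whatever parameter your search form actually uses), the robots.txt entry might look like:

```
User-agent: *
Disallow: /*?s=
```

Note that the * wildcard in the path is supported by Googlebot, though not by every crawler, so test the rule with Google's robots.txt tester before relying on it.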