Correct robots.txt for WordPress
-
Hi. So I recently launched a website on WordPress (1 main page and 5 internal pages). The main page got indexed right off the bat, while the other pages seem to be blocked by robots.txt. Would you please look at my robots file and tell me what's wrong?
I wanted to block the contact page, plugin elements, users' comments (I have a discussion space on every page of my website) and the website search section (to prevent duplicate pages from appearing in Google search results). It looks like one of the lines is blocking every page after "/" from indexing, even though everything seems right.
Thank you so much.
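For reference, a single catch-all rule such as Disallow: / blocks every path on a site, so a stray line like that is one thing worth checking. Below is a minimal sketch of testing a file locally with Python's built-in urllib.robotparser; the file contents and URLs are hypothetical, not the site's actual robots.txt.

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the catch-all "Disallow: /" rule blocks
# every path on the site for the matching user agent.
robots_txt = """\
User-Agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Every path, not just the intended ones, comes back as blocked.
print(rp.can_fetch("Googlebot", "http://www.example.com/"))           # False
print(rp.can_fetch("Googlebot", "http://www.example.com/contact/"))   # False
print(rp.can_fetch("Googlebot", "http://www.example.com/services/"))  # False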
-
Me too. Can you upload or screenshot the actual file that you are using?
-
I have edited it down to
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /contact/
Disallow: /refer/

It didn’t help. I get a “Blocked by robots.txt” message after submitting the URL for indexing in Google Webmaster Tools. I’m really puzzled.
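None of those rules should block an ordinary page, so it is worth confirming what the live file actually serves (it may differ from what was edited, for example because of caching or WordPress's virtual robots.txt). A minimal sketch using Python's built-in urllib.robotparser, with a placeholder domain and placeholder page paths:

from urllib.robotparser import RobotFileParser

# Placeholder domain; substitute the real site.
rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # downloads and parses the robots.txt that is actually being served

# Placeholder paths standing in for the five internal pages.
for path in ["/", "/contact/", "/refer/", "/about/", "/services/"]:
    url = "http://www.example.com" + path
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)

If a page that should be crawlable comes back as blocked here, the live file is not the one pasted above.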
-
Hi, in addition to the answer that effectdigital gave, here is another option, optimised for WordPress:
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
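As a quick sanity check (a sketch only; the example.com URLs are placeholders for the real domain), Python's built-in parser can confirm that a file like this keeps normal pages crawlable, blocks only the intended paths, and exposes the Sitemap lines:

from urllib.robotparser import RobotFileParser

suggested = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
"""

rp = RobotFileParser()
rp.parse(suggested.splitlines())

print(rp.can_fetch("Googlebot", "http://www.example.com/sample-page/"))              # True: ordinary pages stay crawlable
print(rp.can_fetch("Googlebot", "http://www.example.com/wp-content/uploads/a.png"))  # True: uploads explicitly allowed
print(rp.can_fetch("Googlebot", "http://www.example.com/readme.html"))               # False: blocked as intended
print(rp.site_maps())  # the two sitemap URLs (Python 3.8+)
-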
That just seems overly complex, with way more in there than there needs to be.
I'd go with something that 'just' does what you have stated you want to achieve, and nothing else:
User-Agent: *
Disallow: /wp-content/plugins/
Disallow: /comments
Disallow: /*?s=
Disallow: /*&s=
Disallow: /search
See if that helps.
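As a side note, Google treats * in these Disallow paths as a wildcard and $ as an end-of-URL anchor, so /*?s= and /*&s= cover WordPress's default search URLs, which use the ?s= query parameter. A rough Python sketch of that matching logic, using hypothetical example URLs:

import re

def rule_matches(rule_path, url_path):
    # Convert a robots.txt path pattern to a regex: * matches any run of
    # characters, $ anchors the end of the URL, and re.match gives the
    # prefix matching that robots.txt rules use.
    pattern = re.escape(rule_path).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, url_path) is not None

# Hypothetical URLs standing in for real ones on the site.
print(rule_matches("/*?s=", "/?s=wordpress"))           # True  -> blocked
print(rule_matches("/*&s=", "/page/2/?lang=en&s=seo"))  # True  -> blocked
print(rule_matches("/comments", "/comments/feed/"))     # True  -> blocked
print(rule_matches("/*?s=", "/sample-page/"))           # False -> still crawlable

Python's built-in urllib.robotparser does not implement these wildcard extensions, which is why this sketch rolls its own matching.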
Related Questions
-
Duplicate Content with ?Page IDs in WordPress
Hi there, I'm trying to figure out the best way to solve a duplicate content problem that I have due to the page IDs that WordPress automatically assigns to pages. I know that in order to resolve this I have to use canonical URLs, but the problem is that I can't figure out the URL structure. Moz is showing me thousands of duplicate content errors that are mostly related to page IDs. For example, this is how a page's URL should look on my site. Moz is telling me there are 50 duplicate content errors for this page. The page ID for this page is 82, so the duplicate content errors appear as follows, and so on, for 47 more pages. The problem repeats itself with other pages as well. My permalinks are set to "Post Name", so I know that's not an issue. What can I do to resolve this? How can I use canonical URLs to solve this problem? Any help will be greatly appreciated.
On-Page Optimization | SpaMedica0 -
How to exclude URL filter searches in robots.txt
When I look through my Moz reports I can see they include 'pages' which they shouldn't have included, i.e. URLs with filtering parameters such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505 How can I exclude all of these filters in the robots.txt? I think it'll be: Disallow: /*?color=$ Is that the correct syntax with the $ sign in it? Thanks!
On-Page Optimization | neenor0 -
Solve duplicate content issues by using robots.txt
Hi, I have a primary website and, besides that, I also have some secondary websites with the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh0 -
Moving a site from .cfm to WordPress - How to keep the authority?
Hi guys, My client has a site built with ColdFusion (web pages end in .cfm) and we're moving them over to WordPress (for many reasons), keeping the same menu structure and navigation. Their previous SEO company was pretty awful; however, they did manage to establish some decent authority/backlinks for the website and its 20 or so pages. My questions: I assume I'll want to do 301 redirects for each page, possibly by editing the .htaccess file? Any advice on this? Anything else I need to consider in this move? Thanks!
On-Page Optimization | alpen0 -
Anyone know an h1, h2, h3...h6 plugin for WordPress?
Can anyone share a good WordPress plugin for header tags?
On-Page Optimization | surajrathore0 -
Best practice for Meta-Robots tag in categories and author pages?
For some of our sites we use WordPress, which we really like working with. The question I have is about the category and author pages (and similar pages), i.e. ones that look like http://www.domain.com/authors/. Should you or should you not use "follow, noindex" for meta robots? We have a lot of categories/tags/authors, which generates a lot of pages. I'm a bit worried that Google won't like this, and I'm leaning towards adding the "follow, noindex". But the more I read about it, the more I see people disagree. What does the SEOmoz community think?
On-Page Optimization | Lobtec0 -
How do I Maximize WordPress with 2 SEO Plugins?
I have 2 WordPress SEO plugins, Yoast and All-in-One SEO. I have tried like heck to make them work together, but every time I crawl my site here, I get multiple error messages. My question is, how can I tweak the title settings to avoid having multiple meta descriptions, titles, etc.?
On-Page Optimization | TheSportsDaddy0 -
How to optimize a WordPress blog
I'm helping a client optimize a WordPress blog, and I'm not that familiar with WordPress. The site is www.athleticfoodie.com. At first I was treating it like a normal website, where the categories would be optimized like pages on a website. However, I now realize that categories don't have any content on them, so I can't really optimize anything other than the names. Are the following the best ways to handle on-page optimization for a blog? Optimizing the homepage & domain: find ways to incorporate the most important keywords into the elements in the main frame of the site: navigation menu, widgets, category names, image alt text. Optimizing the categories: for the posts within the categories (i.e., photos), work to make sure the category keywords are worked into the post titles (but not so much that it seems spammy). Optimizing specific posts: work keywords into the text and images. Any other suggestions would be greatly appreciated.
On-Page Optimization | EricVallee340