Google Webmaster Guideline Change: Human-Readable list of links
In the revised Webmaster Guidelines, Google says: "[...] Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page)." (Source: https://support.google.com/webmasters/answer/35769?hl=en)
I guess what they mean is something like this: http://www.ziolko.de/sitemap.html
Still, I wonder why they say that. Is it just to ensure that every page on a site is linked and therefore findable by humans (and by crawlers, although isn't the XML sitemap meant for those, and doesn't it give even better information)? Shouldn't good navigation already lead to every page? What is the benefit of a link-list page, assuming you have an XML sitemap? For a big site, a link list is bound to look somewhat cluttered, and its usefulness is outclassed by good navigation, which I assume is a given. Or isn't it?
TL;DR: Can anybody tell me what exactly is the benefit of a human-readable list of all links?
Regards,
Nico
Hi Netkernz_ag,
It is just good practice to have those types of pages available. While I wouldn't say it is an absolute requirement, it is something you should do for your users. The page you pointed to is a general checklist of things to do, and not to do, for your users. Creating a site index may be a bit dated, but I still tend to do them, as they are fairly easy to create. (example).
Hope this helps,
Don
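To illustrate Don's point that a site index is fairly easy to create, here is a minimal sketch that writes a human-readable list of links as a static HTML page. The page titles, URLs, and output file name are hypothetical placeholders, not anything from this thread.

```python
# Minimal sketch: generate a human-readable site index (HTML) page.
# All titles, URLs, and the output file name below are hypothetical.
from html import escape

important_pages = [
    ("Home", "https://www.example.com/"),
    ("Products", "https://www.example.com/products/"),
    ("Contact", "https://www.example.com/contact/"),
]

def build_site_index(pages):
    # One <li> per page, so visitors can reach every important URL
    # from a single plain page.
    items = "\n".join(
        f'    <li><a href="{escape(url)}">{escape(title)}</a></li>'
        for title, url in pages
    )
    return (
        "<!DOCTYPE html>\n"
        "<html>\n"
        "<head><title>Site Index</title></head>\n"
        "<body>\n"
        "  <h1>Site Index</h1>\n"
        "  <ul>\n"
        f"{items}\n"
        "  </ul>\n"
        "</body>\n"
        "</html>\n"
    )

with open("site-index.html", "w", encoding="utf-8") as f:
    f.write(build_site_index(important_pages))
```

Any templating approach does the job just as well; the point is simply a single plain page that links to every important URL.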
Hi there,
Remember that Google always seeks to serve a better user experience.
Technically, the XML sitemap is the one needed for crawlers, and the "human-readable" sitemap is focused on users.
I might be saying something obvious, but that's the way I've understood it. The benefit of the "human-readable" sitemap should be on the user-experience side, and Google probably sees it that way.
As a visitor, I find that kind of sitemap useful: it gives you a quick overview of the site and gets you to the final page faster. Hope it helps.
GR.
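As a complement to GR's distinction, here is a minimal sketch of the crawler-facing side: building the XML sitemap from the same kind of URL list. The URLs and output file name are again hypothetical placeholders.

```python
# Minimal sketch: generate the crawler-facing XML sitemap from a URL list.
# The URLs and output file name are hypothetical placeholders.
from xml.etree import ElementTree as ET

important_urls = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/contact/",
]

# Standard sitemap namespace as defined by sitemaps.org.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in important_urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Driving both files from one list of important URLs keeps the machine-readable sitemap and the human-readable site index in sync.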