Do web pages have to be linked to a menu?
-
I have a situation where people search for terms like, say, "1978 one dollar bill", even though there never was a 1978 one dollar bill. I want to make a page to capture these searches, but since there was no such thing as a 1978 one dollar bill, I don't want it connected to the rest of my content, which is reality-based. Does that make sense?
Anyway, my question is: can I publish pages that aren't linked to my menu structure but will still be searchable, or am I going to have to figure out a way to make these oddball pages accessible through my menu?
-
OK... after some thought, I have figured out a way to add these links and have it make sense. At least it will be defensible.
Thanks for your help.
-
Hi Greg,
Google usually discovers pages via links, so if a page has no links pointing to it, it is hard for Google to discover. That said, you can try submitting an XML sitemap that includes these pages to Google, and they may then crawl and index them.
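For what it's worth, if your CMS doesn't already generate a sitemap, a minimal one can be written with a short script along these lines (the URLs below are placeholders, not your real pages):

    # Rough sketch: write a minimal XML sitemap listing the unlinked pages.
    # The URLs are placeholders -- substitute the pages you actually want crawled.
    urls = [
        "https://www.example.com/1978-one-dollar-bill/",
        "https://www.example.com/another-unlinked-page/",
    ]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>{}</loc></url>".format(url))
    lines.append("</urlset>")

    with open("sitemap.xml", "w") as f:
        f.write("\n".join(lines))

You can then submit the file in Search Console or reference it with a Sitemap: line in robots.txt.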
However, if a page does not receive any links from your own site, it does signal that you do not consider the page to be particularly good/important, making it quite unlikely that it will rank well.
Hope this helps.
Related Questions
-
Best redirect destination for 18k highly-linked pages
Technical SEO question regarding redirects; I appreciate any insights on the best way to handle it.
Situation: We're decommissioning several major content sections on a website, comprising ~18k webpages. This is a well-established site (10+ years), and many of the pages within these sections have high-quality inbound links from .orgs and .edus.
Challenge: We're trying to determine the best place to redirect these 18k pages. For user experience, we believe the best option is the homepage, which has a statement about the changes to the site and links to the most important remaining sections. It's also the most important page on the site, so the boost from the 301-redirected links doesn't seem like a bad thing. However, someone on our team is concerned that that many new redirected pages and links going to our homepage will trigger a negative SEO flag for the homepage, and recommends instead that they all go to our custom 404 page (which also includes links to important remaining sections).
What's the right approach here to preserve the remaining SEO value of these soon-to-be-redirected pages without triggering Google penalties?
Technical SEO | davidvogel1 -
How can I stop a tracking link from being indexed while still passing link equity?
I have a marketing campaign landing page, and it uses a tracking URL to track clicks. The tracking links look something like this: http://this-is-the-origin-url.com/clkn/http/destination-url.com/
The problem is that Google is indexing these links as pages in the SERPs. Of course, when they get indexed and then clicked, they show a 400 error because the /clkn/ link doesn't represent an actual page with content on it. The tracking link is set up to instantly 301 redirect to http://destination-url.com.
Right now my dev team has blocked these links from crawlers by adding Disallow: /clkn/ to the robots.txt file; however, this blocks the flow of link equity to the destination page. How can I stop these links from being indexed without blocking the flow of link equity to the destination URL?
Technical SEO | UnbounceVan0 -
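As an aside on the tracking-link question above: a quick way to confirm what a /clkn/ URL actually returns (status code and redirect target) is a request with redirects disabled — a sketch using the hypothetical URL from the question:

    # Sketch: check the status code and redirect target of a tracking URL.
    # The URL is the hypothetical one from the question above.
    import requests

    tracking_url = "http://this-is-the-origin-url.com/clkn/http/destination-url.com/"
    resp = requests.get(tracking_url, allow_redirects=False)

    print(resp.status_code)              # expect 301 if the instant redirect is in place
    print(resp.headers.get("Location"))  # where the redirect points

If /clkn/ is disallowed in robots.txt, crawlers never get to see that 301, which is exactly the equity problem the question describes.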
Indexed pages
Just started a site audit and am trying to determine the number of pages on a client site, and whether more pages are being indexed than actually exist. I've used four tools and got four very different answers...
Google Search Console: 237 indexed pages
Google search using the site: command: 468 results
Moz Site Crawl: 1,013 unique URLs
Screaming Frog: 183 page titles, 187 URIs (note this is a free licence, but it should cut off at 500)
Can anyone shed any light on why they differ so much? And where does the truth lie?
Technical SEO | muzzmoz1 -
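On the indexed-pages question above: one way to see where the tools disagree is to export the URL lists and compare them directly — a rough sketch, with made-up file names:

    # Sketch: compare URL lists exported from two tools.
    # The file names are hypothetical -- use whatever each tool exports.
    def load_urls(path):
        with open(path) as f:
            return {line.strip().rstrip("/") for line in f if line.strip()}

    crawled = load_urls("screaming_frog_urls.txt")
    indexed = load_urls("search_console_indexed.txt")

    print("Indexed but not crawled:", len(indexed - crawled))
    print("Crawled but not indexed:", len(crawled - indexed))
    for url in sorted(indexed - crawled)[:20]:
        print(url)

Parameterised URLs, redirect targets, and non-canonical variants often account for a large part of the gap between such counts.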
Are image pages considered 'thin' content pages?
I am currently doing a site audit. The total number of pages on the website is around 400... 187 of them are image pages and come up with a 'zero' word count in the Screaming Frog report. I need to know whether they will be considered 'thin' content by search engines. Should I include them as an issue? An answer would be most appreciated.
Technical SEO | MTalhaImtiaz0 -
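On the image-pages question above: if you want to sanity-check the zero word counts, you can fetch a page and count its visible words — a sketch with a placeholder URL:

    # Sketch: rough visible-word count for a page, to sanity-check a crawler's word-count column.
    # The URL is a placeholder.
    import re
    import requests

    html = requests.get("https://www.example.com/some-image-page/").text
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)  # strip remaining tags
    words = re.findall(r"[A-Za-z']+", text)
    print(len(words), "words")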
Page titles in browser not matching WP page title
I have an issue with a few page titles not matching the title I have in WordPress. I have 2 pages, blog & creative gallery, that show the homepage title, which is causing duplicate title errors. This has been going on for 5 weeks, so it's not a crawl issue. Any ideas what could cause this? To clarify, I have the page title set in WP, and I checked "Disable PSP title format on this page/post:"... but this page is still showing the homepage title. Is there an additional title setting for a page in WP?
Technical SEO | Branden_S0 -
404 error - but I can't find any broken links on the referrer pages
Hi, My crawl has flagged eight 404 errors on a client's site. Using the CSV download of the crawl, I have checked the source code of the 'referrer' pages, but can't find where the link to the 404 error page is. Could there be another reason for getting 404 errors? Thanks for your help. Katharine.
Technical SEO | PooleyK0 -
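On the 404 question above: the broken link sometimes isn't in the raw HTML at all (it can come from JavaScript, a redirect chain, or a stale crawl), so it can help to fetch the referrer and search its source for the broken path — a sketch with placeholder URLs:

    # Sketch: fetch a referrer page and look for the path that 404s.
    # Both URLs are placeholders used to illustrate the check.
    import requests

    referrer = "https://www.example.com/some-referrer-page/"
    broken_path = "/old-page-that-404s/"

    html = requests.get(referrer).text
    idx = html.find(broken_path)
    if idx >= 0:
        print(html[max(0, idx - 80): idx + 80])  # print some context around the match
    else:
        print("Not in the raw HTML -- the link may be added by JavaScript or come from an old crawl.")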
Drop Down Menu - Link Juice Depletion
Hi, We have a site with 7 top-level sections, all of which contain a large number of subsections, which may then contain further subsections. To try to ensure the best user experience, we have a top navigation with the 7 top-level sections and, on hover, a selection of the key subsections.
Although I like this format for the user, as it makes it easier for them to find the most important sections and subsections, it does lead to a lot of links within every page on the site. In general, each top section has a drop-down with approx 10-15 subsections. This has therefore led to SEOmoz's tools issuing the 'too many internal links' warning.
Alongside this, I am left wondering whether I have too many links to my subsections and whether I would be better off being more selective about when I link to them. For instance, I could choose the top 5 subsections and place a link to them from our homepage, and by doing so I would be passing a greater amount of link juice down the line. So I guess my dilemma is between ensuring the user has as easy a time traversing the site as possible, whilst keeping a close watch on where, and how, our link juice is distributed.
One solution I am considering is whether nofollow links could be utilised within the drop-down menus. This way I could have the desired user navigation while being in greater control of which pages link to which subsections. Would that even work?
Any advice would be greatly appreciated,
Regards,
Guy
Technical SEO | guycampbell1 -
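On the drop-down question above: if the trigger is the 'too many internal links' warning, it's straightforward to count the anchors on a page yourself and see how many the navigation contributes — a rough sketch with a placeholder URL:

    # Sketch: roughly count anchor links in a page's HTML.
    # The URL is a placeholder.
    import re
    import requests

    html = requests.get("https://www.example.com/").text
    anchors = re.findall(r"<a\s[^>]*href", html, flags=re.I)
    print(len(anchors), "anchor links found")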
Does Google pass the link juice a page receives if the URL parameter specifies content and has the Crawl setting in Webmaster Tools set to NO?
The page in question receives a lot of quality traffic but is only relevant to a small percent of my users. I want to keep the link juice received from this page but I do not want it to appear in the SERPs.
Technical SEO | surveygizmo0