Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
One page with multiple sections - unique URL for each section
-
Hi All,
This is my first time posting to the Moz community, so forgive me if I make any silly mistakes.
A little background: I run a website for a company that makes custom parts out of specialty materials. One of my strategies is to make high-quality content about all areas of these specialty materials to attract potential customers - pretty straightforward stuff.
I have always struggled with how to structure my content. From a usability point of view, I like having just one page for each material, with different subsections covering different topical areas. Example: for a special metal material I would have one page with subsections about the mechanical properties, thermal properties, available types, common applications, etc. - basically how Wikipedia organizes its content. I do not have a large amount of content for each section, but as a whole it makes one nice, cohesive page for each material.
I do use H tags to show the specific sections on the page, but I am wondering if it may be better to have one page dedicated to the specific material properties, one page dedicated to specific applications, and one page dedicated to available types.
What are the community's thoughts on this? As a user of the website, I would rather have all of the information on a single, well-organized page for each material. But what do SEO best practices have to say about this?
My last thought would be to create a hybrid website (I don't know the proper term). Have a look at these examples from Time and Quartz. When you are viewing an article, the URL is unique to that page. However, when you scroll to the bottom of the article, you can keep on scrolling into the next article, with a new unique URL - all without clicking through to another page.
I could see this technique being ideal for a good web experience while still allowing me to optimize my content for more specific topics/keywords. If I used this technique with the canonical tag, would I then get the best of both worlds?
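For reference, the Time/Quartz behaviour can be sketched roughly as follows. This is a minimal, hypothetical example (not their actual code): each article lives at its own indexable URL, and as the reader scrolls, script swaps the address-bar URL to whichever article is currently in view.

```javascript
// Hypothetical sketch of the "infinite scroll with unique URLs" pattern.
// A pure helper picks which article URL is in view for a given scroll offset;
// the commented browser wiring below would keep the address bar (and the
// canonical link) in sync. Names and structure are assumptions, not Time's
// or Quartz's real implementation.
function urlInView(articles, scrollY) {
  // articles: [{ top: <pixel offset>, url: "/path" }, ...] sorted by top
  let current = articles[0].url;
  for (const a of articles) {
    if (scrollY >= a.top) current = a.url;
  }
  return current;
}

// Browser-only wiring (cannot run outside a browser):
// window.addEventListener("scroll", () => {
//   const url = urlInView(articleOffsets, window.scrollY);
//   history.replaceState(null, "", url); // swap the URL without a page load
//   document.querySelector('link[rel="canonical"]')
//     ?.setAttribute("href", location.origin + url);
// });
```

Note that each article still has to be reachable at its own URL for crawlers that don't scroll, which is why sites using this pattern also serve every article as a standalone page.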
Let me know your thoughts! Thank you for the help!
-
Agreed with Oleg. On the Time and Quartz pages you are essentially looking at their blog index. Their CMS (content management system) does some fancy footwork to show the URL of the post you happen to be looking at. It isn't one page with a different URL for each section.
If you are using a CMS like WordPress, Drupal or Joomla, you can create content (posts) for the material, applications and types, and then list them on pages for the customer type. If not, you can simply make pages like Oleg suggested.
-
"I am wondering if it may be better to have one page dedicated to the specific material properties, one page dedicated to specific applications, and one page dedicated to available types."
^ This. Create a page targeting each type of customer (one looking for a specific material, one looking for a specific application, and one for available types). You can use similar content with some quoted content blocks on each page; just make sure there is unique content there as well, tailored to what each audience is searching for.
Related Questions
-
Is this campaign of spammy links to non-existent pages damaging my site?
My site is built in WordPress. Somebody has built spammy pharma links to hundreds of non-existent pages. I don't know whether this was inspired by malice or an attempt to inject spammy content. Many of the non-existent pages have the suffix .pptx. These now all return 403s. Example: https://www.101holidays.co.uk/tazalis-10mg.pptx A smaller number of spammy links point to regular non-existent URLs (not ending in .pptx). These are given 302s by WordPress to my homepage. I've disavowed all domains linking to these URLs. I have not had a manual action or seen a dramatic fall in Google rankings or traffic. The campaign of spammy links appears to be historical and not ongoing. Questions:
1. Do you think these links could be damaging search performance? If so, what can be done? Disavowing each linking domain would be a huge task.
2. Is 403 the best response? Would 404 be better?
3. Any other thoughts or suggestions?
Thank you for taking the time to read and consider this question. Mark
White Hat / Black Hat SEO | MarkHodson
-
Robots.txt file in Shopify - Collection and Product Page Crawling Issue
Hi, I am working on a big eCommerce store with more than 1,000 products. We just moved platforms from WordPress to Shopify and are now seeing a noindex issue. When I checked robots.txt I found the code below, which is very confusing to me - I don't understand the meaning of these rules:
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
I understand that my robots.txt disallows search engines from crawling and indexing all my product pages (collections/*+*). Is this the rule that is affecting the indexing of product pages? Please explain how this robots.txt works in Shopify, and once my pages are crawled and indexed by Google, what is the use of Disallow:? Thanks.
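For what it's worth, Disallow rules in the original robots.txt spec are literal path prefixes (Google additionally supports `*` and `$` wildcards). So `Disallow: /collections/+` only matches paths that begin with a literal plus sign right after `/collections/`, and `%2B` / `%2b` are the percent-encoded upper- and lower-case forms of that same plus; plain product and collection URLs without a `+` are not touched by these lines. A rough sketch of literal prefix matching (the helper name is made up for illustration):

```javascript
// Toy sketch of literal-prefix Disallow matching per the original robots.txt
// spec. Google also honors "*" and "$" wildcards, which this deliberately omits.
function isBlocked(path, disallowRules) {
  return disallowRules.some(rule => rule !== "" && path.startsWith(rule));
}

const rules = ["/collections/+", "/collections/%2B", "/collections/%2b"];
console.log(isBlocked("/collections/+special", rules));   // true  (literal plus)
console.log(isBlocked("/collections/%2Bspecial", rules)); // true  (encoded plus)
console.log(isBlocked("/collections/all", rules));        // false (no plus)
```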
White Hat / Black Hat SEO | HuptechWebseo
-
Which forum generates the highest traffic?
Hello guys, can anyone please suggest a good, high-traffic forum for business? There are many, but I need the best sites that can generate good traffic, as I am new to forum posting. Any expert help would be appreciated. Thank you in advance, Falguni
White Hat / Black Hat SEO | innovativeecomindia
-
Traffic exchange referral URLs
We have a client who once per month is hit by easyihts4u.com, which creates huge increases in their referrals. All the hits go to one specific page. From the research we have done, this site and others like it are not spam bots. We cannot understand how they choose sites to target, and what good it does for them, or for our client, to have hits all on one day to one page. We created a filter in Analytics to give what we think is a more accurate reflection of traffic. Should we block them at the server level as well?
White Hat / Black Hat SEO | Teamzig
-
Best Location to find High Page Authority/ Domain Authority Expired Domains?
Hi, I've been looking online for the best places to purchase expired domains with existing Page Authority / Domain Authority attached to them. So far I've found:
http://www.expireddomains.net
http://www.domainauthoritylinks.com
http://moonsy.com/expired_domains/
These sites are great, but I'm wondering if I'm potentially missing other sources. Any other recommendations? Thanks.
White Hat / Black Hat SEO | VelasquezEF
-
Multiple stores for the same niche
I started developing a new niche of products in my country about 3 years ago. That's when I opened my first store. Everything went fine until a year ago, when someone I thought was a friend secretly stole my idea and made his own competing store. I was pretty upset when I caught him and decided to make it as difficult as possible for him, so I made another 4 stores, trying to push him as low as possible in the search results. The new sites have similar products (although not 100% identical), slightly different titles, images and prices. They look different and are built on different e-commerce platforms. They are all hosted on the same server, have roughly the same backlinks, use the same Google account for Analytics, have the same support phone numbers, etc. I didn't think I was doing anything fishy, so I didn't try to hide anything. Trouble is that those sites, after doing fine for a few months, dropped like bricks in the search results, almost to the point that they can't be found at all. At the moment, the only site that ranks relatively well is the original one, plus a couple of unimportant secondary pages from one of the other sites. How did this happen? Does Google have something against this practice? Did they take action by themselves when they realized that I was trying to monopolize this niche, or did my competitor report me for some kind of webspam? And more importantly, what do I do now? Do I shut down all but my original site and 301 redirect users to it from the others? Can I report my competitor for engaging in the same practice? (He fought back and now he has 3-4 sites, some of which still rank kind of OK-ish; also, he has no idea about web development, SEO or marketing, he just crudely copies what I do and is slowly but surely starting to do better than me.)
White Hat / Black Hat SEO | pandronic
-
Noindexing Thin Content Pages: Good or Bad?
If you have a massive number of pages with super-thin content (such as pagination pages) and you noindex them, once they are removed from Google's index (and if these pages aren't viewable to the user and/or don't get any traffic), is it smart to completely remove them (404?), or is there any valid reason they should be kept? If you noindex them, should you keep all URLs in the sitemap so that Google will recrawl and notice the noindex tag? If you noindex them and then remove the sitemap, can Google still recrawl and recognize the noindex tag on its own?
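One background point on the recrawl questions: a noindex directive is just a meta tag in the page's head (`<meta name="robots" content="noindex">`), so Google only sees it when it actually refetches the URL - which is the argument for leaving noindexed URLs in the sitemap temporarily, to prompt a faster recrawl. A toy checker for the tag (regex-based for brevity, and it assumes the name attribute comes before content; a real crawler parses the HTML properly):

```javascript
// Toy sketch: detect a robots noindex directive in an HTML string.
// Assumes name="robots" appears before content="..."; a real crawler
// would use a proper HTML parser instead of a regex.
function hasNoindex(html) {
  const m = html.match(/<meta[^>]*name=["']robots["'][^>]*content=["']([^"']*)["']/i);
  return m !== null && /noindex/i.test(m[1]);
}

console.log(hasNoindex('<meta name="robots" content="noindex,follow">')); // true
console.log(hasNoindex('<meta name="robots" content="index,follow">'));   // false
```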
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Why do websites use different URLs for mobile and desktop?
Although Google and Bing have recommended that the same URL be used for serving desktop and mobile websites, portals like Airbnb use different URLs to serve mobile and web users. Does anyone know why this is done even though it is not good for SEO?
White Hat / Black Hat SEO | razasaeed