SEO effect of URL with subfolder versus parameters?
-
I'll make this quick and simple. Let's say you have a business located in several cities. You've built individual pages for each city (linked to from a master list of your locations).
For SEO purposes, is it better for the URL to be a subfolder or a parameter off the home page URL:
https://www.mysite.com/dallas which is essentially https://www.mysite.com/dallas/index.php
or
http://www.mysite.com/?city=dallas which is essentially https://www.mysite.com/index.php?city=dallas
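(For illustration only: both styles can be served by the same script, and the difference is just how the city is passed. Below is a minimal PHP sketch, not the site's actual code; the $cities whitelist, the austin/houston slugs, and the city-pages template folder are hypothetical, and the clean /dallas path assumes the web server rewrites non-file paths to index.php.)

<?php
// index.php - hypothetical front controller, sketched only to show that
// /dallas and /?city=dallas can serve the same page while the parameter
// form redirects to the subfolder form, leaving one indexable URL per city.

$cities = array('dallas', 'austin', 'houston'); // hypothetical whitelist of city slugs

// Clean form: https://www.mysite.com/dallas
// (assumes the web server rewrites non-file paths to index.php)
$path = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
$city = strtolower($path);

// Parameter form: https://www.mysite.com/?city=dallas -> 301 to the clean form
if ($city === '' && isset($_GET['city'])) {
    $requested = strtolower($_GET['city']);
    if (in_array($requested, $cities, true)) {
        header('Location: https://www.mysite.com/' . $requested, true, 301);
        exit;
    }
}

// Unknown locations get a 404 rather than a soft duplicate of the homepage.
if (!in_array($city, $cities, true)) {
    http_response_code(404);
    exit('Location not found');
}

// Render the city-specific template (hypothetical file layout).
include __DIR__ . '/city-pages/' . $city . '.php';

With something like this in place, /?city=dallas permanently redirects to /dallas, so only one URL per city is left for search engines to index.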
-
Thanks Miriam, this is very helpful and makes a lot of sense. What do you think about towns and villages, or boroughs of a large city? Do you think the close proximity is dangerous territory re: keyword permutations?
I take your point about unique content tailored to the people of the city - it makes a lot of sense. But what about locations that are closer to each other?
I know it's a tricky question but any insight would be most welcome.
-
That's a good question, Andrew. It's true that it's no longer a best practice to build out a set of pages featuring slightly different permutations of a keyword (car repair, auto repair, repairing cars, fixing cars, etc.). That approach is now quite dated. Honestly, it never made any sense beyond the fact that when Google wasn't quite so sophisticated, you could trick your way into some additional rankings with this type of redundant content.
The development of location landing pages is different. These are of fundamental use to consumers, and the ideal is to create each city's landing page in a way that is uniquely helpful to a specific audience. So, for example, your store in Detroit is having a special on winter clothing right now, because it's still snowing there. Meanwhile, your store in Palm Beach is already stocking swim trunks. For a large, multi-location enterprise, location landing pages can feature highly differentiated content, including highlights of regionally appropriate inventory and specials, as well as unique NAP, driving directions, reviews from local customers, and so much more.
The key to avoiding the trap of simply publishing a large quantity of near-duplicate pages is to put in the effort to research the communities involved and customize these location pages to best fit local needs.
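Purely as a hypothetical illustration of that point (not anything prescribed above): one way to keep location pages genuinely different is to treat the local details as data and render them into each page, so every location carries its own NAP, specials and reviews rather than a find-and-replace city name. All names, addresses and copy below are made up.

<?php
// Hypothetical per-location data; in practice this would come from a
// database or CMS and include far more local detail (directions, staff,
// regional inventory, and so on).
$locations = array(
    'detroit' => array(
        'name'    => 'MyStore Detroit',
        'address' => '123 Woodward Ave, Detroit, MI',   // made-up NAP
        'phone'   => '(313) 555-0100',
        'special' => 'Winter clothing sale - it is still snowing here.',
        'reviews' => array('Great parkas, fast and friendly service.'),
    ),
    'palm-beach' => array(
        'name'    => 'MyStore Palm Beach',
        'address' => '456 Ocean Blvd, Palm Beach, FL',
        'phone'   => '(561) 555-0100',
        'special' => 'Swim trunks and beach gear now in stock.',
        'reviews' => array('Picked up everything I needed on the way to the shore.'),
    ),
);

// Each page gets genuinely different copy, not just a swapped city name.
function render_location_page(array $loc)
{
    echo '<h1>' . htmlspecialchars($loc['name']) . "</h1>\n";
    echo '<p>' . htmlspecialchars($loc['address']) . ' | ' . htmlspecialchars($loc['phone']) . "</p>\n";
    echo '<p>This week: ' . htmlspecialchars($loc['special']) . "</p>\n";
    foreach ($loc['reviews'] as $review) {
        echo '<blockquote>' . htmlspecialchars($review) . "</blockquote>\n";
    }
}

render_location_page($locations['detroit']);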
-
Hi Searchout,
Good for you for creating a unique page for each of your locations. I like to keep URLs as simple as possible for users, so I'd go with the subfolder format (https://www.mysite.com/dallas, etc.).
From an SEO perspective, I don't think there's a big difference between root URLs and subfolders. If you're using one structure, I doubt you'd see any difference from doing it differently (unless you were using subdomains, which is a different conversation).
-
Of course cities will be counted.
That's why I'm always reinforcing the idea of creating unique, special pages for each keyword.
Google is getting smarter and smarter, so simple variations of a few words are easily detected. Hope it helps.
Best of luck.
GR.
-
Hi
Thanks for your response; I'm interested in this too. I've been targeting cities with their own pages, but I heard recently that Google is going to be clamping down on multiple keyword permutations. Do you think cities will be counted in this?
-
Hi there!
In my opinion, for SEO purposes it is correct to have a unique page (really different from the others, not just changing the city name and location) for each big city you are optimizing.
That said, a subfolder is useful in order to show Google the name of the city in the URL. Google commonly treats parameters differently than folders. Also, remember to avoid duplicate content: /dallas/ and /dallas/index.php should not both be accessible and indexable for Google. Redirect one to the other, or canonicalize one to the other. The same goes for the www, non-www, http and https versions.
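As a rough illustration of that redirect/canonical step, assuming a PHP-rendered page (the Dallas URL is just the example from the question, and many sites handle the same thing with server rewrite rules instead):

<?php
// Hypothetical snippet near the top of the Dallas page template.
// Goal: only https://www.mysite.com/dallas/ is indexable; the index.php,
// http, or non-www variants all answer with a 301 to that one URL.

$canonical = 'https://www.mysite.com/dallas/';

$scheme    = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') ? 'https' : 'http';
$requested = $scheme . '://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];

if ($requested !== $canonical) {
    header('Location: ' . $canonical, true, 301);
    exit;
}
?>
<link rel="canonical" href="<?php echo htmlspecialchars($canonical); ?>">

Either the redirect or the rel="canonical" alone helps; doing both keeps the duplicate versions (folder vs. index.php, www vs. non-www, http vs. https) consolidated onto a single URL.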
Hope it helps.
Best of luck.
GR.
Related Questions
-
Does Google ignore ? in URLs?
Hi guys, I have a site where every URL ends with ?v=6cc98ba2045f. Example: https://domain.com/products/cashmere/robes/?v=6cc98ba2045f. Just wondering: does Google ignore what comes after the "?"? Also, any ideas what that parameter is? Cheers.
Intermediate & Advanced SEO | CarolynSC0
-
Merging Pages and SEO
Hi, we are redesigning our website the following way:
Before: Page A with Content A, Page B with Content B, Page C with Content C, etc. (e.g. one page each for Customer Returns, Overstocks, Master Case, etc.)
Now: Page D with Content A + B + C, etc. (e.g. one long page containing all Product Conditions, one after the other)
So we are merging multiple pages into one. What is the best way to do this so that we lose as little traffic as possible? E.g. should we 301 redirect A/B/C to D? Is it likely that we will lose significant traffic with this change? Thank you.
Intermediate & Advanced SEO | viatrading1
-
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection), but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs and they vary, but none are recent and most are from a month ago when the initial hack occurred. My question is: should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
Intermediate & Advanced SEO | peteboyd0
-
Linking to URLs With Hash (#) in Them
How does link juice flow when linking to URLs with a hash tag in them? If I link to this page, which generates a pop-over on my homepage with info about my special offer, where will the link juice go? homepage.com/#specialoffer Will the link juice go to the homepage? Will it go nowhere? Will it go to the hash URL above? I'd like to publish an annual/evergreen sort of offer that will generate lots of links, and instead of driving those links to homepage.com/offer, I was hoping to get that link juice to flow to the homepage, or maybe even a product page, instead, and just update the pop-over information each year as the offer changes. I've seen competitors do it this way but wanted to see what the community here thinks about linking to URLs with a hash tag in them. Could this also be a use case for using hash tags in URLs for tracking purposes?
Intermediate & Advanced SEO | MiguelSalcido0
-
Attack of the dummy URLs -- what to do?
It occurs to me that a malicious program could set up thousands of links to dummy pages on a website: www.mysite.com/dynamicpage/dummy123, www.mysite.com/dynamicpage/dummy456, etc. How is this normally handled? Does a developer have to look at all the parameters to see if they are valid and, if not, automatically return a 301 redirect or a 404 Not Found? This requires a table lookup of acceptable URL parameters for all new visitors. I was thinking that bad URL names would be rare, so it would be OK to just stop the program with a message, until I realized someone could intentionally set up links to non-existent pages on a site.
Intermediate & Advanced SEO | friendoffood1
-
If I own a .com URL and also have the same URL with .net, .info, and .org, should I point them to the .com IP address?
I have a domain, for example, mydomain.com, and I purchased mydomain.net, mydomain.info, and mydomain.org. Should I point the host @ to the IP where the .com is hosted on WP Engine? I am not doing anything with the .org, .info, and .net domains. I simply purchased them to prevent competitors from buying the domains.
Intermediate & Advanced SEO | djlittman0
-
How to fix issues regarding URL parameters?
Today, I was reading Google's help article on URL parameters: http://www.google.com/support/webmasters/bin/answer.py?answer=1235687. I came to know that Google gives value to URLs which have parameters that change or determine the content of a page. There are too many pages on my website with similar values for name, price and number of products, but I have restricted all such pages in robots.txt with the following syntax.
URLs:
http://www.vistastores.com/table-lamps?dir=asc&order=name
http://www.vistastores.com/table-lamps?dir=asc&order=price
http://www.vistastores.com/table-lamps?limit=100
Syntax in robots.txt:
Disallow: /*?dir=
Disallow: /*?p=
Disallow: /*?limit=
Now I am confused. Which is the best solution to get the maximum SEO benefit?
Intermediate & Advanced SEO | CommercePundit
-
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax as the method for allowing users to select filters. We want to dynamically insert elements into the URL as the filters are selected so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using the Symfony framework. We've used www.gizmodo.com as an example of how to achieve SEO- and user-friendly URLs, but this is only an example of achieving it for static content. We would prefer to go down a route that didn't involve hashbangs if possible. Does anyone have any experience using hashbangs, and how did it affect their site? Any advice on the above would be gratefully received.
Intermediate & Advanced SEO | Sayers1