XML Sitemap and unwanted URL parameters
-
We currently don't have an XML sitemap for our site. I generated one with Screaming Frog and it looks OK, but it also contains my tracking URL parameters (ref=), which I don't want Google to use, as specified in Google Webmaster Tools. Cleaning it up will take time and effort I don't currently have. I also think that having a sitemap could help us on Bing.
So my question is: is it better to submit a "so-so" sitemap than to have none at all, or are the risks just too high? Could you explain what could go wrong?
Thanks!
-
Our IT department is on a big project and we won't have any support for almost a year, which is why I was looking at other solutions.
We currently add about 10 to 20 pages a month, so I could probably redo the sitemap once a month, right after the new content is published.
-
Glad I could help
The only other issue I see with this is that your sitemap will get outdated quickly if a lot of content/pages are being added to your site. Additional work or development may be needed to create a dynamic sitemap that auto-updates alongside the website.
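If rebuilding the file by hand becomes a chore, the monthly update can be scripted. Here's a minimal sketch in Python that rebuilds a sitemap from a plain-text list of URLs; the file names ("urls.txt", "sitemap.xml") and the lastmod value are illustrative assumptions, not something from this thread.

```python
# Minimal sketch: rebuild sitemap.xml from a plain-text list of URLs.
# "urls.txt" and "sitemap.xml" are placeholder file names.
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(url_file="urls.txt", out_file="sitemap.xml"):
    today = date.today().isoformat()
    # De-duplicate and sort the URL list before writing it out.
    with open(url_file) as f:
        urls = sorted({line.strip() for line in f if line.strip()})

    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc><lastmod>{today}</lastmod></url>"
        for u in urls
    )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>\n"
    )
    with open(out_file, "w", encoding="utf-8") as f:
        f.write(xml)

if __name__ == "__main__":
    build_sitemap()
```

Running it right after the new pages go live each month would keep the sitemap current without any IT involvement.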
-
Thanks, I really like your answer.
I should have thought about cleaning it in Excel. I will get right on it!
-
Hi Jean-Francois
I would try to keep your sitemap as clean as possible, but you could export all the data to a CSV and clean up the URLs with a formula. If you have the full list of your URLs in column A in Excel, use the following formula:
=IFERROR(LEFT(A1,FIND("ref=",A1)-2),A1)
Put this formula into cell B1 and drag it down all the rows. It strips out the ref= parameter (and the ? or & in front of it) and leaves URLs without the parameter unchanged. Then simply remove the duplicates and you have your list of URLs for a clean sitemap.
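If the list is long or the cleanup needs repeating every month, the same thing can be scripted. A minimal sketch in Python, where the input and output file names ("urls.txt", "clean-urls.txt") are placeholder assumptions rather than anything from this thread:

```python
# Minimal sketch: drop the ref= tracking parameter from every URL in a list
# and de-duplicate the result. File names are placeholder assumptions.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_ref(url: str) -> str:
    parts = urlsplit(url)
    # Keep every query parameter except the ref= tracking parameter.
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k != "ref"]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

with open("urls.txt") as f:
    cleaned = sorted({strip_ref(line.strip()) for line in f if line.strip()})

with open("clean-urls.txt", "w") as f:
    f.write("\n".join(cleaned) + "\n")
```

Unlike the Excel approach, this keeps any other query parameters intact and only removes ref=, so no manual de-duplication step is needed afterwards.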
Let me know if this helps.
Related Questions
-
URL path randomly changing
Hi everyone, a quick question about URL structures: I'm currently working in ecommerce with a site that has hundreds of products that can be accessed through different URL paths: 1) www.domain.com/productx 2) www.domain.com/category/productx 3) www.domain.com/category/subcategory/productx 4) www.domain.com/bestsellers/productx 5) ... In order to get rid of duplicate content issues, the canonical tag has been installed on all the required pages. The problem I'm seeing now is this: if a visitor comes to the site and navigates to a product through example 2), the URL shown in the browser address bar is sometimes example 4), sometimes example 1), or whatever, so it is constantly changing. Does anyone know why this happens and whether it has any impact on GA tracking or even on SEO performance? Any reply is much appreciated. Thank you
Technical SEO | ennovators0
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, that tool only works for websites with under 500 URLs, and the paid version doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can: crawl and list all the indexed URLs on a domain, including .pdf and .doc files (ideally exported to a .xls or .txt file); and crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them). Seems pretty simple, but we haven't been able to find something that isn't tailored toward managing a single domain, or that can crawl a huge volume of content.
Technical SEO | timfrick0
-
301 Redirects Relating to Your XML Sitemap
Let's say you've got a website that had quite a few pages which, for lack of a better term, were like an infomercial: 6-8 pages on slightly different topics all essentially saying the same thing. You could all but call it spam. www.site.com/page-1 www.site.com/page-2 www.site.com/page-3 www.site.com/page-4 www.site.com/page-5 www.site.com/page-6 Now you've decided to consolidate all of that information into one well-written page, and while the previous pages may have been a bit spammy, they did indeed have SOME juice to pass along. Your new page is www.site.com/not-spammy-page, and you 301 redirect the previous 'spammy' pages to it. Now the question: do I immediately re-submit an updated XML sitemap to Google, which would NOT contain the old URLs, which makes me assume Google would miss the 301 redirects/SEO juice? Or do I wait a week or two, allow Google to re-crawl the site and see the existing 301s, and submit an updated sitemap once they've taken notice of the changes? Probably a stupid question, I understand, but I want to ensure I'm following best practices given the situation. Thanks guys and girls!
Technical SEO | Emory_Peterson0
-
Is there a maximum sitemap size?
Hi all, over the last month we've included all images, videos, etc. in our sitemap and now its loading time is rather high (http://www.troteclaser.com/sitemap.xml). Is there a maximum sitemap size recommended by Google?
Technical SEO | Troteclaser0
-
Should all pagination pages be included in sitemaps
How important is it for a sitemap to include all the individual URLs for paginated content? Assuming the rel="next" and rel="prev" tags are set up, would it be OK to just have page 1 in the sitemap?
Technical SEO | Saijo.George0
-
XML Sitemap without PHP
Is it possible to generate an XML sitemap for a site without PHP? If so, how?
Technical SEO | jeffreytrull11
-
Should my URLs be uppercase or lowercase?
I'm in the middle of doing a bunch of 301 redirects for my site. Should I make them lowercase, uppercase, or does it matter? Also, do I want to be using hyphens (-) or underscores (_)? Any other tips? EX: http://www.stupid.com/golf-slippers.html OR http://www.stupid.com/Golf-Slippers.html
Technical SEO | JustinStupid0
-
Should XML sitemaps include *all* pages or just the deeper ones?
Hi guys, OK, this is a bit of a sitemap 101 question but I can't find a definitive answer: when we're putting out XML sitemaps for Google to chew on (we're talking ecommerce and directory sites with many pages inside sub-categories here), is there any point in mentioning the homepage or even the second-level pages? We know Google is crawling and indexing those, and we're thinking we should trim the fat and just send a map of the bottom-level pages. What do you think?
Technical SEO | timwills0