Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
My website hasn't been cached for over a month. Can anyone tell me why?
-
I have been working on an eCommerce site www.fuchia.co.uk.
I asked an earlier question about how to get it working and ranking, took on board what people said (such as optimising product pages, etc.), and I think I'm getting there.
The problem I have now is that Google hasn't indexed my site in over a month, and the homepage cache 404s when I check it on Google. At the moment the site is live for both the WWW and non-WWW versions; I have told Google in Webmaster Tools which preferred domain to use, and will also be getting the developers to 301 redirect to the preferred domain. Would this be the problem stopping Google from properly indexing me? Also, only around 30 of 137 pages were indexed from the last crawl.
Can anyone suggest why my site hasn't been indexed in such a long time?
Thanks
-
Fair point about the Sitemap. Thanks a lot, I'll take these on board and see what happens from there.
Thanks,
-
A cache won't be built or updated overnight, so sometimes the first few caches are a waiting game. How long has this site been live? If it's fairly new, what you're experiencing is common. If it's an older site and you recently started changing a lot of the technical stuff - redirects, canonicals, etc. - it may just take a little while to settle in.
The other major recommendation I would give you is to change your sitemap "change frequency" to be slightly more accurate. Does this page http://www.fuchia.co.uk/products/clothing/dresses/dog-tooth-print-dress.aspx really change "daily"? By having daily on every page, you aren't helping Google prioritize its crawl, which means you may get a cache for your dog tooth print dress before you get a new cache for your main page.
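For illustration, a sketch of what more accurate entries might look like in the sitemap - the changefreq and priority values here are hypothetical and should reflect how often each page actually changes:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: changes often, so a short change frequency makes sense -->
  <url>
    <loc>http://www.fuchia.co.uk/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- A product page that rarely changes: don't ask Google to recrawl it daily -->
  <url>
    <loc>http://www.fuchia.co.uk/products/clothing/dresses/dog-tooth-print-dress.aspx</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>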
So I would fix that, resubmit the sitemap, and then it's a waiting game. It could be a week, could be two; I've seen it take almost a month, but not if you use G+.
-
Hi Matt,
I used PingDevice and it's pinging fine.
I will work on the Google+ suggestion.
I have resubmitted a sitemap for both fuchia.co.uk and www.fuchia.co.uk, as I verified ownership of both to allow me to set the preferred domain. I submitted one this morning, so maybe that will help. But we will see.
It seems like the main priority at the moment is getting everything redirected and canonicalised, and seeing if that helps anything.
-
Hi Sanket,
The site has been live for around 3 months I would say.
-
I've found that if you manually ping Google, they often update their cache at the same time.
Google doesn't have a cache for either cache:www.fuchia.co.uk or cache:fuchia.co.uk, so I don't think it's a canonical issue.
I would suggest a few things:
-
Use PingDevice http://www.pingdevice.com/
-
Put your main domain in a Google Plus post every now and then.
-
Resubmit a sitemap. Usually this gets you crawled fairly quickly and possibly updates your cache - you can also ping Google with your sitemap URL directly, as in the example below.
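A note on the manual ping: Google accepts a plain HTTP request to its sitemap ping endpoint. A minimal example, assuming the sitemap lives at /sitemap.xml (adjust the path to wherever yours actually is):

http://www.google.com/ping?sitemap=http://www.fuchia.co.uk/sitemap.xml

Requesting that URL in a browser (or with curl) tells Google the sitemap has changed and usually prompts a recrawl.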
-
Hi,
Your site is accessible both with and without WWW, which is a major problem - you need to do a proper 301 redirect in your .htaccess file. You also need to implement rel=canonical on your site; I did not find that code. I see 243 of your site's pages are indexed by Google. Can I ask about the age of your domain? When did you take this site live?
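To make that concrete, a rough sketch of the redirect, assuming the site is served by Apache with mod_rewrite enabled and that www is the preferred version (note that the .aspx URLs suggest the site may actually run on IIS, in which case the equivalent rule belongs in web.config instead):

RewriteEngine On
# Redirect the bare domain to the www version with a permanent 301
RewriteCond %{HTTP_HOST} ^fuchia\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.fuchia.co.uk/$1 [R=301,L]

And the canonical tag each page would carry in its <head>, pointing at the preferred version of that page, e.g. for the homepage:

<link rel="canonical" href="http://www.fuchia.co.uk/" />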
Related Questions
-
Moved company 'Help Center' from Zendesk to Intercom, got lots of 404 errors. What now?
Howdy folks, excited to be part of the Moz community after lurking for years! I'm a few weeks into my new job (Digital Marketing at Rewind) and about 10 days ago the product team moved our Help Center from Zendesk to Intercom. Apparently the import went smoothly, but it's caused one problem I'm not really sure how to go about solving:

https://help.rewind.io/hc/en-us/articles/*** is where all our articles used to sit
https://help.rewind.io/*** is where all our articles now are

So, for example, the following article has now moved as such:

https://help.rewind.io/hc/en-us/articles/115001902152-Can-I-fast-forward-my-store-after-a-rewind-
https://help.rewind.io/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind

This has created a bunch of broken URLs in places like our Shopify/BigCommerce app listings, in our email drips, and in external resources, etc. I've played whack-a-mole cleaning many of these up, but these old URLs are still indexed by Google - we're up to 475 Crawl Errors in Search Console over the past week, all of which are 404s.

I reached out to Intercom about this to see if they had something in place to help, but they just said my "best option is tracking down old links and setting up 301 redirects for those particular addressed". Browsing the Zendesk forums turned up some relevant-ish results, with the leading recommendation being to configure JavaScript redirects in the Zendesk document head (thread 1, thread 2, thread 3) of individual articles.

I'm comfortable setting up 301 redirects on our website, but I'm in a bit over my head in trying to determine how I could do this with content that's hosted externally and sitting on a subdomain. I have access to our Zendesk admin, so I can go in and edit stuff there, but I don't have experience with JavaScript redirects and have read that they might not be great for such a large-scale redirection.

Hopefully this is enough context for someone to provide guidance on how you think I should go about fixing things (or if there's even anything for me to do), but please let me know if there's more info I can provide. Thanks!
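For reference, a minimal sketch of the kind of JavaScript redirect those Zendesk threads describe - this assumes you can inject a script into the document head, and the mapping contains just the one example article from above (a real mapping would list every old article ID). Note that client-side redirects are not true 301s, so Google treats them less reliably:

<script>
  // Map old Zendesk article IDs to their new Intercom paths (one illustrative entry)
  var redirects = {
    "115001902152": "/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind"
  };
  // Old URLs look like /hc/en-us/articles/<id>-<slug>
  var match = window.location.pathname.match(/^\/hc\/en-us\/articles\/(\d+)/);
  if (match && redirects[match[1]]) {
    window.location.replace("https://help.rewind.io" + redirects[match[1]]);
  }
</script>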
Intermediate & Advanced SEO | henrycabrown1 -
SEO'ing a sports advice website
Hi Team Moz, Despite being in tech/product development for 10+ years, I'm relatively new to SEO (and completely new to this forum), so I was hoping for community advice before I dive in to see how Google likes (or perhaps doesn't) my soon-to-be-built content.

I'm building a site (BetSharper, an early-stage work in progress) that will deliver practical, data-orientated predictive advice prior to sporting events commencing. The initial user personas I am targeting would need advice on specific games, so, as an example, I would build a specific page for the upcoming Stanley Cup Game 1 between the Capitals and the Tampa Bay Lightning.

I'm in the midst of keyword research and believe I have found some easier-to-achieve initial keywords (I'm realistic, building my DA will take time!) that include the team names but don't reference dates or the stage of the tournament. The question is: hypothetically, if I ranked for this page for this sporting event this year, would it make sense to refresh the same page with 2019 matchup content when they meet again next year, or create a new page? I am assuming I would be targeting the same intended keywords, but I'm wondering if I get Google credit for the 2018 engagement after the 2019 refresh. Or should I start fresh with a new page and specifically target keywords afresh each time? I read some background info on canonical tags but wasn't sure if it was relevant in my case.

I hope I've managed to articulate myself on what feels like an edge case within the wonderful world of SEO. Any advice the community delivers would be much appreciated... Kind Regards, James.
Intermediate & Advanced SEO | JB19770 -
Why do some websites rank for keywords they don't have on the page?
Hello guys, Yesterday I used SEMrush to search for the keyword "branding agency" to see the SERP. Liquidagency ranks 5th on the first page, so I went to their homepage but saw no exact keyword "branding agency", even in the page source. Also, I didn't see "branding agency" as a top anchor text in the external links to the page (from the SEMrush report). I am an SEO newbie - can someone explain this to me, please? Thank you.
Intermediate & Advanced SEO | Raymondlee0 -
Why isn't Google indexing my images?
Hello, on my fairly new website Worthminer.com I am noticing that Google is not indexing images from my sitemap. 560 images have been submitted and Google has indexed only 3 of them. Although a few more images are indexed overall, no new images are being added, and I have no idea why. Posts, categories and other URLs are indexing just fine, but images are not. I am using WordPress, and for sitemaps, WordPress SEO by Yoast. Am I missing something here? Why won't Google index my images? Thanks, I appreciate any help, David
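For comparison, this is roughly what a valid image entry should look like in an XML sitemap (the post and image URLs below are made-up placeholders) - it may be worth checking that Yoast's output matches this structure and that the image URLs actually resolve:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.worthminer.com/example-post/</loc>
    <image:image>
      <image:loc>http://www.worthminer.com/wp-content/uploads/example.jpg</image:loc>
    </image:image>
  </url>
</urlset>

Also bear in mind that Google doesn't guarantee it will index every submitted image; the submitted vs indexed counts for images are often far apart even on healthy sites.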
Intermediate & Advanced SEO | Worthminer1 -
Can I tell Google to Ignore Parts of a Page?
Hi all, I was wondering if there was some sort of HTML trick that I could use to selectively tell a search engine to ignore text on certain parts of a page. Thanks!
Intermediate & Advanced SEO | Charles_Murdock | Charles0 -
Can't crawl website with Screaming Frog... what is wrong?
Hello all - I've just been trying to crawl a site with Screaming Frog and can't get beyond the homepage. I have done the usual stuff (turned off JS and so on) and there are no problems there with nav and so on - the site's other pages have been indexed in Google, btw. Now I'm wondering whether there's a problem with this robots.txt file, which I think may be auto-generated by Joomla (I'm not familiar with Joomla...) - are there any issues here? [just checked... and there isn't!]

# If the Joomla site is installed within a folder such as at
# e.g. www.example.com/joomla/ the robots.txt file MUST be
# moved to the site root at e.g. www.example.com/robots.txt
# AND the joomla folder name MUST be prefixed to the disallowed
# path, e.g. the Disallow rule for the /administrator/ folder
# MUST be changed to read Disallow: /joomla/administrator/
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/orig.html
#
# For syntax checking, see:
# http://tool.motoricerca.info/robots-checker.phtml

User-agent: *
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /modules/
Disallow: /plugins/
Disallow: /tmp/

Intermediate & Advanced SEO | McTaggart0 -
Does anyone know of any tools that can help split up an XML sitemap to make it more efficient and better for SEO?
Hello All, We want to split up our sitemap - currently it's almost 10K pages in one XML sitemap, but we want to break it into smaller chunks, splitting it by category or location or both. Ideally 100 URLs per sitemap, which is what I read is the best number to help improve indexation and SEO ranking. Any thoughts on this? Does anyone know of any good tools out there that can assist us in doing this?

Also, another question I have: should we put all of our products (1,250) in one sitemap, or should this also be split up into, say, products per category, etc.?

thanks
Pete
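For what it's worth, the standard mechanism for this is a sitemap index file: you split the URLs into several smaller sitemaps however you like (by category, location, products, and so on), then submit a single index that points at each one. A minimal sketch, with hypothetical file names:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-locations.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>

One advantage of this layout is that Webmaster Tools reports submitted vs indexed counts per child sitemap, so you can see which sections are under-indexed.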
Intermediate & Advanced SEO | PeteC120 -
Culling 99% of a website's pages. Will this cause irreparable damage?
I have a large travel site that has over 140,000 pages. The problem I have is that the majority of pages are filled with duplicate content. When Panda came in, our rankings were obliterated, so I am trying to isolate the unique content on the site and go forward with that.

The problem is, the site has been going for over 10 years, with every man and his dog copying content from it. It seems that our travel guides have been largely left untouched and are the only unique content that I can find. We have 1,000 travel guides in total.

My first question is: would reducing 140,000 pages to just 1,000 ruin the site's authority in any way? The site does use internal linking within these pages, so culling them will remove thousands of internal links throughout the site. Also, am I right in saying that the link juice should now move to the more important pages with unique content, if redirects are set up correctly?

And finally, how would you go about redirecting all these pages? I will be culling a huge number of hotel pages - would you consider redirecting all of these to the generic hotels page of the site?

Thanks for your time, I know this is quite a long one, Nick
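On the redirect mechanics, a hedged sketch assuming Apache and a hypothetical URL structure where the culled hotel pages all live under /hotels/ - a pattern-based rule avoids writing 139,000 individual redirects:

# Send every old hotel detail page to the main hotels page with a 301
RedirectMatch 301 ^/hotels/.+$ http://www.example.com/hotels/

One caveat worth knowing: when Google sees thousands of pages all 301ing to one generic page, it may treat them as soft 404s, so redirecting each page to the most relevant surviving travel guide, where one exists, tends to preserve more value.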
Intermediate & Advanced SEO | Townpages0