Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Temporarily shut down a site
-
What would be the best way to temporarily shut down a site without having a negative impact on SEO?
-
I asked the Q&A associates their opinion, and several people also responded that a 503 would be the way to go.
-
It is due to a legal matter, so we need to shut it down.
-
Can you give us some more details about the shutdown (the reasons, why it needs to be so long, etc)? We can help you a bit better if we know more information.
When we switched from SEOmoz.org to outdoorsrank.com, we were only down for half an hour, if that. If this is about upgrading, is there a testing server that you can use to get the website rebuilt and tested on the testing/staging server before you make it live? We used multiple staging servers to test out the site and did lots of checks so that we had minimal downtime when it came time to move the site.
-
What if it is more than a week?
-
I'm also assuming that you're talking about just a day or two, and not two months. There was a post on Moz last year about this that can also help, in addition to the good info provided by CleverPhD http://outdoorsrank.com/blog/how-to-handle-downtime-during-site-maintenance
-
Appreciate the positive comment EGOL!
-
That was a great answer. Thanks. I didn't know that.
-
Thank you - please mark my response as Good Answer if it helps.
Cheers!
-
Thank you
-
According to Matt Cutts
"According to Google's Distinguished Engineer Matt Cutts, if your website is down for just a day, such as your host being down or a server transfer, there shouldn't be any negative impact on your search rankings. However, if the downtime is extended, such as for two weeks, it could have an impact on your search rankings, because Google doesn't want to send users to a website that it knows has been down; that makes for a poor user experience.
Google does make allowances for websites that have sporadic downtime, so Googlebot will visit again 24 hours later to see if the site is accessible."
That said, what should you show Google?
http://yoast.com/http-503-site-maintenance-seo/
According to Yoast, you should not show a 200 (OK) or a 404 (Not Found), but a 503 (Service Unavailable) on all pages, with a Retry-After header for Google.
The 503 (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html) tells Google: "The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay. If known, the length of the delay MAY be indicated in a Retry-After header. If no Retry-After is given, the client SHOULD handle the response as it would for a 500 response."
The Retry-After header tells Google when to come back. Set it to a generous value so you have plenty of time to get everything back up and running.
Another point from Yoast, who links to https://plus.google.com/+PierreFar/posts/Gas8vjZ5fmB: if the robots.txt file itself returns a 503, Google will stop crawling all your pages (and wasting crawl budget) until it sees a 200 on robots.txt again. So it is key that you set the 503 and Retry-After properly on robots.txt as well.
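Putting those two recommendations together: every path on the site, robots.txt included, should answer with a 503 plus a Retry-After header. Here is a minimal sketch of that behaviour using only the Python standard library; the port, retry window, and page body are illustrative assumptions, and on a real site you would normally configure this at the web-server level rather than in application code.

```python
# Sketch: a maintenance-mode server that answers every request --
# including /robots.txt -- with 503 Service Unavailable and a
# Retry-After header, as the Yoast article recommends.
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.error import HTTPError
from urllib.request import urlopen

RETRY_AFTER_SECONDS = 3600  # be generous: allow time to finish the work

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Same answer for every path, so crawlers see the 503 site-wide.
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1>")

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind to an ephemeral local port and serve in a background thread.
server = ThreadingHTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Both a content page and robots.txt should report 503 + Retry-After.
statuses = {}
for path in ("/some-page", "/robots.txt"):
    try:
        urlopen(f"http://127.0.0.1:{port}{path}")
    except HTTPError as e:  # urlopen raises on 5xx responses
        statuses[path] = (e.code, e.headers.get("Retry-After"))

server.shutdown()
print(statuses)
```

In production the same effect is usually achieved in the server config instead, for example with Apache's `RewriteRule ... [R=503]` plus a `Header always set Retry-After` directive (exact setup depends on your stack).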
Cheers!