Can I put robots.txt on my "sorry" server?


I'm considering whether I should put the following robots.txt on my "sorry server", which returns a message telling our customers that we are under maintenance:

User-agent: *
Disallow: /

So here are my concerns/questions:

  1. Won't it tell crawlers never to index our site again, even though our server will be back once the maintenance is done?

  2. If I put that robots.txt on my sorry server, should I put another robots.txt on our regular server that tells crawlers to "please index our site"?

  3. [EDIT] Speaking of extremes, won't it delete our site from Google?


#1

You should not use robots.txt for that situation.

A bot that fetches the robots.txt while you’re in maintenance mode might cache it and keep applying its rules after your site is back online (with a changed robots.txt that the bot won’t see for some time). Conversely, a bot that fetches the robots.txt while your site is online might still apply those rules while your site is in maintenance mode.

Instead, you should respond with the appropriate HTTP status code for maintenance mode: 503 Service Unavailable. This signal alone should be sufficient (at least for reasonably capable bots) not to index the pages.
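As a minimal sketch of such a sorry server in Python (the port, message, and Retry-After value are made up; in practice you would more likely configure this in nginx or Apache):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answers every request with 503 Service Unavailable."""

    def do_GET(self):
        body = b"Sorry, we are down for maintenance."
        # 503 signals a *temporary* outage, so crawlers come back later
        # instead of dropping the pages from their index.
        self.send_response(503)
        # Optional hint at when to retry (seconds, or an HTTP-date).
        self.send_header("Retry-After", "3600")
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the sketch quiet

# To run it:
#     HTTPServer(("0.0.0.0", 8080), MaintenanceHandler).serve_forever()
```

The important part is only the status line and the Retry-After header; any server that emits those for every URL during maintenance achieves the same effect.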

In addition, you could provide a meta-robots element with a noindex value, or the corresponding HTTP header X-Robots-Tag (see example).
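To illustrate, a hypothetical helper for the headers a maintenance response could send; the X-Robots-Tag header is equivalent to putting <meta name="robots" content="noindex"> in the page's HTML:

```python
def sorry_headers(retry_after_seconds=3600):
    """Headers for a maintenance ("sorry") response.

    Hypothetical helper: combines the temporary-outage hint (Retry-After)
    with an explicit noindex hint (X-Robots-Tag).
    """
    return {
        "Retry-After": str(retry_after_seconds),
        "X-Robots-Tag": "noindex",
    }
```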