I discovered through Webmaster Tools that the file permissions for the robots.txt file at the root of the Web site were set incorrectly, so I changed them to 777:

chmod 777 robots.txt

This should enable Google to crawl and index the site the next time it tries.
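A quick sanity check, assuming the site is reachable at mysite.com (placeholder domain) and curl is available, is to confirm the file's permissions and that it is actually being served:

ls -l robots.txt                        # show the file's current permission bits
curl -I https://mysite.com/robots.txt   # should return HTTP 200 if Google can fetch it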

Use SmartFTP and change the permissions of the robots.txt file to 644.
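If you have shell access rather than FTP, the equivalent command, assuming robots.txt sits in the web root (the path below is hypothetical), is:

cd /var/www/mysite.com    # hypothetical web root
chmod 644 robots.txt      # owner read/write; group and others read-only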


I tried to replace my robots.txt, but without success.
Permissions for the folders are 750 and for the files 640.
I changed the permissions for robots.txt from 640 to 644, but nothing changed; I still can't "Fetch as Google" most of the pages.
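One thing to check: the web server also needs the execute (traverse) bit on every directory above robots.txt, so folders at 750 only work if the server user owns them or is in their group. Assuming the server runs as www-data (it may be apache or nginx on your host), you can test this directly:

ls -ld . robots.txt       # permissions and ownership of the parent directory and the file
sudo -u www-data test -r robots.txt && echo "readable" || echo "not readable"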

Thanks. I usually submit sitemaps to Google, but didn't think I needed to provide a robots.txt.

However, I've just uploaded robots.txt and checked that mysite.com/robots.txt exists and contains
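For illustration only (not necessarily what was uploaded), a minimal robots.txt that allows all crawlers and declares a sitemap looks like this:

User-agent: *
Disallow:
Sitemap: https://mysite.com/sitemap.xml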

Chmod files that you really don't want people to see to 400 (e.g. wp-config.php),
and NEVER chmod 777; if something requires write access, use 766 or 775.
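For example, assuming a standard WordPress layout (wp-content/uploads is the usual directory that needs write access):

chmod 400 wp-config.php          # only the file owner can read it
chmod 775 wp-content/uploads     # owner and group can write; everyone else can only read and enter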

Some guides recommend setting the robots.txt file to 777 permissions so that Googlebot can access it without any complications, but Googlebot only needs read access: world-readable 644 is enough, and 777 is never required.

The worst that can happen as a result of using 777 permissions on a folder or even a file is that, if a malicious cracker or other entity is able to upload a devious file or modify an existing file to execute code, they will have complete control over your blog, including access to your database information and passwords.
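To audit an existing install for risky permissions, assuming the site lives under /var/www/mysite.com (hypothetical path), you can list everything that is world-writable:

find /var/www/mysite.com -perm -0002 -ls    # files and directories writable by anyone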
