Are Big Robots.txt Files A Problem For Google?

As many SEOs know, robots.txt files are an important part of your website's SEO, although they're just "a small part" of SEO overall.

A robots.txt file tells search engines like Google and Bing which pages on your website they can access and index, and which they can't. Keeping search engines away from certain pages is essential for both the privacy of your site and your SEO.
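For illustration, a minimal robots.txt might look something like this (the paths below are hypothetical examples, not recommendations for any particular site):

```
# Apply these rules to all crawlers
User-agent: *
# Hypothetical example paths to keep out of search results
Disallow: /admin/
Disallow: /checkout/

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Each `Disallow` line blocks one path prefix, which is why files like the ones discussed below can balloon to hundreds or thousands of lines.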

During a Google Search Central SEO office-hours hangout, Google's John Mueller weighed in on robots.txt files and whether it's good SEO practice to keep them at a reasonable size.

David Zieger, the SEO manager for a large German news publisher, joined John in the stream and asked whether larger robots.txt files might be a concern. In this case, he's talking about files that are over 1,500 lines, with a big list of disallows that keeps getting added to as time goes on.

Are there really negative SEO effects that can result from a huge robots.txt file?

Check out the video below for the answer from John Mueller himself!

Scott Davenport

Scott Davenport is the content writer and social media manager for Thrive Business Marketing and Thrive HVAC in Portland, Oregon. He writes about current events in the SEO world, as well as tips and advice that fellow SEOs can use to improve their own SEO campaigns, and shares it for the whole world to see!

