Are Big Robots.txt Files A Problem For Google?

As many SEOs know, robots.txt files are an important part of your website's SEO, although they're just "a small part" of SEO overall.

A robots.txt file tells search engines like Google and Bing which pages they can access and index on your website and which pages they can't. Keeping search engines away from certain pages on your site is essential both for your site's privacy and for your SEO.
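As a quick illustration of how those disallow rules are read, here's a minimal sketch using Python's built-in robots.txt parser; the domain and paths are hypothetical, not from the hangout.

    # Minimal sketch (domain and paths are hypothetical) of how
    # robots.txt disallow rules decide what a crawler may fetch.
    import urllib.robotparser

    rules = [
        "User-agent: *",
        "Disallow: /private/",   # keep crawlers out of this section
        "Disallow: /checkout/",
    ]

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(rules)

    print(rp.can_fetch("Googlebot", "https://example.com/private/account"))  # False
    print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))        # True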

During a Google Search Central SEO office-hours hangout, Google's John Mueller weighed in on robots.txt files and whether it's good SEO practice to keep them to a reasonable size.

David Zieger, SEO manager for a large German news publisher, joined John in the stream and asked whether larger robots.txt files might be a concern. In this case, he's talking about files that run over 1,500 lines, with a big list of disallows that keeps growing over time. If you want to see where your own file stands, a quick check like the sketch below can help.
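This is a rough sketch (the URL is a placeholder for your own site) that counts the lines, bytes, and disallow rules in a live robots.txt file.

    # Rough sketch (placeholder URL) for measuring how large a
    # robots.txt file has grown, in lines, bytes, and disallow rules.
    import urllib.request

    with urllib.request.urlopen("https://example.com/robots.txt") as resp:
        body = resp.read()

    lines = body.decode("utf-8", errors="replace").splitlines()
    disallows = [l for l in lines if l.strip().lower().startswith("disallow:")]

    print(f"{len(lines)} lines, {len(body)} bytes")
    print(f"{len(disallows)} disallow rules")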

Are there really negative SEO effects that can result from a huge robots.txt file?

Check out the video below for the answer from John Mueller himself!

Scott Davenport

