What is Google's 15MB Googlebot Limit?

By RK Desk

Google has recently updated its Googlebot help document on crawling to explain that the 15MB fetch size limit also applies to each fetch of the individual subresources referenced in the HTML, i.e. JavaScript and CSS files.

Several months ago, Google added details pertaining to the 15MB limit, which led to a lot of concern in the SEO industry.

What does the updated help document say?

"Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Each resource referenced in the HTML such as CSS and JavaScript is fetched separately, and each fetch is bound by the same file size limit. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers, for example Googlebot Video and Googlebot Image, may have different limits."

What did it say earlier? "Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Any resources referenced in the HTML such as images, videos, CSS, and JavaScript are fetched separately. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers may have different limits."

Gary from Google said on LinkedIn: "PSA from my inbox: The 15MB resource fetch threshold applies on the JavaScript resources, too, so if your JavaScript files are larger than that, your site might have a bad time in Google Search. See googlebot.com for information about the fetch features and limitations. Yes, there are JavaScript resources out there that are larger than 15MB. No, it's not a good idea to have JavaScript files that are that large."
