Google published an update to the Googlebot help documentation stating that it will stop crawling a web page after the first 15MB of the HTML file. This is a hard limit: any code or content beyond that point will not be accounted for in ranking calculations.
When building your page, identify what's essential for your users and customers. Omit information that's unnecessary to your target audience and keep your focus on what that audience needs – there's an opportunity here to map specific content to the user/customer personas you've developed as a guide.
If you're doing an initial assessment of file sizes across your site, there's an easy way to check the size of an HTML asset built into Google Chrome: the Network panel in Chrome's developer tools reports the size of each document as it loads. If you identify any HTML files over 15MB, we recommend a content audit on those pages to ensure your messaging and content are crawled and indexed.
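For a scripted version of that check, the size comparison can be sketched in Python. The function name and byte convention here are illustrative – Google documents the limit as 15MB but does not publish an exact byte-counting convention – so treat this as a rough screening tool, not an official measure:

```python
# Illustrative check: flag HTML documents whose raw size exceeds
# Googlebot's documented 15MB crawl limit. The 15MB figure comes from
# Google's Googlebot help documentation; the exact byte convention
# (1024-based here) is an assumption.

GOOGLEBOT_HTML_LIMIT = 15 * 1024 * 1024  # 15MB

def exceeds_crawl_limit(html: str) -> bool:
    """Return True if the UTF-8 encoded HTML is larger than the limit."""
    return len(html.encode("utf-8")) > GOOGLEBOT_HTML_LIMIT

# A typical page is far under the limit.
small_page = "<html><body>Hello, world</body></html>"
print(exceeds_crawl_limit(small_page))  # → False
```

In practice you would feed this the raw HTML fetched from each URL in your sitemap; pages that come back `True` are candidates for the content audit described above.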
The recent amendment to Googlebot's help documentation has brought renewed attention to page construction and reminds us of the criteria vital to performing well on search engine result pages (SERPs). Content that targets the right users, and that is crawlable and accessible to bots, is critical to earning the best placement in search results.
Although Google has only recently documented this update, they clarified that this policy was not new – it’s been in effect for years.
High SERP performance is critical to increasing web traffic and exposure. Our team of SEO experts will dig into your business, understanding your goals, challenges, intentions, and target audience. We bring the right combination of technical knowledge and services (like content marketing), and our approach is geared to delivering the results that your business needs to drive growth.
* To learn more about the limit's full details, see the Googlebot help document on the Google Search Central Blog.