Locale-Adaptive Web Pages Now Get Support from Google

Google recently introduced new Googlebot crawl configurations for locale-adaptive pages and explained how such pages are crawled and indexed.

What are Locale-Adaptive Pages?

Web pages that dynamically change their content based on the user's country of origin or browser language settings are called locale-adaptive pages.

Barry Schwartz reports that Google can now handle such content by sending Googlebot from IP addresses around the world, as well as by letting it set language preferences in its requests.

Google will use two methods of crawling and indexing:

  1. Geo-distributed crawling: Googlebot uses IP addresses that appear to be from outside the USA, in addition to the US-based IP addresses it has used so far.
  2. Language-dependent crawling: Googlebot crawls with an Accept-Language HTTP header set in the request.
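
The language-dependent variant is easy to picture from the client side. Below is a minimal Python sketch (the URL is a hypothetical placeholder) that requests the same page with different Accept-Language headers, which is roughly what the new crawl configuration does; the geo-distributed variant instead varies the source IP address, which a simple client like this cannot simulate.

```python
import urllib.request

def fetch(url: str, lang: str) -> bytes:
    # Language-dependent crawling: the crawler adds an Accept-Language
    # header so locale-adaptive servers return that locale's content.
    req = urllib.request.Request(url, headers={"Accept-Language": lang})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Compare what the same URL serves for different language preferences.
for lang in ("en-US,en;q=0.9", "de-DE,de;q=0.9", "fr-FR,fr;q=0.9"):
    print(lang, len(fetch("https://example.com/", lang)))
```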

It is important to understand how Googlebot recognizes that a website serves locale-specific content, which is what triggers these new crawl settings.

Sites typically take one of three approaches:

  1. Serving different content on the same URL, based on the user's perceived country (geolocation)
  2. Serving different content on the same URL, based on the Accept-Language field set by the user's browser in the HTTP request header (sketched below)
  3. Completely blocking access to requests from specific countries
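
To make the second approach concrete, here is a minimal sketch of a locale-adaptive server using only Python's standard library. The page contents and the two-letter language matching are simplified placeholders; a real site would do full language negotiation, and the first approach would look similar but keyed on a GeoIP lookup of the client address rather than on a header.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical per-language content; a real site would render full pages.
PAGES = {"en": b"<p>Welcome</p>", "de": b"<p>Willkommen</p>", "fr": b"<p>Bienvenue</p>"}

class LocaleAdaptiveHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Choose content from the Accept-Language request header, falling
        # back to English. Matching only the two-letter primary tag is a
        # simplification; real negotiation also honors q-values.
        accept = self.headers.get("Accept-Language", "en")
        lang = next((t.strip()[:2] for t in accept.split(",")
                     if t.strip()[:2] in PAGES), "en")
        body = PAGES[lang]
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        # Tell caches and crawlers that this URL's response varies by language.
        self.send_header("Vary", "Accept-Language")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), LocaleAdaptiveHandler).serve_forever()
```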

This update addresses the difficulty Google's crawlers have had understanding sites that use locale-adaptive techniques. In its Webmaster Central announcement, Google said that "new crawling configurations are enabled automatically for pages we detect to be locale-adaptive, you may notice changes in how we crawl and show your site in Google search results without you altering your CMS or server settings".

What does this mean for Webmasters?

  • Sites with content specific to different countries or languages should continue to use separate URLs or country-specific TLDs.
  • rel="alternate" hreflang annotations on those separate URLs should also be continued (see the first sketch after this list).
  • Geo-distributed Googlebot crawls can be verified with a reverse DNS lookup (see the second sketch after this list).
  • The robots exclusion protocol needs to be consistent across the site; robots meta tags and the robots.txt file should specify the same directives for each locale.
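
On the hreflang point: each locale version of a page should annotate every alternate version, including itself. A small illustrative Python sketch (the URLs are hypothetical) that generates the link tags:

```python
# Hypothetical locale-specific alternate URLs for a single page.
ALTERNATES = {
    "en": "https://example.com/en/page",
    "de": "https://example.com/de/page",
    "fr": "https://example.com/fr/page",
}

def hreflang_links() -> str:
    # Every locale version of the page should list all alternates,
    # including a link to itself, in its <head>.
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in ALTERNATES.items()
    )

print(hreflang_links())
```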
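On verification: Google's recommended check is a reverse DNS lookup followed by a forward confirmation. The crawler's IP should resolve to a googlebot.com or google.com hostname, and that hostname should resolve back to the same IP. A minimal sketch:

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Verify a crawler IP via reverse DNS plus forward confirmation."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except socket.herror:
        return False
    # Genuine Googlebot hosts live under googlebot.com or google.com.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # The claimed hostname must resolve back to the original IP.
        forward = {info[4][0] for info in socket.getaddrinfo(host, None)}
    except socket.gaierror:
        return False
    return ip in forward
```

A spoofed user agent fails this check because the attacker's IP will not reverse-resolve to a Google-owned hostname.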

Conclusion

With Google's latest announcement of support for locale-adaptive pages, it becomes easier to get these dynamically changing pages indexed in Search.

Be ready to see changes in crawl rates and in how your pages appear on SERPs over the next few weeks. Contact Position² to deploy locale-adaptive techniques for your website now!

Team Position2

February 18, 2015
