BERT: Largest Google Update In Five Years


It’s the biggest algorithm change in five years, and it affects every tenth search query: the Google BERT update aims to improve the understanding of complex long-tail searches and to show more relevant results. Using Natural Language Processing, Google can now better understand the semantic context of a search query. The BERT update has already been rolled out – at least for determining Featured Snippets.

Has the Google BERT update been rolled out?

According to a blog post by Pandu Nayak, Google’s Vice President of Search, BERT has been active since October 24, 2019. Where BERT is already active differs by area and country:

  • Organic search results: To start, BERT has been rolled out for US search results on Google.com, where it affects about one in ten queries. It will gradually be rolled out to other country indices, Nayak said in his blog post. Danny Sullivan also made clear that there is no release timeline for other countries.
  • Featured Snippets: The implementation of BERT for finding the appropriate Featured Snippet – shown above the organic search results at Position 0 as text, table, or list – is slightly different. Here, as Danny Sullivan explained via Twitter, BERT is already being used in all 25 languages for which Google displays Featured Snippets.

What does BERT mean?

The acronym BERT stands for Bidirectional Encoder Representations from Transformers and refers to an algorithm model based on neural networks. Using Natural Language Processing (NLP), machine systems attempt to adequately grasp the complexity of human language. Very good, detailed documentation on BERT can be found on the Google AI Blog.

Put more simply, Google uses BERT to better understand the context of a search query and to interpret each word within that context. This breakthrough is based on computational models called Transformers: they relate a word to all the other words in a sentence, rather than processing it word by word. This is especially important for understanding prepositions and the position of individual words within a search query.
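To make the idea of bidirectional context a little more concrete, here is a minimal sketch using the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint – a research model, not the system Google runs in Search. The same masked word receives different predictions depending on the words to its right as well as to its left.

```python
# Minimal sketch of BERT's bidirectional masked-language-model idea, using the
# open-source Hugging Face `transformers` library and the public
# `bert-base-uncased` checkpoint (not Google's production search system).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The masked slot has identical words to its left in both sentences; only the
# words to its *right* differ. A purely left-to-right model could not tell
# the two apart at the masked position.
sentences = [
    "She went to the [MASK] to deposit her paycheck.",
    "She went to the [MASK] to catch a fish.",
]

for sentence in sentences:
    print(sentence)
    for candidate in fill_mask(sentence, top_k=3):
        print(f"  {candidate['token_str']}: {candidate['score']:.3f}")
```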

Why is the BERT update so important to Google?

According to Google, about 15 percent of all daily search queries are being entered for the first time. In addition, search queries are becoming more and more similar to real human communication, driven by technical developments such as voice search.

The statistics service Comscore expects voice search to account for 50 percent of all searches within two years. Search queries are also getting longer and longer – 70 percent of them can now be assigned to the long tail. People expect Google to deliver the most precise answer to entire sets of questions in a split second – and BERT now plays a significant part in the technology that makes this possible.

That is why Google has been working for many years on neural networks that are able to better understand complex search queries.

  • Hummingbird: In 2013, Hummingbird became part of Google’s algorithm. This change allowed Google to interpret the entire search query rather than matching individual words within it.
  • RankBrain: In 2015, RankBrain was added to the Google algorithm and declared the third most important Google ranking factor. It enabled the processing of ambiguous and complex queries beyond classic long-tail searches. RankBrain also allowed Google, for the first time, to better handle colloquial speech, dialogue-style queries, and newly coined words.

Which queries does BERT affect?

BERT primarily affects long-tail searches. Here, BERT can better understand the context of a longer search query, whether it is typed into the search bar as a question or phrase or spoken as a voice query.

In the blog post mentioned above, Google itself gives a few examples of search queries that the algorithm can now understand better thanks to BERT, delivering correspondingly better results.

Example 1: “2019 Brazil traveler to usa need a visa.”

In this example of an organic search result, Google explains that, until now, the word “to” and its relationship to the other words in the query had been underestimated.


However, it is fundamental to understanding the query – after all, it is about a person from Brazil who wants to travel to the US, not the other way around. The new BERT model now allows Google to answer the search query according to the actual search intent.
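As a rough illustration of why word order matters to a Transformer – again using the open-source transformers library and the public bert-base-uncased checkpoint, not Google’s ranking code – the sketch below compares the two directions of travel. A bag-of-words view of the two queries is identical, but their contextual embeddings are not, because self-attention takes the position of “to” relative to “brazil” and “usa” into account.

```python
# Rough illustration (not Google's ranking code): the two queries contain
# exactly the same words, but BERT's contextual embeddings differ because
# self-attention encodes which country the word "to" points at.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states into a single query vector."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)

query_a = embed("2019 brazil traveler to usa need a visa")
query_b = embed("2019 usa traveler to brazil need a visa")

# Same bag of words, different direction of travel -> different vectors.
similarity = torch.cosine_similarity(query_a, query_b, dim=0).item()
print(f"cosine similarity: {similarity:.4f}")  # high, but not identical
```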

Example 2: “Parking on a hill with no curb”

In this example, which concerns the selection of the appropriate Featured Snippet, Google explains in its blog post that the algorithm previously focused too heavily on the word “curb” and ignored the word “no”.


Accordingly, a Featured Snippet was shown that was not very helpful, because it said the exact opposite of what the search was actually about.
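The mechanism behind this example can be hinted at with the same kind of masked-prediction sketch as above, again with the public bert-base-uncased checkpoint rather than Google’s snippet system. The exact top predictions will vary by checkpoint; the point is only that the prediction for the masked word is conditioned on the whole sentence, including the negating “no”.

```python
# Hedged sketch: the masked prediction is conditioned on the *entire* sentence,
# so flipping "a curb" to "no curb" changes the distribution over candidates.
# Uses the public `bert-base-uncased` checkpoint, not Google's snippet system;
# the exact top predictions depend on the checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

with_curb = "When parking on a hill with a curb, turn your wheels toward the [MASK]."
no_curb = "When parking on a hill with no curb, turn your wheels toward the [MASK]."

for sentence in (with_curb, no_curb):
    print(sentence)
    for candidate in fill_mask(sentence, top_k=3):
        print(f"  {candidate['token_str']}: {candidate['score']:.3f}")
```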

What can SEOs and webmasters do?

There is no comprehensive answer to this. There are no simple tactics that will suddenly make a website perform much better or much worse. Instead, it is hugely important not to write for an algorithm, but for the people – the potential users and buyers – who visit and interact with a website.
