Search Engine Optimization October 30th, 2019
Last week, Google announced the BERT update to its search ranking system, calling it the most significant update in five years, since the RankBrain update in 2015.
According to the announcement, the update will affect 10% of queries, meaning Google can now better understand one out of ten queries the way a human would. With this update, Google pays more attention to the context of a search and to each word in the query.
The technology behind this new update is a neural network-based technique for natural language processing (NLP) called “Bidirectional Encoder Representations from Transformers” or BERT.
The Google BERT update was officially announced on October 25, 2019, though it had reportedly already been rolling out for several days. Google first talked about BERT last year, when it open-sourced the code for its implementation along with pre-trained models.
Transformers are one of the most recent advances in machine learning. They excel at modeling sequential data, which makes them a useful tool for natural language processing and search queries. The technique teaches systems to understand the context in which a word appears and the order of the words around it.
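To make that idea concrete, here is a minimal sketch of the scaled dot-product self-attention at the heart of a Transformer. This is purely illustrative (toy dimensions, random weights), not Google's production code; the point is that every token attends to every other token in both directions, which is what "bidirectional" refers to in BERT's name.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of word vectors.

    X: (seq_len, d_model) matrix, one row per token. Each token's output
    is a weighted mix of ALL tokens -- left and right context alike.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise token relevance
    weights = softmax(scores)                # each row sums to 1
    return weights @ V, weights

# Toy example: 4 tokens with 8-dimensional embeddings (random values).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)            # one contextualized vector per token
print(weights.sum(axis=1))  # attention weights normalize to 1 per token
```

Because the attention weights are computed over the whole sequence at once, a word like "to" can influence, and be influenced by, every other word in the query, rather than being treated in isolation.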
This BERT update also marks the first time Google is using its latest Tensor Processing Unit (TPU) chips to serve search results.
According to Google, this new update will affect complicated search queries that depend on the search context.
This is what Pandu Nayak, Google’s Vice President of Search has to say about their new update:
“At Google’s core, Search is understanding the language we use. And by applying BERT models to both rankings and featured snippets in search, we’re able to understand and do a much better job helping everyone find useful information in their search results. When it comes to ranking search results, BERT will help search better understand 1 in 10 searches in the U.S. in English.
Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. And as a result, you can search in a way that feels natural for you.”
BERT brings more and better results, said Pandu Nayak, who added that the change will mainly affect queries whose results currently miss the mark.
The new update is more focused on interpreting the intent of search queries better.
Rather than looking at the user’s search query on a word-by-word basis, BERT allows Google to interpret the entire phrase, just as a human would, and give the searcher more accurate results.
Even a small modification, or a single simple word, in a search query can dramatically alter the search intent. However, the update will not apply to 100% of searches: for now, it is used on 1 in 10 English-language searches in the US.
According to Google, BERT is a very complex update that pushes the limits of Google’s hardware, which is most likely why it is only being applied to a limited number of searches.
Google laid out a few examples in its announcement:
In their first example, the search is “2019 brazil traveler to usa need a visa”. To understand the meaning of this search, it’s essential to understand the word “to” and its relationship to the other words in the query.
Here’s another search query: “do estheticians stand a lot at work.” Previously, results were chosen by keyword matching, pairing the term “stand-alone” in a result with the word “stand” in the query, even though that isn’t the right use of the word “stand” in this context. The BERT model, on the other hand, understands that “stand” here relates to the physical demands of the job, and returns a more relevant response.
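To see why word-by-word matching goes wrong here, consider a toy keyword matcher. This is a deliberately naive illustration (the page snippets and scoring are invented, not Google’s algorithm): it counts overlapping words with no notion of context, and so ranks a “stand-alone” page above the page that actually answers the question.

```python
def keyword_score(query, document):
    """Naive keyword overlap: count how many query words appear in the
    document, ignoring context entirely. Hyphens are split so that
    "stand-alone" matches the query word "stand"."""
    doc_words = set(document.lower().replace("-", " ").split())
    return sum(1 for word in query.lower().split() if word in doc_words)

query = "do estheticians stand a lot at work"

# Two made-up page snippets: the first is off-topic but keyword-rich,
# the second actually addresses the physical demands of the job.
pages = {
    "stand-alone page": "Stand-alone esthetician programs let you work at your own pace",
    "relevant page": "Estheticians spend much of the day on their feet helping clients",
}

for name, text in pages.items():
    print(name, keyword_score(query, text))
```

The off-topic page wins on raw word overlap (“stand”, “at”, “work”) even though it misinterprets “stand”; a context-aware model like BERT is meant to avoid exactly this failure mode.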
Here’s one more: “Can you get medicine for someone pharmacy.” With the BERT model, the results reflect the context: the phrase “for someone” is understood as an important part of this query. The previous results missed that meaning and showed general pages about filling prescriptions.
The new update is here to stay! The Google BERT update is one of the most significant updates in recent years. Because it focuses on providing more context-aware search results, there is no need to worry about being penalized; the update is about recognizing search intent better.
Understanding natural language is a complex and ongoing challenge for Google, and the company admits that, even with BERT, it won’t get things right 100% of the time.
Keep an eye on your search results and let us know what you see on your end with this new update!