Google BERT algorithm: Bidirectional Encoder Representations from Transformers

Google BERT

Google is making the largest change to its search system since it introduced RankBrain almost five years ago.

Google says this will impact 1 in 10 queries, changing the results that rank for those queries.

What is BERT?

It is Google’s neural network-based technique for natural language processing (NLP) pre-training.

BERT stands for Bidirectional Encoder Representations from Transformers.

In short, BERT can help computers understand language a bit more like humans do. It is the next level of word semantics.
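To get a feel for what "bidirectional" means in practice, here is a deliberately tiny, self-contained Python sketch. This is not real BERT — the clue-word lists and the sentence are made up for illustration — but it shows the core idea: a word like "bank" can only be disambiguated reliably when the model looks at the words on both sides of it, not just the words to its left.

```python
# Toy illustration of bidirectional context (NOT real BERT).
# A left-to-right model sees only words BEFORE the target word;
# a bidirectional model also sees the words AFTER it.

CLUES = {
    "finance": {"money", "deposit", "loan", "account"},
    "river":   {"water", "fishing", "shore", "muddy"},
}

def disambiguate(words, target="bank", bidirectional=True):
    idx = words.index(target)
    # Unidirectional: left context only; bidirectional: both sides.
    context = words[:idx] + (words[idx + 1:] if bidirectional else [])
    scores = {sense: sum(w in clue_set for w in context)
              for sense, clue_set in CLUES.items()}
    return max(scores, key=scores.get)

sentence = "i sat on the bank fishing in the muddy water".split()

# Left context alone ("i sat on the") has no clues, so the
# unidirectional pass falls back to the first sense, "finance".
print(disambiguate(sentence, bidirectional=False))  # -> finance
# With right context ("fishing", "muddy", "water") it gets it right.
print(disambiguate(sentence, bidirectional=True))   # -> river
```

Real BERT does this with transformer attention over learned embeddings rather than hand-written clue lists, but the payoff is the same: meaning is resolved from context on both sides of a word.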

When is BERT used?

Google said BERT helps it better understand the nuances and context of words in searches and better match those queries with more relevant results. It is also used for featured snippets, as described below.

In one example, Google said, with a search for “2019 brazil traveler to usa need a visa,” the word “to” and its relationship to the other words in the query are important for understanding the meaning.

Previously, Google wouldn’t understand the importance of this connection and would return results about U.S. citizens traveling to Brazil. “With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query,” Google explained.

In another example, Google can understand a query more like a human would and show a more relevant result for the search “Can you get medicine for someone pharmacy.”

Featured snippet example: Google shows a more relevant featured snippet for the query “Parking on a hill with no curb”.

In the past, a query like this would confuse Google’s systems. Google said, “We placed too much importance on the word ‘curb’ and ignored the word ‘no’, not understanding how critical that word was to appropriately responding to this query. So we’d return results for parking on a hill with a curb.”

Google explained that there are a lot of ways that it can understand what the language in your query means and how it relates to content on the web. For example, if you misspell something, Google’s spelling systems can help find the right word to get you what you need.

Or, if you use a word that is a synonym for a word in relevant documents, Google can match those. BERT is another signal Google uses to understand language.

Depending on what you search for, any one of these signals, or a combination of them, could be used to understand your query and provide a relevant result.

Takeaway for us:

We have been hearing about Natural Language Processing (NLP) for a long time; this is the first major real-world application of it that is going to affect our traffic.

Does this mean technical SEO will go away?

A big NO. Technical SEO will be as important as it was, or even more so. Google is likely changing the way it indexes things in its library,

so you can assume the crawler will crawl and scrape more information from your site to arrange that library using BERT.

How can we optimise for it? There is no specific way to optimise for BERT, as it is not a single algorithm but a set of algorithms.

But there is a way to win and improve traffic: write content for users, like we always do.

This is Google’s effort to better understand the searcher’s query and match it to more relevant results.
