Understand User Intent Better than Ever

What is Google BERT?

BERT stands for Bidirectional Encoder Representations from Transformers. The algorithm is based on Natural Language Processing (NLP) and neural networks.

BERT is an AI model pre-trained to learn language representations. Pre-trained representations can be context-free or contextual, and contextual representations can in turn be unidirectional or bidirectional. BERT is bidirectional.

Models such as word2vec or GloVe are context-free: they generate a single word embedding for each word in the vocabulary, so every word has one fixed representation.

For example, the word "jaguar" gets the same representation whether it refers to the animal or to the car.

Contextual models, on the other hand, represent each word based on the other words in the sentence. Take for example: “I bring my jaguar to the garage” and “I was in the jungle when I saw a jaguar”.

The word "jaguar" has two different meanings here, and the context of each sentence is what distinguishes them.
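To see the difference in practice, here is a minimal sketch using the open-source Hugging Face transformers library and the publicly released bert-base-uncased model (an illustration of contextual embeddings in general, not Google Search's internal code). It compares the vectors BERT produces for "jaguar" in the two sentences above; a context-free model would produce identical vectors.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Average BERT's contextual vectors for the word-piece tokens of `word`."""
    encoding = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**encoding).last_hidden_state[0]   # (seq_len, 768)
    # Locate the word's position (fine for these simple, period-terminated sentences).
    target = sentence.lower().rstrip(".").split().index(word)
    pieces = [i for i, w in enumerate(encoding.word_ids()) if w == target]
    return hidden[pieces].mean(dim=0)

v_car = word_vector("I bring my jaguar to the garage.", "jaguar")
v_animal = word_vector("I was in the jungle when I saw a jaguar.", "jaguar")

# A context-free model (word2vec, GloVe) would give the identical vector for
# both occurrences; BERT gives two different vectors, one per meaning.
print(torch.cosine_similarity(v_car, v_animal, dim=0).item())
```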

Bidirectional Model Strength

To help Google better understand the context of a page, Google developed BERT for its bidirectional capabilities.

Before BERT, Google read text unidirectionally. A unidirectional model understands a word by analyzing only the words that come before it.

You might think a bidirectional model could simply read each word based on both the previous and the next words. But in a multi-layer model that would let each word indirectly see itself, which creates a conflict. To solve this issue, the model uses a straightforward technique: masking words.

The algorithm masks out some of the words in the sentence and conditions on the surrounding words, both before and after, to figure out the masked ones.

For example: “The man went to the [mask1], he bought a [mask2] for his car.” Labels: [mask1] = garage; [mask2] = tire.
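The same masking idea can be tried with the publicly released BERT weights through the Hugging Face transformers fill-mask pipeline. This is a sketch of the technique, not the training code Google used:

```python
from transformers import pipeline

# Downloads the pre-trained bert-base-uncased weights and its masked-word head.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Mask one word at a time; BERT predicts it from the words on BOTH sides of it.
for sentence in [
    "The man went to the [MASK], he bought a tire for his car.",
    "The man went to the garage, he bought a [MASK] for his car.",
]:
    best = fill_mask(sentence)[0]      # predictions are sorted by score
    print(sentence, "->", best["token_str"], round(best["score"], 3))
```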

BERT is also the first deeply bidirectional model of its kind used to pre-train a deep neural network for language. In addition, it is trained on next sentence prediction: given two sentences, it predicts whether the second plausibly follows the first.

For example: “I am going to the gym”, next sentence: “I am on a boat in Antarctica”. Expected: not related.
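The same next-sentence head is exposed in the open-source release, so the idea can be sketched as follows (again an illustration with the transformers library, not Google's pipeline):

```python
import torch
from transformers import BertForNextSentencePrediction, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

def follows(sentence_a: str, sentence_b: str) -> bool:
    """True if BERT judges that sentence_b plausibly follows sentence_a."""
    inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits   # class 0 = "is next", class 1 = "random"
    return logits.argmax(dim=1).item() == 0

# The article's example pair, expected to come out as "not related".
print(follows("I am going to the gym.", "I am on a boat in Antarctica."))
print(follows("I am going to the gym.", "I want to improve my bench press."))
```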

How does BERT change SEO?

With this new AI algorithm that understands words in context, BERT helps Google read text more like a human, grasping the meaning of the whole sentence. Google BERT is different from RankBrain: it is an addition to RankBrain rather than a replacement for it.
