Google is rolling out the biggest step forward for Search in the past five years, and one of the biggest steps forward in the history of Search altogether.
The new technology behind it, introduced last year, is called BERT.
BERT stands for Bidirectional Encoder Representations from Transformers. Transformers are models that process each word in relation to all the other words in a sentence.
That means a BERT model can interpret the meaning of a word by looking at the words that come before and after it. This leads to a better understanding of queries than processing words one by one in order.
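BERT itself has been released publicly, so the effect of bidirectional context is easy to see outside of Search. The sketch below is an illustration rather than Google's production setup; it assumes the Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint, and shows that the same word gets a different vector depending on the words around it.

```python
# Minimal sketch: contextual embeddings from a BERT-style model.
# Assumes: pip install torch transformers (not Google's production Search stack).
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        # last_hidden_state: one 768-dim vector per token in the sentence
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

# The same word "bank" embedded in two different sentences.
river = embed_word("he sat on the bank of the river", "bank")
money = embed_word("she deposited money at the bank", "bank")

# The vectors differ because the surrounding words differ.
similarity = torch.cosine_similarity(river, money, dim=0)
print(f"cosine similarity between the two 'bank' vectors: {similarity.item():.2f}")
```

The point of the sketch is the contrast with older, direction-blind word embeddings: because the model attends to the whole sentence, the vector for "bank" near "river" is measurably different from the vector for "bank" near "money".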
Google Search users in the US should start to see more useful results:
“Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
Google says BERT went through rigorous testing to ensure that the changes are actually more helpful for searchers.
The Google executives didn’t say whether websites should expect to see more or less traffic. Gomes did add, however, that he expected that making Search more useful would lead to more searches, which in turn would bring more traffic to websites overall. “As we answer more exotic questions, hopefully that will lead to people asking more and more exotic questions,” he said.
People who use Google won’t know whether their results are powered by BERT, and they can’t revert to non-BERT results. BERT will be applied only to English-language searches in the US.