On 21 October 2019, Google's search engine rolled out its latest update for English-language queries: the Bidirectional Encoder Representations from Transformers (BERT) algorithm. Google estimates it affects roughly 1 in 10 searches, informational queries in particular. If it hasn't impacted your site yet, it likely will as your traffic grows.
So, what is BERT, how does it work, and what can you do? Here’s everything you need to know about BERT, and how you can improve your site’s content to accommodate one of the biggest Google algorithm updates to date!
What is BERT?
Unlike previous algorithms, BERT was built for better Natural Language Processing (NLP): the branch of Artificial Intelligence (AI) concerned with how computers understand and communicate with humans. What sets BERT apart is that it is a deeply bidirectional, contextual language model: it interprets each word using the words on both sides of it rather than reading in one direction only, making it one of the most significant breakthroughs in NLP so far.
In short, BERT helps improve Search by helping Google grasp the nuance and context of a query so it can better match questions with relevant answers.
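To make the bidirectional part concrete, here is a minimal sketch, assuming Python with the Hugging Face transformers library and the public bert-base-uncased checkpoint (not Google's production search models). BERT is trained to fill in a masked word using the context on both sides of the blank:

```python
from transformers import pipeline

# Public BERT checkpoint with its masked-language-model head.
# This is an illustration, not Google's ranking system.
fill = pipeline("fill-mask", model="bert-base-uncased")

# Words AFTER the blank ("to catch her flight") steer the prediction,
# something a purely left-to-right model could not use.
for pred in fill("The traveler went to the [MASK] to catch her flight."):
    print(f"{pred['token_str']:>10}  score={pred['score']:.3f}")
```

Because the model reads the whole sentence at once, the words after the blank push the predictions toward words like "airport"; a left-to-right model would have to guess from "The traveler went to the" alone.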
How does BERT work?
Written words are ambiguous and polysemantic (i.e. they can have many meanings). When spoken, the same words with different intonation can also carry different meanings. On top of that, people tend to type the way they think: in incomplete sentences, without proper punctuation. BERT provides context and helps Google better understand human language in conversational English.
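You can see this contextual handling of polysemy directly. In the sketch below (same assumptions as above: Python, Hugging Face transformers, bert-base-uncased, and made-up sentences), the same word "bank" receives a different vector depending on the sentence around it:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding BERT assigns to `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# The same surface word gets different vectors in different contexts.
river = word_vector("he sat on the bank of the river", "bank")
money = word_vector("she deposited cash at the bank", "bank")
loan = word_vector("the bank approved her loan", "bank")

cos = torch.nn.functional.cosine_similarity
print(f"river vs money: {cos(river, money, dim=0).item():.3f}")  # lower: different senses
print(f"money vs loan:  {cos(money, loan, dim=0).item():.3f}")   # higher: same sense
```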
One example Google highlights is the search "do estheticians stand a lot at work", where the focus of the question is on the physical demands of the job. Previously, the engine matched "stand" to the term "stand-alone", which was not the sense of the word in the context of the question. With BERT, the meaning behind the query is understood, and much more useful results are displayed instead.
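Google has not published its ranking pipeline, so the following is only a toy sketch of the underlying idea: matching a query to passages by meaning rather than by keyword overlap. It mean-pools raw bert-base-uncased token embeddings into one vector per text (a crude proxy; real retrieval systems fine-tune models for this), and both passages are invented for illustration:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Mean-pool BERT's token embeddings into one vector for the whole text."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    return hidden.mean(dim=0)

query = "do estheticians stand a lot at work"
passages = [
    "Estheticians spend most of the workday standing on their feet.",
    "This stand-alone esthetician booking software runs offline.",
]

# Higher cosine similarity = closer in meaning (under this crude proxy).
q = embed(query)
for passage in passages:
    score = torch.nn.functional.cosine_similarity(q, embed(passage), dim=0)
    print(f"{score.item():.3f}  {passage}")
```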