If you’ve been wondering “what does BERT mean?”, it stands for Bidirectional Encoder Representations from Transformers. It is Google’s most important Search update in the five years since RankBrain.
By definition, the Google BERT update is a deep learning algorithm for Natural Language Processing (NLP), which is why it is sometimes referred to as BERT NLP.
The BERT algorithm understands what the words in a sentence mean by taking their context into account. In essence, it gives Google a better way to understand what is important in search queries.
However, the algorithm only analyses search queries, not web pages. Moreover, it affects only about 10% of search queries.
How the BERT Update Works
The main objective of the BERT update is to help users find more relevant information for their search queries. According to Google, the algorithm is particularly meant for understanding longer, more conversational search queries. Prepositions such as “for” and “to” also matter a great deal in helping the search engine understand the context of a phrase or query. Users, however, can still search in a way that feels natural to them.
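The gap between plain keyword matching and contextual understanding can be shown with a toy sketch. This is a simplified illustration of the general idea only, not Google’s actual algorithm: a bag-of-words matcher cannot tell two queries apart when they use the same words in a different direction, because it ignores word order and what a preposition like “to” implies.

```python
# Toy illustration: a bag-of-words matcher ignores word order, so it cannot
# tell "Brazil to USA" from "USA to Brazil". This is a simplified sketch of
# the general idea, not Google's actual ranking algorithm.
from collections import Counter

def bag_of_words_score(query: str, document: str) -> int:
    """Count how many query words appear in the document (order ignored)."""
    query_words = Counter(query.lower().split())
    doc_words = Counter(document.lower().split())
    return sum(min(count, doc_words[word]) for word, count in query_words.items())

query_a = "brazil traveler to usa need a visa"
query_b = "usa traveler to brazil need a visa"
document = "visa requirements for a usa citizen traveling to brazil"

# Both queries get the same keyword-overlap score against this page,
# even though only one of them is actually about this page's topic.
print(bag_of_words_score(query_a, document))  # 5
print(bag_of_words_score(query_b, document))  # 5
```

A context-aware model like BERT, by contrast, reads the whole sentence in both directions, so the meaning carried by “to” changes which results it considers relevant.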
In terms of ranking results, the BERT update helps Search better understand one in 10 search queries.
Google BERT Search Examples
Below are just some of the examples that Google showed when they rolled out the BERT update.
In this example, the preposition “to” and its relationship to the other words are important in helping Search understand the meaning of the query: a Brazilian traveller wanting to visit the USA, not vice versa.
Previously, Google would match keywords instead of understanding the overall context. In this instance, Search used to match the term “stand” in the query with the keyword “stand-alone” in the result, even though the result did not match what the query was about.
Here, Search understands that “for someone” is an important part of the query. Previously, it would miss the overall meaning and return general results about prescriptions; now it returns results about that “someone”, such as a family member or friend collecting medication for a patient.
Before the BERT model, Search would get confused about the overall meaning of this query. It placed more importance on the term “curb” and ignored the word “no”, not understanding how important “no” was to answering the query properly.
In this example, Google no longer confuses the “adult” context with “young adult”, a category aimed at younger readers.
Google BERT and SEO – How to Optimise your Content
Although BERT does not analyse web pages, there is a good chance your site will be affected, if not now, then eventually as its traffic grows. There is no direct way to optimise your pages for BERT; instead, webmasters should optimise their content for users.
- The best way to go about it is to follow entity SEO, which means optimising your content so that it establishes clear relationships between entities. After all, Natural Language Processing is meant to connect a question to an answer, so your web pages should aim to answer the 5Ws and 1H: Who, What, When, Where, Why and How.
This means you will need to:
- have more relevant content.
- establish connections across a range of search intents.
- build authority through links – the more you write on a particular niche, the more authority you have.
- Answer the user’s query by creating content that is focused, disambiguated and well-organised for Google to select. This includes using prepositions and transition words properly, which also improves your chances of winning a featured snippet.
- The word “is” is also regarded as a strong trigger word for featured snippets: research suggests that, apart from lists, Google usually picks paragraphs containing it. Be definitive in your answers.
- Use long-tail keywords that make sense.
- Make use of FAQs (another way to earn featured snippets), as well as Google’s search suggestions.
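One common way to surface FAQs to Google is schema.org’s FAQPage structured data, which Google can use for rich results in Search. Below is a minimal sketch; the question and answer text are placeholder examples, and real markup should mirror the FAQs visible on your page.

```html
<!-- Minimal FAQPage structured-data sketch (schema.org vocabulary).
     The question and answer text below are placeholder examples. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What does BERT stand for?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "BERT stands for Bidirectional Encoder Representations from Transformers."
    }
  }]
}
</script>
```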
- Use conversational phrases in your content, since more and more searches are being done by voice.
Overview – The Google BERT Model
More improvements to the BERT model are still underway as Google is looking to make Search better for people around the world. Here are a few more things to note about the BERT model.
- The BERT update was first rolled out in October 2019 for US English and expanded to more than 70 languages in December 2019.
- At the moment, the model can take learnings from one language (in this case, US English) and apply them to others. Google plans to expand this so that it is useful in every language.
- BERT is also used to improve featured snippets in languages like Hindi, Korean, Portuguese and others.
- This doesn’t mean Search will always get everything right.
Still want to have a look? Then download our Google BERT and SEO presentation for later!