What is BERT?
BERT stands for Bidirectional Encoder Representations from Transformers. Google released this algorithm as open source to the scientific community in 2018. On October 25, 2019, Pandu Nayak (Vice President of Search at Google) announced that BERT is now used in Google Search.
What is BERT used for? The user experience at the heart of Google BERT
The purpose of the BERT update is to improve the understanding of queries, in order to provide more relevant results, especially for queries formulated in a natural way. BERT also helps Google better understand the indexed content.
Basically, BERT allows Google to adapt even better than before to the growth of voice searches (which are not limited to those made with voice assistants).
In more detail, BERT is also used by Google for the following tasks:
- Understand “textual cohesion” and disambiguate expressions or sentences, especially when a word has several meanings that could change the contextual sense of a query or question.
- Understand which entities pronouns refer to, which is particularly useful in long paragraphs with multiple entities. A real and concrete use: automatic generation of featured snippets and voice / conversational search.
- Resolve homonym issues
- Predict the next sentence
- Answer questions directly in SERPs
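To make the disambiguation idea above more concrete: BERT does this with learned contextual representations, but the core intuition (surrounding words decide which meaning of a homonym applies) can be sketched with a much simpler overlap heuristic, in the spirit of the classic simplified Lesk algorithm. The senses and keyword sets below are invented for the example and have nothing to do with Google's actual systems:

```python
# Toy illustration of context-based disambiguation (a simplified
# Lesk-style overlap heuristic). This is NOT how BERT works internally;
# it only shows why context matters when a word has several meanings.

SENSES = {
    "bank": {
        "financial institution": {"money", "loan", "account", "deposit"},
        "river edge": {"river", "water", "fishing", "shore"},
    }
}

def disambiguate(word, query):
    """Pick the sense whose keywords overlap most with the query context."""
    context = set(query.lower().split()) - {word}
    senses = SENSES[word]
    return max(senses, key=lambda sense: len(senses[sense] & context))

print(disambiguate("bank", "open a bank account to deposit money"))
# prints: financial institution
print(disambiguate("bank", "fishing from the river bank"))
# prints: river edge
```

A keyword-by-keyword engine that dropped the context words would treat both queries identically; taking the whole query into account is exactly what BERT improves.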
Google BERT follows on from the latest changes made by the American giant, which aim to improve the relevance of the results returned by the search engine. With Google BERT, Google's systems take into account the entire query and provide results according to the context in which it is formulated. Previously, keywords were handled individually; this is no longer the case with this update.
The goal pursued by Google is to provide relevant answers to voice searches, which are becoming both more numerous and more complex. These are rarely just one or two keywords, as typed queries often are; they tend to be full questions or phrases in which a stop word can be important. Queries are often formulated in so-called conversational language, which makes them harder for machines to understand: possible homonyms, linguistic subtleties, etc. Thanks to machine learning, Google's systems can better understand these requests and grasp what the user is looking for.
Small impact on natural SEO
If you already apply good SEO practices on your sites, Google BERT will have only a limited impact on your rankings, visible mainly on complex expressions and long-tail queries rather than on your main keywords.
A small number of queries concerned
Google BERT has not shaken up the SERPs much. Generic keywords were not affected by the rollout of this update. The effects were concentrated on expressions with a certain complexity: homonyms, numerous stop words, unusual sentence structure, etc.
However, most of the keywords covered by SEO monitoring are generic terms with a large search volume. The majority of them do not fall within the scope of Google BERT.
An update that strengthens good SEO practices
Concretely, there is little to set up to optimize a site for Google BERT. The quality of the content, its semantic richness, the consistency of the internal linking and the relevance of the backlinks are all elements that send the right signals to search engine robots. For a complex query, precise and relevant content will be given priority, as can be seen with the Featured Snippets that are increasingly present in result pages.
Ben Gomes (Google) believes that, with BERT in the search engine, users should perform more searches, which would bring more traffic to all sites, and therefore more advertising revenue.
He adds: “As we answer more exotic questions, we hope that this will lead people to ask more and more exotic questions.”
How does BERT impact Google searches (Quantitative approach)?
According to Google, BERT has an impact on 10% of searches (a figure that dates from the launch and applies to queries made in English in the USA; the impact is lower for other languages). High-volume queries consisting of only a few words are probably less affected.
What is the relationship between the long tail and BERT?
In fact, none! With BERT, Google better understands complex formulations in queries and indexed pages, but “working the long tail” is not “optimization for BERT”.
With BERT, do stop words become important again?
That is not the right way to look at it! Either way, you need to write naturally in order to be understood by your readers, and by Google, which is improving on this front with BERT. Use stop words or “little words” wherever they are needed.
What impact on the ranking for my keywords?
You will probably see an impact well below 10%, because the queries you monitor are probably not formulated in natural language.
What type of site can win with BERT?
According to Ben Gomes (Vice President of Google Core Search), “niche sites could perform better on niche issues”, as long as you have top-notch content that really gives solid information to readers on your topic.
What is BERT used for in other applications?
Here are other things BERT excels at:
- Sentiment analysis
- Text classification
- Sentence pair matching and natural language inference
- Identification of offensive tweets
- Sentiment classification of movie reviews (e.g. the IMDB dataset)
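To illustrate the sentiment-analysis task itself (not BERT, which would instead be fine-tuned on labeled reviews such as the IMDB dataset), here is a hypothetical bag-of-words baseline; the word lists are invented for the example:

```python
import re

# Toy lexicon-based sentiment scorer, only to show the TASK that a
# fine-tuned BERT model solves far more robustly. A real system would
# learn from labeled data instead of hand-picked word lists.

POSITIVE = {"great", "excellent", "masterpiece", "enjoyed", "brilliant"}
NEGATIVE = {"boring", "awful", "terrible", "waste", "dull"}

def sentiment(review):
    """Label a review by counting positive vs. negative lexicon hits."""
    tokens = re.findall(r"[a-z]+", review.lower())
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("A brilliant masterpiece, I enjoyed every minute"))
# prints: positive
print(sentiment("A dull and boring waste of time"))
# prints: negative
```

The gap between this baseline and BERT is exactly context: a lexicon cannot tell “not boring at all” from “boring”, whereas a model trained on full sentences can.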
Do you have any other questions, or something to share with us about the Google BERT update? Drop us a line in the comments 🙂
This post was last modified on April 24, 2020 06:43