On October 25th, 2019 Google announced the introduction of its BERT algorithm update. Although said to be among the biggest improvements in how Google understands searches, practically speaking it doesn’t matter to SEOs and online marketers. In this brief post I’ll explain why.
What is BERT?
BERT (Bidirectional Encoder Representations from Transformers) is a technique for better understanding natural language through machine learning. Essentially, the model assesses the full context of a word by reviewing the words that come both before and after it within a sentence.
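To make “bidirectional context” concrete, here is a minimal toy sketch — not how BERT actually works internally (BERT uses transformer attention over the entire sequence), just an illustration of the idea that each word is interpreted in light of its neighbors on both sides. The example query is hypothetical.

```python
def bidirectional_context(sentence: str, window: int = 2) -> dict:
    """Toy illustration: map each word to the words surrounding it
    on BOTH sides -- the 'bidirectional' context BERT-style models
    learn from. (Duplicate words overwrite each other; fine for a demo.)"""
    words = sentence.split()
    contexts = {}
    for i, word in enumerate(words):
        before = words[max(0, i - window):i]   # words to the left
        after = words[i + 1:i + 1 + window]    # words to the right
        contexts[word] = {"before": before, "after": after}
    return contexts

# Hypothetical conversational query where a preposition carries the intent:
query = "brazil traveler to usa need a visa"
print(bidirectional_context(query)["to"])
# {'before': ['brazil', 'traveler'], 'after': ['usa', 'need']}
```

An older unidirectional model would only see “brazil traveler” when interpreting “to”; a bidirectional one also sees “usa need,” which is what lets it catch the direction of travel in the query.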
The primary impact of this algorithm update is a dramatic improvement in understanding query intent. Longer, more conversational queries that use prepositions like “for” and “to” are particularly affected. Here’s one illustration:
In this example, Google’s algorithm overlooked the presence of “to.” The result was that the SERP did not properly address the intent of the query.
Why BERT Doesn’t Matter
BERT improves SERP relevance for long-tail, conversational queries, which are in low demand. So even though it’s a big improvement, practically speaking it doesn’t impact the types of searches most SEOs optimize for. So…
“If a tree falls in a forest and no one is around to hear it, does it make a sound?”
In short, SEOs and online marketers should simply write for users and ignore BERT. This update appears to be more of an improvement in Google’s ability to match relevant information to searcher queries than anything that demands a change in strategy.