Google released a new search update called BERT on October 21. Google claims it is the largest update in five years. It currently affects around 10% of search queries, and that share is set to grow.
BERT is a technique that helps data scientists build AI models that understand natural language better than ever before. When a user searches in full sentences, or uses voice search with their Google Assistant, BERT is in charge of interpreting the query.
There is no known SEO method to exploit it, so focus on producing high-quality content that is highly relevant to your audience’s needs.
To help BERT understand you, write in a conversational tone with minimal ambiguity.
Computers have evolved from number-crunching calculators. A programmer uses a programming language to communicate with a computer in a way it understands.
Babies, however, are naturally wired to listen, mimic, and grow fluent in the languages that surround them. Quite clearly, we think and learn differently from computers, so making a computer understand natural, day-to-day conversation has been a computational holy grail.
BERT is the latest step towards making computers understand us. BERT, or Bidirectional Encoder Representations from Transformers, is a technique for turning text like this blog post into something a computer can work with.
For instance, homonyms are difficult for computers to interpret. They cannot be interpreted on their own; you need more information. “I arrived at the bank after crossing the…” requires knowing whether the sentence ends in “… road.” or “… river.” The Transformer neural networks at the foundation of BERT are capable of interpreting the meaning of “bank” by drawing on context. The “Bidirectional” in BERT signifies that the model reads the content from left to right and from right to left simultaneously, so it can spot the word “road” or “river” and give true meaning to the word “bank.”
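You can watch this context-sensitivity at work with the publicly released BERT models. Below is a minimal sketch using the open-source bert-base-uncased model from the Hugging Face transformers library (an illustration only: we do not know what Google runs in production). It compares the vector BERT produces for “bank” under the two endings above; the vectors differ because each one folds in the surrounding words.

```python
# Minimal sketch: contextual embeddings from the public bert-base-uncased
# model via Hugging Face transformers (pip install torch transformers).
# Illustrative only; Google's production BERT may differ.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_for_bank(sentence: str) -> torch.Tensor:
    """Return BERT's contextual vector for the token 'bank'."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    position = inputs["input_ids"][0].tolist().index(
        tokenizer.convert_tokens_to_ids("bank"))
    return hidden[position]

river = embedding_for_bank("I arrived at the bank after crossing the river.")
road = embedding_for_bank("I arrived at the bank after crossing the road.")

# Same word, different vectors: similarity falls below 1.0 because BERT
# encodes the left AND right context into every token's representation.
print(torch.cosine_similarity(river, road, dim=0).item())
```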
BERT has been openly available to data scientists for about a year now, and in that time it has been heavily tested. We do not know how far the BERT Google is using has diverged from the BERT it open-sourced in November 2018. However, anyone wanting to rank in 2020 should be mindful of the successes and failures BERT recorded in these tests.
Allyson Ettinger of the University of Chicago discovered a particular quirk by asking BERT to complete the sentence “A robin is a ___”. The answer was “bird.” However, when BERT was shown “A robin is not a ___”, the result was still “bird.” In the short term, it may be prudent to keep all copy straightforward and, where possible, phrased in the positive.
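This probe is easy to reproduce with the public model. A minimal sketch, again assuming the open-source bert-base-uncased model rather than Google's production system: mask the final word of both sentences and compare the top guesses.

```python
# Reproducing an Ettinger-style negation probe with the public model
# (pip install torch transformers). Exact scores vary by version; the
# point is that adding "not" barely changes the top prediction.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for sentence in ["A robin is a [MASK].", "A robin is not a [MASK]."]:
    top = fill_mask(sentence)[0]  # highest-probability completion
    print(f"{sentence!r} -> {top['token_str']} ({top['score']:.2f})")
```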
BERT represents a huge leap in artificial intelligence; however, it does not achieve true comprehension. When asked to complete the following passage, BERT produced grammatical suggestions, but it clearly did not understand what was actually being said.
“Pablo wanted to cut the lumber he had bought to make some shelves. He asked his neighbour if he could borrow her ____.” BERT suggested this gap could be filled with “car, house, room, truck, apartment” (Allyson Ettinger), none of which would help Pablo cut his lumber.
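You can run the same masked-word probe on the lumber sentence (again a sketch with the public bert-base-uncased model) and watch the failure happen: the suggestions are things a neighbour might plausibly lend, not the saw the story calls for.

```python
# Same fill-mask probe, applied to Ettinger's lumber example.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")
prompt = ("Pablo wanted to cut the lumber he had bought to make some "
          "shelves. He asked his neighbour if he could borrow her [MASK].")
for prediction in fill_mask(prompt):  # top suggestions with scores
    print(prediction["token_str"], round(prediction["score"], 3))
```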
Clearly, copy written with a heavy bias towards rhetorical questions and implicit meaning will not be understood by Google. This may result in a higher bounce rate as irrelevant traffic is directed to your site. Ensure copy is clear and unambiguous.
The good news is that SEO should become fairer. Google has just hired a more intelligent umpire to judge the SEO competition, so websites that you feel are unfairly outranking you should lose ground.
Irrelevant websites should increasingly vanish from search results, so the proven SEO advice of finding a niche and serving it well with relevant content becomes even more effective.
Keyword research is still useful; we recommend treating keyword data as a guide. Use it to interpret what demand exists in the market, then meet that demand with a relevant solution in a way that serves users more effectively than your competitors do.