
Google BERT Update Explained


Google's new BERT update will affect 10% of all searches. Here is everything you need to know about the recently announced BERT algorithm update. Google announced BERT on October 25, 2019, though it had already been rolling out for a few days. According to the announcement, the update will impact 10% of queries, making this one of the biggest changes to Google Search in the last five years!

Definition

BERT stands for “Bidirectional Encoder Representations from Transformers” and is a “neural network-based technique for natural language processing (NLP) pre-training”…

Sounds confusing?

Don’t worry about the technical jargon; we’re going to break down exactly what it means in this article!

October 25, 2019, will be a day we remember for a long time. Google introduced the BERT update, which aims to improve search results for users who phrase their queries in more natural, conversational language. As Google states on its blog: “(..) when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time.”

The BERT update grew out of Google’s work on the science of language understanding, made possible by machine learning and improvements in comprehending queries. Last year, Google introduced and open-sourced a neural network-based technique for natural language processing (NLP) called Bidirectional Encoder Representations from Transformers (BERT), which enabled anyone to train their own question-answering system. It was the result of the company’s research on transformers: models that process each word in relation to all the other words in a query, rather than one by one in order. The models can therefore consider the full context of any word by also checking the words that come before and after it.
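The idea above can be illustrated with a toy sketch (this is not Google's implementation, just an analogy): a traditional left-to-right model only sees the words before a target word, while a bidirectional model like BERT conditions on the words on both sides of it. The function names here are hypothetical, chosen for illustration.

```python
# Toy illustration of bidirectional vs. left-to-right context.

def left_to_right_context(words, i):
    """Context available to a sequential, left-to-right model: only the
    words that come before position i."""
    return words[:i]

def bidirectional_context(words, i):
    """Context available to a bidirectional model: the words both before
    AND after position i."""
    return words[:i] + words[i + 1:]

# One of Google's own example queries, with "to" as the target word.
query = "2019 brazil traveler to usa need a visa".split()
i = query.index("to")

print(left_to_right_context(query, i))
# ['2019', 'brazil', 'traveler']
print(bidirectional_context(query, i))
# ['2019', 'brazil', 'traveler', 'usa', 'need', 'a', 'visa']
```

Seeing the words on both sides of "to" is what lets a bidirectional model work out that the query is about a Brazilian travelling *to* the USA, not the other way around.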

What is the BERT model being used by Google?

It is one of the most significant algorithm updates of the last few years, but what does it mean for you, exactly? Here’s what you need to know:

What does BERT stand for?


BERT stands for Bidirectional Encoder Representations from Transformers.

What was the BERT update?

With the BERT update, Google improved its understanding of natural language.

Google described BERT as its “biggest leap forward in the past five years.”

BERT was a ‘query understanding’ update. This means Google got better at identifying nuances and context in a search and surfacing the most relevant results.

BERT is most likely to affect longtail searches. Google provided these examples to illustrate BERT’s impact:

2019 brazil traveler to USA need a visa

do estheticians stand a lot at work

Can you get medicine for someone pharmacy

BERT is expected to impact 10% of all searches when it is rolled out globally. It will also have a significant impact on featured snippets across multiple languages.

BERT is NOT replacing RankBrain or other elements of the search algorithm that focus on language—it will be used in conjunction with those elements.

What does BERT mean for SEOs and marketers?

You can’t optimize for BERT. So, what can you do?

Focus on writing better content for your users

Focus on responding to the intent of a searcher

Focus on satisfying the specific motivation of a searcher

Focus on creating incredible experiences for your audience

Is BERT going to hurt my SEO performance?

Current industry chatter says the impact of BERT has been minimal (though there are mitigating circumstances). It’s worth paying close attention to:

Longtail keyword ranking performance

Organic acquisition performance for the top of funnel/awareness content

Featured snippet ownership

Positive competitor movements across any of the above

 


 
