Let me try to summarize all of this as I consume everything I can find about it. It may seem a bit garbled because I'm doing this on the fly.
Official Announcement
Google dropped a post today (October 25th, 2019) on their official blog called "Understanding searches better than ever before." It talks about how last year they introduced and open-sourced a neural network technique for natural language processing (NLP) pre-training. It's called Bidirectional Encoder Representations from Transformers (BERT).
It lets you train a neural network for question answering and is based on "transformers": models that consider a word in relation to all the other words in a sentence, rather than one by one in order. That lets it understand the full context of a word, which in this case is useful for understanding the intent behind search queries.
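To make the "full context" part concrete, here's a minimal sketch (my own illustration, not something from the blog post) that runs the open-sourced BERT model through the Hugging Face transformers library, assuming you have transformers and torch installed. It pulls the vector for the same word out of two different sentences and shows that BERT represents it differently depending on the surrounding words:

```python
# pip install transformers torch
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence, word):
    """Return BERT's contextual embedding for the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index(word)                   # position of the word in this sentence
    return outputs.last_hidden_state[0, idx]   # its context-dependent vector

# The same word "bank" gets a different vector depending on the rest of the sentence.
v1 = word_vector("he sat on the bank of the river", "bank")
v2 = word_vector("she deposited cash at the bank", "bank")
print(torch.cosine_similarity(v1, v2, dim=0))  # noticeably below 1.0
```

An older word-embedding approach would give "bank" the same vector in both sentences; that difference is the whole point of the bidirectional context.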
Here's an example of the improvements: for the query "2019 brazil traveler to usa need a visa," results used to lean toward U.S. citizens traveling to Brazil, whereas with BERT Google understands that it's the Brazilian traveler who is going to the U.S.
The key is that they now better understand the word "to" in this context, which had previously been treated as a stop word or filler, I suppose to some degree. The blog post linked above has several more examples if you're interested.
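To show why that matters (a toy illustration of my own, not Google's actual query processing), a classic stop-word filter throws "to" away entirely, so the two opposite travel directions collapse into the same bag of keywords:

```python
# Toy keyword extraction that treats "to" as disposable, the way old-school
# stop-word filtering does. Purely illustrative; not how Google parses queries.
STOP_WORDS = {"to", "from", "a", "the", "for", "need"}

def keywords(query):
    return sorted(w for w in query.lower().split() if w not in STOP_WORDS)

print(keywords("brazil traveler to usa need a visa"))  # ['brazil', 'traveler', 'usa', 'visa']
print(keywords("usa traveler to brazil need a visa"))  # ['brazil', 'traveler', 'usa', 'visa']
# Identical keyword sets -- the direction of travel, carried by "to", is gone.
```

BERT keeps that little word and uses it, which is why queries like these can finally be read the way a person reads them.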
BERT is being applied to Search and to the Knowledge Graph and is expected to impact about 1 in every 10 queries. It's being called "the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search."
That's the info you get out of the official blog post, but there's been more talk going on that I'll try to summarize below.
More Industry Talk
Time to check the blogosphere / Twitter, etc.
This has been rolling out all week and is almost fully live according to Google. So take a look at your traffic for this week and see if you've had any impact.
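If you want something more precise than eyeballing a graph, a quick comparison like the sketch below works. The CSV path and column names are hypothetical; swap in whatever your analytics export actually uses:

```python
# Compare average daily traffic before vs. after the BERT rollout week.
# "traffic.csv" with "date" and "sessions" columns is a made-up export;
# adjust to match your analytics tool.
import pandas as pd

df = pd.read_csv("traffic.csv", parse_dates=["date"])
rollout = pd.Timestamp("2019-10-21")  # rough start of the rollout week

before = df[df["date"] < rollout]["sessions"].mean()
after = df[df["date"] >= rollout]["sessions"].mean()

print(f"avg daily sessions before: {before:.0f}")
print(f"avg daily sessions after:  {after:.0f}")
print(f"change: {100 * (after - before) / before:+.1f}%")
```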
It will be used globally, in all languages, in Search and in featured snippets. It's been particularly useful in Hindi, Portuguese, and Korean.
BERT doesn't replace RankBrain; it works alongside it as an additional method. It sounds like a given query gets routed through either RankBrain or BERT, but not both. How they make that decision, who knows.
Google claims you can't optimize for BERT any more than you could for RankBrain. That makes sense since it's about them understanding a query itself rather than them pulling more relevant results.
Bill Lambert apparently predicted (leaked?) this. In that link I recounted the start of the Bill Lambert saga; it's pretty humorous. It looks like the guys who were originally trying to discredit him have switched gears and are now trying to take advantage of his juice. Bill Lambert said a game-changing update was coming, told us on Monday that it was rolling out, and told us to look for information on Friday (today). Fun stuff.
My big-traffic site has lost maybe 100-200 visitors a day, but who knows if that's related to BERT or to them correcting the previous algorithm changes they released. They always go too far and then back off a bit.
How's it looking for you guys?