Google Search Being Improved to Better Understand Natural Language Queries


Google Search can prove to be a frustrating mess when it focuses on a few keywords in a search query rather than on the context around them. This shortcoming tends to bring up results that are of little to no use. Google is now working to improve its eponymous search engine to better understand user queries, thanks to the magic of machine learning and natural language processing. As a result of these advancements, Google Search can now better understand linguistic nuances and search query elements such as prepositions and conjunctions.

Pandu Nayak, Vice-President of Search at Google, wrote in a blog post that the company is employing a technology called Bidirectional Encoder Representations from Transformers (BERT) to improve search. BERT is essentially a neural network-based technique for natural language processing (NLP) pre-training that helps in the creation of question answering systems. The key advantage of BERT is that it analyses a search query by focusing on the context of the whole sentence, similar to how humans naturally communicate and understand language, rather than just doing a word-by-word analysis.
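To get a rough feel for what that contextual analysis means in practice, the sketch below uses the open-source Hugging Face transformers library with a publicly available pre-trained BERT model. This is an illustration only, not Google's production Search system; the model name and the example queries are assumptions chosen to show how the same words receive different representations depending on the rest of the sentence.

```python
# Minimal sketch: contextual encoding with a pre-trained BERT model.
# Assumes the "transformers" and "torch" packages are installed; this is
# not Google's Search pipeline, just an illustration of bidirectional,
# whole-sentence encoding versus word-by-word lookup.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Two hypothetical queries that differ only in where "to" and "from" sit,
# yet mean opposite things.
queries = [
    "flights from new york to london",
    "flights to new york from london",
]

with torch.no_grad():
    for query in queries:
        inputs = tokenizer(query, return_tensors="pt")
        outputs = model(**inputs)
        # last_hidden_state has shape (batch, tokens, hidden). Each token's
        # vector is conditioned on the entire query in both directions, so
        # "to" and "from" are represented differently in each sentence.
        print(query, tuple(outputs.last_hidden_state.shape))
```

A keyword-matching system would treat both queries as the same bag of words; a contextual encoder like the one above distinguishes them, which is the behaviour Google says BERT brings to Search.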

“With the latest advancements from our research team in the science of language understanding – made possible by machine learning – we’re making a significant improvement to how we understand queries, representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search,” Nayak wrote. He added that the improvements to Google Search are most pronounced for English, but results for languages such as Hindi, Korean, and Portuguese have been encouraging too. The gist is that BERT will allow the search engine to better understand linguistic elements such as “to”, “from”, and “for”, and surface search results accordingly.

The focus here is on enabling users to phrase their queries on Google Search the same way they would in a conversation with another person, rather than typing keyword-heavy gibberish into the search field that would make no sense in a natural conversation. Google is currently testing BERT-backed search models in two dozen countries and aims to improve Google Search to the point where it can bring up relevant results in response to conversational queries, rather than limiting users to a collection of hit-and-miss, keyword-laden queries.


