
How we search the web is changing, so Google Search is changing too

September 28, 2022 | Hi-network.com

Since its beginnings, Google's search engine has offered a simple user interface: type your search query into a box, and you can find out what the internet has to offer you. The results are largely delivered in the form of blue text, hyperlinked to other sites. In recent years, the results have been upgraded with visual results, related news stories and other forms of related content. 

Even so, Google's own internal research has shown that a significant portion of younger internet users prefer searching for information on social apps, like Instagram and TikTok, over Google products. Those evolving user preferences help explain why Google is evolving its basic search tools to include more multi-modal forms of communication, and to include information from more "authentic" sources, such as social media posts. 

"The way that people search and author information was never meant to be constrained to typing words," Cathy Edwards, Google VP and GM of Search, said to reporters ahead of Google's Search On event on Wednesday. "There's so much information out there on the web that comes in different formats and from different voices that have different authority and expertise."

During Wednesday's event, Google announced it will present Search users with content from "creators on the open web." If a user searches the name of a specific city, for instance, they may get results that include visual stories and short videos from people who have visited that place. 

"We can really see, as we enter this new era of search, that you'll be able to find exactly what you're looking for by combining images, sounds, text and speech, organized in a way that makes sense to you," Edwards said. "And that ultimately helps you make sense of the world." 

Edwards acknowledged that "there's some really good content" on TikTok, since it has reduced the barriers to entry for content creation. "We are looking at more ways to bring that into our search results," she said. 

Google's "new era of search" also includes a greater emphasis on community-led conversations on forums like Reddit -another platform that serves as an alternative to Google Search. Google is launching a new feature in Search called Discussions in Forums that brings in those results. 

"There are people who obviously are keen to see more results from Reddit and other community forums in our results," Edwards said. "Fundamentally, this is just about giving people what they want, when and what's most helpful to them when they come to us."

At the same time, Edwards said Google is focused on ensuring that users can find "both the authentic information and the authoritative information" on its search tools. 

"We also see in our research that people come to Google specifically to verify their claims and... to help them decide whether they want to believe something that they might have found on a social feed," she said. "They really trust that they'll be able to find high quality information on Google, and I think that's really important, too."

Google's increased focus on multi-modal search isn't just about the results; it's also about how people are able to ask questions. 

Earlier this year, Google introduced multisearch in beta, allowing users to search a topic using both images and text simultaneously. Now, in the coming months, Google will be expanding that capability to more than 70 languages.
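Google has not described how multisearch works internally, but one common multimodal-retrieval pattern gives a rough sense of what combining an image with a text refinement into a single query can look like: encode both inputs into a shared embedding space, merge the vectors, and rank candidate results by similarity. The sketch below is only an illustration of that general pattern, with random stand-in encoders; it is an assumption, not Google's implementation.

```python
# Toy illustration of a multimodal query (an assumed pattern, not Google's
# multisearch internals): embed the image and the text into a shared vector
# space, merge them into one query, and rank a pre-embedded index by
# cosine similarity. The "encoders" below are random stand-ins.
import numpy as np

DIM = 512
rng = np.random.default_rng(0)

def embed_image(image_bytes: bytes) -> np.ndarray:
    """Stand-in image encoder: returns a unit-length vector."""
    v = rng.standard_normal(DIM)
    return v / np.linalg.norm(v)

def embed_text(text: str) -> np.ndarray:
    """Stand-in text encoder: returns a unit-length vector."""
    v = rng.standard_normal(DIM)
    return v / np.linalg.norm(v)

def multisearch(image_bytes: bytes, text: str, index: dict, k: int = 5):
    """Merge an image query with a text refinement, then return the k
    index entries whose embeddings are most similar to the merged query."""
    query = embed_image(image_bytes) + embed_text(text)
    query /= np.linalg.norm(query)
    scored = [(name, float(vec @ query)) for name, vec in index.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:k]

# Example usage with a hypothetical pre-embedded index of candidate results.
index = {f"result_{i}": embed_text(f"candidate {i}") for i in range(100)}
top = multisearch(b"<photo of a patterned shirt>", "same pattern, but green", index)
```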

Meanwhile, Google is also improving its Lens capabilities. People already use Google to translate text in images more than one billion times a month, across more than 100 languages. Now, if you point your camera at an image containing text, Lens will be able to translate that text and overlay the translation directly onto the picture underneath. For instance, if you are reading the label on a bag of chips, the text will appear in your preferred language right on the bag. 

"Instead of covering up the original text, we actually erase it and then rebuild the pixels underneath with an AI-generated background," Edwards explained. "And then we overlay the translated text on top of the image. So it really feels like you're just looking at that product package with the translated text." This feature is launching later this year.
