- January 18, 2024
- AI Projects
During the presentation of its latest smartphones, Samsung also shared the spotlight with Google, which introduced an innovative search method for Android devices called “Circle to Search”. This new feature allows users to initiate a search from anywhere on their device using gestures such as circling, highlighting, scribbling, or tapping. Google’s intention with this feature is to make it easier to interact with its search engine whenever curiosity strikes, whether you’re watching a video, looking at images on a social platform, or texting with friends.
Despite its name, Circle to Search goes beyond circling gestures. The variety of supported gestures offers several ways to activate search, such as identifying items in videos or images. Imagine you’re watching a cooking video and you see a Korean corn dog – you can circle the snack with your finger and ask, for example, “Why are Korean corn dogs trendy?” The feature also extends to other gestures. For example, if you’re discussing dining options with a friend, simply tapping the name of a restaurant can pull up additional information, and you can scribble over text like the words “thrift flip” while watching a short fashion-related video. If something visual catches your eye, circling or highlighting it directly launches a Google search without you even having to switch between apps. Google emphasizes that both images and text are searchable using these gestures.
The search results displayed depend on each user’s query and their participation in certain Google Labs products. A typical text query returns normal search results, while a query that combines images and text (a capability Google calls “multisearch”) uses AI technology. Those experimenting with the Search Generative Experience (SGE) through Google Labs will encounter AI-generated responses, just as they do for other SGE queries. Google expects that seamless access to search from any app will prove very convenient, avoiding the need to pause what you’re doing to perform a search or take a screenshot for later use.
The rollout of the feature comes at a time when Google Search’s dominance is waning. The predominance of SEO-driven pages and spam on the Internet has made it harder to find reliable information. At the same time, the emergence of generative AI chatbots is challenging conventional search methods, which could affect Google’s main ad revenue stream if people turn to alternative sources of information. So weaving search more deeply into the Android operating system represents more than convenience – it’s a recognition of the need to strengthen Google’s search division through closer integration with the mobile operating system.
The event also covered several other Google AI announcements related to Gemini, Google Messages, and Android Auto, along with AI-powered overviews for multisearch results in Google Lens.
Circle to Search is scheduled to debut on January 31st and will initially be available on the newly announced Galaxy S24 series models, as well as flagship Google phones such as the Pixel 8 and Pixel 8 Pro. The feature will be available in every language and in every country where these phones are sold, and Google expects support to eventually roll out to additional Android phones.