Google now lets you circle, scribble or tap to Search, but only select Android users get the feature for now


Google has a cool new way to search. In an effort to make Search more “natural and intuitive”, Google is rolling out a new feature that will make asking questions on the platform easier for Android users. Right now, to get an answer to a query while you are using another app, you need to switch apps or tabs to search for something on Google. However, starting January 31, Android users will be able to search for something on their device with a circle, tap, scribble or highlight, “whatever way comes naturally to you”.

Circle to Search

Essentially, with the new Circle to Search feature, users can make a simple gesture on their Android device’s screen to run a search. You can select images, text or videos by circling, highlighting, scribbling or tapping, and without switching apps, a panel will pop up from the bottom of the screen with the answer to your query.

For instance, say you are watching a cat video on Instagram and the cat is wearing a cute little bow tie. You wonder where you could get one too. Simply pause the video and draw a circle or box around the bow tie, and a Google Search drawer will float up with suggestions for whatever you selected.

Unfortunately, the feature isn’t available to all Android users yet. Google says it will start rolling out on January 31 on the Pixel 8, Pixel 8 Pro, Samsung Galaxy S24, Samsung Galaxy S24 Plus and Samsung Galaxy S24 Ultra. The search giant hasn’t yet shared a timeline for when the feature will roll out to other Android smartphones.

Search by pointing your camera at anything

Besides Circle to Search, Google has another interesting update to an existing feature. In 2022, Google rolled out a feature that allowed users to point their camera at anything and get information about it. This feature worked with both images and text. Now, thanks to generative AI, Google has built on this feature and, if we may say so, it’s a lot cooler than before.

Starting today, when you point your camera at something or upload a photo or screenshot and ask a question in the Google app, the results will go beyond just a visual match. For instance, say you find a puzzle in an old box of toys, but you have lost the packaging and instructions that came with it, and you are determined to solve it. With the multisearch experience, you can point your camera at the puzzle, type “how do you solve this puzzle” in the search box below, and get a detailed answer on how to go about it. There will also be supporting links, like the ones in the AI-powered answers on Google Search on the web right now, to help you dig a little deeper if you would like.

To use the feature, look for the Lens camera icon in the Google app on Android or iOS. The stable version of the feature will only be available in the US. However, if you are outside the US and signed in to SGE (Search Generative Experience), you can preview the new experience in the Google app.

To turn on SGE, open Chrome on your computer and make sure you’re signed in to your Google Account with Incognito mode turned off. Open a new tab and click the Labs icon at the top right of the page, then turn on the experiment on the SGE card. Follow the on-screen instructions and you are done.

Published By: Nandini Yadav

Published On: Jan 17, 2024
