Google recently announced on its blog that ARCore and Google Lens will receive updates that make them work more efficiently. Google Lens launched on Pixel devices last year.
Just like Google Assistant, which initially launched on Pixel devices and later became widely available on other manufacturers' Android smartphones, Google Lens will roll out broadly in the coming weeks. The best part is that all English-language users of the Google Photos app (latest version) can access it, including those on iOS.
Existing flagship devices that support Google Assistant will also get the camera-based Google Lens experience within the Assistant app. Google has not yet clarified which flagship models these are, but Lens is no longer limited to Pixel devices.
Now the question is: what are the benefits of Google Lens?
Google Lens uses your smartphone camera to help you understand the world around you. According to Google, Lens in Google Photos provides additional information about images after you capture them.
The artificial intelligence behind Google Lens can recognize common animals and plants, such as different dog breeds and flower types. It can also recognize text in an image for one-tap search, and create contacts and events from a photo with a single tap.
For example, if you take a picture of a business card, Google Lens will automatically identify the contact information. Likewise, if you are in a park and want to know the name, breed, or other details of a plant, insect, or animal, just take a picture and Lens will try to tell you.
ARCore, Google's augmented reality SDK for Android developers, enables apps to understand their surroundings: detecting the environment and placing objects and information within it. Google Lens is an example of an app built using ARCore. At Mobile World Congress, Google released ARCore 1.0 along with new developer support, after a long preview period that began last August. Any developer can now use ARCore to create AR-infused apps for Android.
ARCore works on 100 million Android smartphones, and advanced AR capabilities are available on all of these devices.
It currently works on 13 different models:
- Google’s Pixel, Pixel XL, Pixel 2 and Pixel 2 XL
- Samsung’s Galaxy S8, S8+, Note8, S7 and S7 edge
- LGE’s V30 and V30+ (Android O only)
- ASUS’s Zenfone AR
- OnePlus’s OnePlus 5
Apart from these devices, Google is also partnering with many manufacturers to enable ARCore on their upcoming devices this year, including:
- Sony Mobile
ARCore 1.0 also lets developers build apps that understand and place virtual assets on textured surfaces such as posters, furniture, toy boxes, books, cans, and more. Android Studio Beta now supports ARCore in the Emulator, so you can quickly test your app in a virtual environment right from your desktop.
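For developers, the workflow described above (detecting real-world surfaces and anchoring virtual assets to them) can be sketched with ARCore's hit-test API. This is a minimal, hypothetical snippet, not a complete app: it assumes you already have a configured ARCore `Session` running inside an Android activity and a `Frame` from the current camera image, and it only runs on a supported device or the ARCore-enabled Emulator.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane

// Given the current camera frame and a screen tap, try to anchor
// a virtual asset to a detected real-world surface.
fun placeAnchorOnTap(frame: Frame, tapX: Float, tapY: Float): Anchor? {
    // hitTest() intersects a ray from the tap point with ARCore's
    // understanding of the scene (detected planes and feature points).
    for (hit in frame.hitTest(tapX, tapY)) {
        val trackable = hit.trackable
        // Only accept hits that land inside the polygon of a detected plane,
        // e.g. a table top or a poster on the wall.
        if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
            // The anchor keeps the virtual object fixed to that surface
            // as ARCore refines its model of the environment.
            return hit.createAnchor()
        }
    }
    return null // no suitable surface under the tap
}
```

A renderer (for example, the Sceneform or OpenGL sample code Google ships with the SDK) would then draw the 3D asset at the anchor's pose each frame.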