It’s weird to think that, as of now, smartphone cameras might be underutilized. Soon, though, that could change in a big way.
Wednesday’s Google I/O developer conference featured a variety of fascinating ideas the tech giant has up its sleeve. Perhaps the most interesting, though, pertained to Google Assistant, the company’s voice-controlled, artificially intelligent personal assistant.
In addition to coming to the iPhone, the service will soon be able to analyze whatever you point your smartphone’s camera at, according to The Verge. Clearly, Apple’s Siri and Amazon’s Alexa have a worthy adversary.
The new feature will be capable of some pretty amazing things. Don’t feel like searching for information on a restaurant? Point your camera at it.
With Google Lens, your smartphone camera won’t just see what you see, but will also understand what you see to help you take action. #io17 pic.twitter.com/viOmWFjqk1
— Google (@Google) May 17, 2017
Google reportedly relies on a technology called Google Lens to power the feature. The company has already deployed similar capabilities in its Google Translate and Google Goggles apps.