Jefferson Graham is in the hands-on room at the Made by Google event in San Francisco, showing off new phones, speakers, a computer and a camera with a built-in Google Assistant. TalkingTech. USA TODAY


SAN FRANCISCO — Google thinks smartphone technology should be, well, smarter, and do more of the work for us. 

So on Wednesday Google announced it's rolling out Google Lens as a preview with its new Pixel phones. Pixel users will be the first to try Google Lens in Google Photos and the Google Assistant. It will come to other devices "in time."

Instead of searching the Internet with words, you will be able to search the world with photos. Google Lens turns your smartphone camera into a search engine. You point the camera at something and Google figures out what it is, whether it's a photo from a family vacation five years ago or a painting hanging on the wall.

More: Hello, Google Pixel Buds. Goodbye, headphone jack.

More: Google Pixel 2 XL and Pixel 2: Google unveils its iPhone rival

More: Google launches $49 Google Home Mini, rival to Amazon Echo Dot, and $399 Home Max

More: Recap: Google reveals new Home speakers, Pixel 2 smartphones

It's a new frontier in search, creating an Internet search box that hovers over the real world. Spot a flyer for piano lessons on a telephone pole? Google Lens can grab the email address and shoot off an email. Can't decide whether to watch "Wonder Woman" on Friday night? Point the camera at the screen and ask: "Is this movie worth watching?" Ditto for that new book from Zadie Smith. 


The Lens feature is part of Google's big push into an "AI first" world being led by chief executive Sundar Pichai. 

At the heart of Pichai's vision is the belief that we are increasingly moving toward a world that runs on artificial intelligence, meaning no matter what screen we are interacting with — a smartphone or a smart-home device — we will be helped by the invisible hands of smart machines that answer our questions and help us complete everyday tasks.

It's a big leap forward from the days of typing a string of words into the Google search engine, allowing the Internet giant to show lucrative search ads. Now Google is competing with other tech giants to assist consumers in their everyday lives.

Visual search with Lens, like voice search, is one way Google is adapting to how people want to retrieve information and complete tasks.

"In an AI-first world, I believe computers should adapt to how people live their lives rather than people having to adapt to computers," Pichai said at Wednesday's Google event. 


Google first showed off Lens at its I/O conference for software developers in May. At the time, the use case that drew the most applause was the one that showed how Lens can help with a common and frustrating task: logging into your Wi-Fi network. With Lens, you can take a picture of the sticker on your router that shows the network's name and password, and your phone will automatically connect to it.

Other tech companies have developed visual search features, such as Samsung's Bixby Vision, Amazon's Firefly and Pinterest's Lens. 

How Google Lens works: It's built into Google Photos and the Google Assistant. Eventually you will see a Lens button in both apps; tap the Lens icon, and it will summon information about whatever is in view.

"The really cool thing about Lens is that it represents a way to interact with the real world that we really haven't had a chance to do before from a search perspective," said Gartner analyst Brian Blau.
