Never Get Lost Again, Promises Google. Here's How They're Using AI To Deliver.

You step out of a Lyft or Uber, looking for the building where your appointment is. You can’t tell which way to go, and now you might be running late. This happens to millions of people millions of times a day. Google knows: it can see you jogging back and forth on the block, trying to orient yourself. If the blue dot problem has ever enraged you, you’re going to like what Google is changing this summer.

All that frustration is ending, promised Aparna Chennapragada, VP of Product for AR and VR. She told the crowd at Google I/O that the new Google Maps, shipping with new Android phones this summer, will use your phone’s camera to orient for you.
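
Google hasn’t published the internals of this camera-based positioning, but the core idea is straightforward: match what the camera sees against imagery captured at known compass headings for your location. Below is a minimal, purely illustrative sketch of that technique in Python with OpenCV; the file names, reference views, and match threshold are assumptions, not anything Google has disclosed.

    # Illustrative only: estimate which way the user faces by matching a
    # camera frame against reference photos taken at known compass
    # headings from (roughly) the same spot. Not Google's actual system.
    import cv2  # pip install opencv-python

    def estimate_heading(camera_frame_path, reference_views):
        """reference_views: list of (heading_degrees, image_path) pairs."""
        orb = cv2.ORB_create(nfeatures=1000)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        frame = cv2.imread(camera_frame_path, cv2.IMREAD_GRAYSCALE)
        _, frame_desc = orb.detectAndCompute(frame, None)
        best_heading, best_score = None, -1
        for heading, path in reference_views:
            ref = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            _, ref_desc = orb.detectAndCompute(ref, None)
            if frame_desc is None or ref_desc is None:
                continue
            matches = matcher.match(frame_desc, ref_desc)
            # Count strong feature matches; more = more likely the same view.
            score = sum(1 for m in matches if m.distance < 40)  # 40: arbitrary cutoff
            if score > best_score:
                best_heading, best_score = heading, score
        return best_heading

    # Hypothetical usage with four reference captures:
    # estimate_heading("frame.jpg", [(0, "north.jpg"), (90, "east.jpg"),
    #                                (180, "south.jpg"), (270, "west.jpg")])

A production system would match against Street View-scale imagery with far more robust geometry, but the shape of the problem (turn pixels into a heading) is the same.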

Google AI focuses on vision

“Vision is a fundamental shift in computing for us, and it’s a multi-year journey,” she shared. It deepens Google’s mission to be an artificial intelligence-centered company, which CEO Sundar Pichai announced a year ago at the previous I/O developer conference, saying Alphabet was betting the company on AI. Looks like he meant it. Google has made incredible progress since, including Duplex, which books appointments for you over the phone with the most human-sounding artificial intelligence ever to place a call. Alphabet has an entire arm devoted to AI investments, too, and its progress on artificial intelligence vision is just as impressive.

The new Google Lens does three things that can save you serious time:

1)    Orienting you faster. Google has solved what’s referred to as the “blue dot problem.” You know, you’re in an urban environment and you get directions through Maps, but you can’t tell whether your first turn on the grid is to the right or to the left. See how Google Lens makes this a snap in the video below:

2)    Recognizing words. Snap a photo of a menu, a sign, or a document, then highlight text in the image to have Google translate it or look up information. “Lens is not just understanding the shape of characters and the letters visually–it’s actually trying to get at the meaning and context of these words,” says Chennapragada. (Developers can tap similar capability themselves; see the sketch after this list.)

3)   Making personal recommendations. Maps will also use your history across the Google platform to suggest information you may care about right where you are. As in, “Hey, surprise! Your old boyfriend lives here now!” Just kidding. Seriously: a new tab called “For You” tells you about places and events in your area, tailored to Google’s knowledge of you (which is vast). For example, a feature called “Match Score” recommends restaurants by predicting how much you’ll love the food.
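
As for the text recognition in item 2, Google already exposes similar OCR capability to developers through its Cloud Vision API. Here is a minimal sketch; it assumes a Google Cloud project with credentials configured, and “menu.jpg” is a placeholder file name.

    # A sketch of Lens-style text recognition via Google's Cloud Vision API.
    from google.cloud import vision  # pip install google-cloud-vision

    def read_text(image_path):
        client = vision.ImageAnnotatorClient()
        with open(image_path, "rb") as f:
            image = vision.Image(content=f.read())
        response = client.text_detection(image=image)
        if response.error.message:
            raise RuntimeError(response.error.message)
        # The first annotation is the full detected text block; the rest
        # are individual words with bounding boxes.
        annotations = response.text_annotations
        return annotations[0].description if annotations else ""

    print(read_text("menu.jpg"))

Translating or looking up the extracted text would be a separate call from there; Lens fuses those steps into a single camera view.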

Coming soon, Google Lens helps find gear

The new Google Lens can help you find similar products, too. Want a cheaper version of that high-end coffee mug you just spotted? No problem–Google Lens can bring you options.

“Sometimes your question is not ‘what’s that thing’–instead it’s, what’s like that?” Chennapragada pointed out at I/O. Lens is able to match similar couches, similar crackers, and similar cars. “Lens has to search through millions and millions of items, and we kinda know how to do that–search,” she said with a smile. “We’re using on-device intelligence and cloud TPUs. We want to overlay the live results directly on top of things like street signs, even a concert poster . . . This is an example of not just how the camera answers questions but putting the answers right where the questions are.”
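
Google hasn’t detailed the “what’s like that?” pipeline, but visual similarity search is commonly built as embedding-based nearest-neighbor lookup: encode each product photo as a vector with a neural network, then rank a catalog by how close each vector sits to the query. A minimal sketch of that general technique, using a pretrained ResNet from torchvision as a stand-in encoder (none of this is Google’s actual system):

    # Illustrative similarity search: embed images with a pretrained CNN,
    # then rank a catalog by cosine similarity to the query photo.
    import torch                                # pip install torch
    import torchvision.models as models        # pip install torchvision
    import torchvision.transforms as T
    from PIL import Image

    # ResNet-18 with its classifier head removed yields 512-d embeddings.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()
    backbone.eval()

    preprocess = T.Compose([
        T.Resize(256), T.CenterCrop(224), T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    @torch.no_grad()
    def embed(path):
        x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        v = backbone(x).squeeze(0)
        return v / v.norm()  # unit length, so a dot product is cosine similarity

    def most_similar(query_path, catalog_paths, top_k=3):
        query = embed(query_path)
        scored = [(float(query @ embed(p)), p) for p in catalog_paths]
        return sorted(scored, reverse=True)[:top_k]

    # Hypothetical usage:
    # most_similar("mug_photo.jpg", ["mug_a.jpg", "mug_b.jpg", "sofa.jpg"])

At Google’s scale, the exhaustive loop above would become approximate nearest-neighbor search over a precomputed index, which is where the “millions and millions of items” remark comes in.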

Artificial intelligence that understands the relevance of what it sees

Google’s new suite of artificial intelligence vision capabilities is fueled by a desire to help computers see the world more like we do. Google continues to launch and test products that mine that idea, like VR Tours:

One of the most powerful uses of VR I see is as a tool for empathy — to look at the world through another person’s eyes, to see and share your story. Now you can easily create and share VR Tours that do just that. #TourCreator #googleio2018 https://t.co/miHJl0LtU2

— Aparna Chennapragada (@aparnacd) May 10, 2018

There’s no question we are highly visual creatures; human vision is an extension of the brain. By integrating vision with meaning in these early products, Google’s artificial intelligence is taking a major step forward. This new paradigm has implications for everything you rely on your eyes for. And with Google’s developer integrations, visual positioning and integrated meaning will become a layer that developers and entrepreneurs can use to build all sorts of new products. Get ready to see a whole new world a whole new way.
