- At Google’s I/O conference earlier this month, the company announced that its new augmented reality (AR) feature for Google Maps was now available on its Pixel smartphones.
- Business Insider tried out the feature earlier this year and found it incredibly helpful.
- Using your phone’s camera, Google Maps identifies your location and places signs and arrows on your screen to show you where you need to go.
- Below, we describe what it was like to use the new Google Maps AR feature on the streets of San Francisco.
- Visit BusinessInsider.com for more stories.
Anyone visiting a new city knows the confusion of emerging from a subway station and trying to figure out in which direction to turn. Even with your maps app open, orienting yourself in an unfamiliar place can be difficult and lead to some missteps.
That’s why Google’s new augmented reality (AR) feature for Maps is so helpful. Using your phone’s camera, Google Maps identifies your location and places signs and arrows on your screen to show you where you need to go.
At Google’s I/O conference earlier this month, the company announced that the Maps AR feature was now available on its lineup of Pixel smartphones. Google has not yet said when the feature will be rolled out more broadly, but it’s now in the hands of real people, using real Pixel smartphones.
In February, the Google Maps team gave Business Insider the chance to try out its new AR feature, and ever since, we’ve been eagerly awaiting its arrival on more devices.
Here’s what it was like to use the new Google Maps AR feature on the streets of San Francisco:
We met members of the Google Maps team at Rincon Park, which is right near the San Francisco-Oakland Bay Bridge. We decided we could all use some caffeine, so to test out the new AR feature, we headed to Blue Bottle Coffee on Sansome Street.
We plugged the destination into Google Maps to begin the journey.
This is where the AR magic happens. When you hold your phone up to eye level, the standard map shrinks into a small circle at the bottom of the screen. The majority of your screen shows you the real world that’s directly in front of you, as if you were looking through the camera.
But this is an “augmented” version of the real world. After a couple of seconds to process my location, the screen displayed big arrows, layered on top of the view, that pointed me in the right direction to start my walk.
But I wasn’t able to hold the phone up for too long. For safety purposes, an alert pops up after a few moments of using the AR feature, telling users to put their phones down while they walk.
And it’s a good thing that I did. My path was filled with electric scooter riders — an AR accident waiting to happen!
Google said it purposely only displays the arrows during “moments of confusion” so that users don’t walk around with their phones in the air, oblivious to those around them.
Those moments of confusion can include when users first start on their journey (like when exiting the subway), when a turn is approaching, or when arriving at a destination.
Still, it was hard not to hold up the phone to check out the new feature and graphics. I did sometimes notice that, depending on the angle at which I held the phone, the arrow animations got a little wonky.
But for the most part, the arrow graphics worked quite nicely.
Rachel Inman, user-experience lead on the project, told me that her team went through multiple iterations before deciding on the arrow graphics. At one point they tried a blue line, similar to the one used on the 2D map, projected onto the view of the real path in front of the user (think “follow the yellow brick road”). However, tests found that users would try to walk exactly on top of the blue line as they followed their route.
Arrows seem to offer the right amount of guidance.
The “proper” way to use the feature is to walk with your phone down when you know you’re on the right path. This will result in the standard Maps view.
When you need to decide where to turn next, you can tilt the phone upwards. Google Maps then tries to understand your location based on the imagery around you.
That’s right. Along with the fancy new AR feature, Google is also trying to improve the accuracy of your location by using Visual Positioning Service (VPS), Street View, and machine learning.
Google calls this technique “global localization.”
Today, the company relies on GPS and compass tools, but it says that both have limitations, especially in dense urban environments, where metal structures and crowds of nearby phones can throw off the readings.
The technicalities of how Google pinpoints your location can be confusing, but the team says if you’ve ever struggled with the “blue dot” problem — where the blinking blue dot on a map is across the street or on a different block from where you actually are — then you understand the problem they’re trying to solve with imagery and machine learning.
The feature works best if you point the phone toward more permanent structures, like office buildings or cafes, rather than shrubs and bushes, which change shape and color with the seasons.
The multicolor dots around the building are meant to let you know that Maps’ visual recognition system is busy trying to identify the object in front of it.
Once Google knows exactly where you are, it displays arrows on your screen to make sure you’re on track and heading in the right direction.
Once you arrive at your destination, a pin-drop will appear with a fun animation to accompany it. Since we were heading to grab a coffee, Google Maps also pulled up information about Blue Bottle in case we wanted to check out photos and reviews before walking in.
After picking up a cappuccino, I turned in the Android demo phone, thanked the Google team, and headed back to the Business Insider office.
Not knowing exactly where I was in San Francisco’s Financial District, I felt myself longing for the AR maps feature.
Even though I knew I was generally heading in the right direction, the AR feature made it so easy to get a quick “sanity check” that I was correct.