This week, Bose made a surprise announcement that it was getting into the augmented reality game. But Bose makes headphones, right? And AR is all about glasses with visual overlays? Well, nobody told Bose, and that’s a good thing. The company believes the classic visual approach works fine for many things, but that it still presents barriers: the cost of dedicated hardware, battery life and so on.
Visual distractions also aren’t always appropriate, and sometimes all you need is relevant info — restaurant opening times, points of interest, for example — whispered in your ear. That’s what Bose is offering, and we (me and my colleague Cherlynn Low in the pictures and video above) tried it out for ourselves in downtown Austin at SXSW.
When Bose announced its AR intentions, it did so with a pair of sunglasses, not headphones. This might lead you to think there’s still a visual component, but there isn’t. The reason Bose chose a pair of specs is that a set of “smart headphones” would be predictable, and Bose wanted to shake things up a bit. So it put its technology in sunglasses to show that it can be used in any kind of head-worn wearable, opening it up to all sorts of possibilities.
When a Bose representative handed me a pair of the glasses, I asked if they used bone conduction for the audio; he said no. I slipped them on and instantly heard music. It had been playing before I put them on, but I hadn’t realized, as it was barely audible until the glasses were sitting on my ears. Bose says it developed a super-thin mini speaker, designed with this specific project in mind, that “projects” audio into your ears.
I’ll be honest: Cherlynn and I were both pretty impressed with the idea of music-playing sunglasses alone, and we hadn’t even moved on to the AR demo yet. The sound quality was very impressive, and there was a built-in microphone for answering calls. The glasses were 3D-printed prototypes, but they were still light and comfortable to wear.
The AR element works thanks to a nine-axis IMU (inertial measurement unit) that, in combination with your phone’s GPS, knows where you are and exactly which direction you’re looking in.
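For a sense of the geometry involved, here’s a minimal Python sketch of that idea. Everything in it (the function names, coordinates and POI list) is invented for illustration, since Bose hasn’t published its API: given a GPS fix and an IMU-derived compass heading, it picks out the business closest to the center of a narrow gaze cone.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def angle_diff(a, b):
    """Smallest signed difference between two headings, in degrees."""
    return (a - b + 180) % 360 - 180

def poi_in_gaze(user_lat, user_lon, heading, pois, fov_deg=20):
    """Return the POI nearest the center of the gaze cone, or None."""
    best = None
    for poi in pois:
        off = abs(angle_diff(bearing_deg(user_lat, user_lon, poi["lat"], poi["lon"]), heading))
        if off <= fov_deg / 2 and (best is None or off < best[0]):
            best = (off, poi)
    return best[1] if best else None

# Made-up example: standing on Rainey Street, facing roughly south-east (135 degrees).
pois = [
    {"name": "El Naranjo", "lat": 30.2594, "lon": -97.7384},
    {"name": "Apartment block", "lat": 30.2600, "lon": -97.7390},
]
print(poi_in_gaze(30.2598, -97.7390, 135, pois))  # -> the El Naranjo entry
```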
Before we headed out into the world, Bose played us some example audio with local information, such as opening times, and demonstrated direction-specific cues played in one ear only (“to your left is the train station,” for example). Those ideas are already somewhat possible with a phone and headphones; the point here is that you’ll be able to look at something and call up information about it on request.
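That one-ear trick is simple to picture in code, too. Here’s an equally hypothetical sketch (play_in_ear and its gain values are mine, not Bose’s): take the bearing of a landmark relative to where you’re facing and route the spoken cue to the matching channel.

```python
def play_in_ear(message, relative_bearing_deg):
    """Pan a spoken cue left or right based on where the target sits."""
    if relative_bearing_deg < -15:      # well off to the left
        left, right = 1.0, 0.0
    elif relative_bearing_deg > 15:     # well off to the right
        left, right = 0.0, 1.0
    else:                               # roughly straight ahead
        left = right = 0.7
    print(f"{message!r} -> L gain {left}, R gain {right}")

# "To your left is the train station": the station bears -90 degrees from the gaze.
play_in_ear("To your left is the train station", -90)
```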
To test this for real, Bose took us out onto Austin’s bustling Rainey Street, a lively spot filled with quirky bars and eateries. At the top of the street, I looked at a block of apartments and double-tapped the side of the glasses (the gesture programmed to call up info for our demo). Initially, I was told there was no information available. But then I turned around, looked at a restaurant called “El Naranjo,” double-tapped again, and was told the name, the chef, where they trained, the opening hours, how long people typically stay and the type of cuisine (Mexican). I repeated this all the way down the street, looking at different businesses, and the glasses responded with impressive accuracy.
Of course, this information was just a demo created by Bose; it’s the technology that’s important. All I can say is that it worked pretty well. Only once did I get info on a bar next to the one I was actually looking at, and that was fixed by slightly turning my head to bring my target to the center of my gaze. Oh, and all the while, I had music playing in my ears, which would dip in volume as information was served up. Bose said that when this technology is used in actual headphones with noise cancellation, developers will be able to focus your attention on alerts and the like by “turning off” the ambient noise around you, making sure you hear the important details.
And that’s a key point: Bose isn’t trying to invent everything itself (though it does, of course, plan to use the technology in its own headphones). It wants product makers, app developers and creators to use it however they want. Training apps could use it to tell you where popular cycling routes are, or even where other runners are relative to you during a race. Travel info and reviews are natural fits, of course, but it could just as easily be applied to games, language learning and beyond.
To encourage companies to adopt Bose AR, the audio firm has set aside a $50 million pool to entice developers. So whether you’re working on a dating app, a food-delivery service or anything else that could profit from location-specific information, know that Bose appears to be serious about taking this mainstream.
Audio and AR aren’t entirely strangers: companies like Harman and Here, and games like good ol’ Pokémon Go, have all dabbled in augmenting the sound in our environment. What Bose seems to be doing differently is making it useful and ubiquitous. Because the system knows what you’re looking at and responds to gestures (touch, voice or a nod, for example), you can interact with apps intuitively without looking at your phone. Whether giants like Google (it’d be perfect for Pixel Buds) or Apple can easily replicate the technology remains to be seen.
It’s worth noting that the demo we were given isn’t an indicator of what it might actually be like in real life. The world is big, maps are inaccurate, and sensors can be fooled and confused. But it’s a promising start. If Bose can lure those developers over, and get its platform into a variety of devices, simply looking at something could be the go-to way of learning about the world.