#753: AR Spatial Audio on Bose AR Frames & QuietComfort 35 Headphones

The Bose AR platform, with the Bose AR Frames and the QuietComfort 35 headphones, was one of the most exciting announcements at SXSW. The Frames are a wearable that features spatialized audio and serves as a Bluetooth headset, with an accelerometer, gyroscope, and magnetometer that, when paired with the GPS on your phone, can detect where you are and where you're looking. Those are enough of the key ingredients to start to create an augmented layer of spatialized audio that iOS, Android, and Unity app developers can start to target. A number of different head and body gestures can also be detected, including push-ups, squats, a "Sup?" nod, a shake, a double tap, looking up, looking down, spinning around, and rolling your head around, along with lots of other potential gestures that could be trained through machine learning.

I had a chance to talk with Michael Ludden of Bose AR developer relations about the evolution of their AR platform, what types of apps were being launched at SXSW this year, and how Bose is pushing an audio-first layer of augmentation that pulls people out of their screens so that we can be heads up, hands free, and more IN the world around us.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. So some of the biggest news in the future of immersive media that came out of South by Southwest was from the Bose AR house, where they had the Bose AR Frames and the QuietComfort 35 headphones, where there's a firmware update so that you can actually turn these noise-canceling headphones into a spatial audio platform to be able to interface with these different Bose AR applications that are now being released. So at the Bose AR house, they had all these different stations with different experiences that were showing what you could do with the Bose AR platform. And to me, it seems like it's going to be one of the interim steps between where we are now and giving people a taste and experience of immersive media, before we eventually get more and more senses into this immersive reality that is on the horizon. This whole spatial computing revolution is going to eventually have all of our senses, but I don't necessarily expect that we're going to be walking around with AR headsets within the next couple of years. I think that's maybe a long-term vision. A lot of the visual aspect of AR, I think, is going to be in specific contexts to solve specific problems within your work, but I don't expect that people are going to be wearing AR headsets all the time. Maybe we'll start to see that with Snapchat and the Snap Spectacles. That's one of the first wearables that is really encouraging people to put these devices on their face. And I think the Bose AR Frames are actually one of the next steps in that evolution, where it's not necessarily about having a camera and recording people. It's more about putting spatialized, directional sound into your ears, very much like if you've tried out the Oculus Go, but a little bit more robust in terms of the spatialized audio that's coming from these glasses. But it's also a Bluetooth headset, so you can use it to interface with your phone and have conversations. And because it has all these different sensors that can detect your head pose, what cardinal direction you're looking at, and your velocity, having that on your face is opening up all sorts of new possibilities. So I had a chance to talk to Michael Ludden. He's on the developer relations team at Bose, focusing on augmented reality. And we really do a deep dive into the platform, what's available for developers, what types of applications you could start to build given all these technical capabilities, and some of the early examples of what they were showing there at South by Southwest. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Michael happened on Sunday, March 10th, 2019 at South by Southwest in Austin, Texas. So with that, let's go ahead and dive right in.

[00:02:51.967] Michael Ludden: So my name is Michael Ludden. I'm on the developer relations team at Bose focusing on augmented reality.

[00:02:57.492] Kent Bye: Great. So maybe you could tell me a bit about what's being launched here at South by Southwest.

[00:03:00.935] Michael Ludden: Yeah. So last year, we kind of announced the Frames form factor, and we were showing off a 3D-printed model. We didn't have any live demos, but we had a video showing people what it might look like using gestures, et cetera. This year, we're actually announcing the developer platform. We've announced that we're in beta, and it's available if you go to developer.bose.com. We also have the actual Frames, which are audio sunglasses, basically Bluetooth audio speaker sunglasses that have Bose AR in them, and I'll tell you what Bose AR is in a sec. But we're also talking about QuietComfort 35, so we're showcasing Bose AR stuff on QC35s, and we're also saying that if folks have a manufacture date on their QC35s of after July of last year, it likely has Bose AR in it, meaning it can consume Bose AR apps. And we're also showcasing the first small crop of Bose AR-enabled apps available in the iOS App Store.

[00:03:56.102] Kent Bye: Great. So it sounds like there are accelerometers in both the glasses and the headphones that are able to detect head pose. And because you're able to detect head pose, you're able to do triggers to see basically what you're looking at, or do spatialized audio. Maybe talk a bit about what's actually enabling that.

[00:04:10.530] Michael Ludden: Yeah, that's a really great, succinct description. So we have an accelerometer, a gyroscope, and a magnetometer. And you can get compass heading, directional heading of where a person's looking while wearing the devices. You can do gesture support, so head nod, head shake, double tap. We're showing off a few apps here. There's a New Balance demo where you do jumping jacks. You can do all sorts of custom gestures. And we're trying to showcase a few different categories of activity-based apps that you can use Bose for. So there's a Headspace demo where it helps you stretch your neck before you do your meditation. There's something called Otocast, which does walking tours, and they give you directional sound, so it'll ping you off to the right, and if you face it and double tap, it'll tell you the name of the tour and perhaps the name of the venue, and you can sort of have an audio interface for consuming that as you walk around. And so those are sort of the building blocks for the platform. And in terms of what a Bose AR app looks like, it's essentially a mobile app. We have three SDKs at the moment, one for iOS native, one for Android native, and then Unity for cross-platform.
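
To make the sensor stack concrete: here's a minimal sketch, in Python rather than any of the actual SDKs, of how a compass heading can be derived from accelerometer and magnetometer readings alone. The axis conventions and function name are illustrative assumptions, not Bose's API.

```python
import math

def tilt_compensated_heading(accel, mag):
    """Derive a magnetic compass heading (degrees, 0 = magnetic north)
    from raw accelerometer and magnetometer vectors.

    accel, mag: (x, y, z) tuples in the device frame. The axis
    conventions here are illustrative; a real SDK defines its own frame.
    """
    ax, ay, az = accel
    # Pitch and roll estimated from the gravity direction.
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)

    mx, my, mz = mag
    # Rotate the magnetic vector back into the horizontal plane.
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))

    heading = math.degrees(math.atan2(-yh, xh))
    return heading % 360.0  # normalize to [0, 360)

# Example: device held level, field along +x -> heading 0.0 (north).
print(tilt_compensated_heading((0.0, 0.0, 9.81), (30.0, 0.0, -20.0)))
```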

[00:05:09.496] Kent Bye: All right, so when I was at IDFA in Amsterdam, Lauren Hutchinson and the rest of her team at Tomorrow Never Knows created an experience called Pilgrim, which at the time they weren't able to say was using Bose AR glasses, but they were using Bose AR glasses. And it was recorded audio of people walking. And then as you were walking, you were able to basically have this interactive audio experience. If you stopped, then it would detect that and then bring another pilgrim along that you could walk along with. And so maybe you could walk through all the stuff that you can detect as you're walking around with these glasses.

[00:05:40.537] Michael Ludden: So since I've joined Bose, I've learned so much more about sensor data and how that works. Essentially, the Bose AR devices can detect motion, so if you're accelerating forward, for example. The phone provides the GPS capabilities, so if you're doing a walking tour and locating where you are in the world, that would be coming from the phone. But you can do some velocity-based triggering of events in an application that says, OK, we know this person stood up, or this person moved forward, or this person bent over, or whatever. And you can trigger different things in the experience, like the Pilgrim demo did.
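
Here's a toy sketch of what velocity-based event triggering like this could look like: integrate forward acceleration into a leaky velocity estimate and fire events when it crosses thresholds. The thresholds, decay constant, and event names are made-up illustrative values; a real app would tune these against actual sensor data.

```python
def make_motion_trigger(start_thresh=0.5, stop_thresh=0.1, decay=0.95):
    velocity = 0.0
    moving = False

    def update(forward_accel, dt):
        """forward_accel: m/s^2 along the walking direction; dt: seconds.
        Returns 'started_moving', 'stopped_moving', or None."""
        nonlocal velocity, moving
        # Leaky integration keeps accelerometer drift from accumulating.
        velocity = velocity * decay + forward_accel * dt
        if not moving and abs(velocity) > start_thresh:
            moving = True
            return "started_moving"
        if moving and abs(velocity) < stop_thresh:
            moving = False
            return "stopped_moving"  # e.g. Pilgrim bringing a walker alongside
        return None

    return update

trigger = make_motion_trigger(decay=0.5)
for accel in [0.0, 2.0, 2.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]:
    event = trigger(accel, dt=0.5)
    if event:
        print(event)  # prints started_moving, then stopped_moving
```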

[00:06:12.992] Kent Bye: And I just did a demo where we were walking around. I had the Bose AR Frames on, I would double tap based upon where I was with the geolocation, and then it would pull the Yelp information to see if there was a restaurant around at that location. I got the sense that it doesn't have any camera; it's not actually looking at what I'm looking at. It's just using that GPS data to correlate information. So it seems like you're able to start to use the GPS as a locator to see where you're at and then trigger different audio experiences from that.

[00:06:42.658] Michael Ludden: Yeah, so it's a combination of GPS from the phone and then also compass directional heading from the Frames. So it can know if you're looking at the building on the right or the building on the left. This is really the first crop of experiences that we have, and we think it'll get more refined over time. So right now it's kind of like a large target, right? If there are too many businesses on top of each other, it might think you're looking at one instead of another. But we think that's going to get more and more sophisticated, both as we improve the tools available for developers and as developers get smart with the heuristics. It's sort of like what Google Maps does: it takes directional heading, acceleration from the gyroscope, Wi-Fi, Bluetooth, all the different signals it can collect, and then it makes an educated guess, especially when you're driving, as to what lane you're in and when you're close. It doesn't actually know with that level of precision, but a user would never know that, nor do they care, right? And so we think it's going to get more and more towards that as we go along.
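
The geometry behind "which building are you looking at" is just a bearing comparison: compute the bearing from the user's GPS fix to each candidate point of interest, and pick the one closest to the head's compass heading. A hedged sketch, with hypothetical venues and a made-up angular tolerance:

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def facing_poi(user_lat, user_lon, heading, pois, tolerance=20.0):
    """Pick the point of interest whose bearing best matches the user's
    head heading (from the Frames), within a tolerance in degrees."""
    best, best_err = None, tolerance
    for name, lat, lon in pois:
        bearing = bearing_to(user_lat, user_lon, lat, lon)
        # Signed angular difference folded into [0, 180].
        err = abs((bearing - heading + 180) % 360 - 180)
        if err < best_err:
            best, best_err = name, err
    return best

# Hypothetical venues near a street corner in Austin.
pois = [("Taco Stand", 30.2676, -97.7426), ("Coffee Bar", 30.2670, -97.7420)]
print(facing_poi(30.2672, -97.7431, heading=50.0, pois=pois))  # Taco Stand
```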

[00:07:56.000] Kent Bye: I see. And so maybe you could tell me a bit about Radar as a tool that is going to enable people, without having to code, to put these audio files embedded into the world, where they can be triggered with either Bose AR or other augmented reality experiences powered by Bose.

[00:08:10.847] Michael Ludden: Yeah, so the creator tool is a web interface that lets anyone, doesn't have to be a developer, create pins with directional audio experiences in space, in the world. And then those pins appear via an app that we've deployed to iOS called Radar. And so we're working with brands and partners to create content experiences all over the place and also create an ecosystem online where it's easily configurable. It's kind of a WYSIWYG tool where anybody can create content.
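
As a rough mental model of what a geo-pinned audio experience might look like under the hood, here's a hypothetical pin structure and a proximity check. The field names, trigger-radius scheme, and URL are pure assumptions for illustration; Bose hasn't published Radar's actual data format.

```python
import math
from dataclasses import dataclass

@dataclass
class AudioPin:
    """One hypothetical geo-pinned audio experience, as a Radar-style
    creator tool might store it. Field names are illustrative."""
    name: str
    lat: float
    lon: float
    radius_m: float  # trigger radius around the pin
    audio_url: str

def pins_in_range(user_lat, user_lon, pins):
    """Return pins whose trigger radius contains the user's position,
    using an equirectangular approximation (fine at city scale)."""
    hits = []
    for pin in pins:
        dx = (math.radians(pin.lon - user_lon)
              * 6371000 * math.cos(math.radians(user_lat)))
        dy = math.radians(pin.lat - user_lat) * 6371000
        if math.hypot(dx, dy) <= pin.radius_m:
            hits.append(pin)
    return hits

pin = AudioPin("Mural story", 30.2672, -97.7431, 25.0,
               "https://example.com/mural.mp3")
print([p.name for p in pins_in_range(30.2671, -97.7430, [pin])])
```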

[00:08:36.610] Kent Bye: So it sounds like you're working with brand partners at first. Is this something that's going to be generally available for people to be able to upload whatever they want all over the world?

[00:08:43.414] Michael Ludden: So it's not available yet. We're saying later this year we're targeting it to be available. And I can't actually speak more specifically about that. But yeah, we're working on that.

[00:08:51.435] Kent Bye: People talk about the mirror world as you walk around. And I imagine that GPS is going to be like the lingua franca of where you're actually at. But there's the open web, there are apps that people have, and there are going to be ways of curating what you actually want to experience within this mirror world out in the world. So it sounds like you're starting with Radar as this centralized app that people could tap into with their AR glasses. And when they're using this specific hardware, they're able to experience these Bose-curated experiences, but eventually maybe experiences curated by everybody. But I imagine that people are going to have different apps, maybe running simultaneously, that are looking at where you're located and seeing if there are notifications. And if there's something there, then maybe you double tap on the glasses, and then you're suddenly launched into an experience.

[00:09:38.324] Michael Ludden: Yeah, so I mean, of course, we want to avoid the overlapping, grabbing-your-attention, dystopian situation. And I think one of the interesting architectural choices that we made early on, at least on iOS, and this is a platform limitation as well, but it works in our favor, is that if you're using the SDKs I mentioned and you're a developer building your own app that's Bose AR enabled, you obviously have to keep the app open on your phone in the background. And then that's running, and that will be the thing that is determining what your Bose AR experience is. We don't recommend you have multiples open. So Radar is one app that we put out there. But Headspace, which we're showing off here, has their own app. So does New Balance. So do all the demos we have here. They're their own self-contained apps. And in order to use them, you basically open the app, lock the phone, put it in your pocket, leave the app open, and that's your Bose AR layer, to your point, that you're experiencing there.

[00:10:25.180] Kent Bye: Well, I think one of the big open questions when it comes to audio is that there's not really an agreed-upon standard for spatialized audio. There's ambisonics out there, and I understand that you're working with Mach 1, so maybe you could talk a bit about the spatialized audio solution you're working on with Mach 1.

[00:10:40.340] Michael Ludden: Well, um... I actually can't, because I don't know many details about that one, so I'm not going to touch the Mach 1 question, but that's interesting. I do know that, at least in Unity, Unity automatically spatializes sound, so it's really easy for developers to place sound on a game object, and wherever that appears in 3D space, it'll sound like it's coming from there on the Frames or their QC35s. But there are other third-party tools that developers are using to further increase the accuracy of the spatialization. And I think what you're going to see going forward from us, this is what I can say, is we're going to be supporting developers more directly with spatialized audio solutions that we think are optimal for Bose AR-enabled experiences.
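
The world-locked effect he describes, where a sound stays put while your head turns, comes from re-rendering the source from its angle relative to the head on every sensor update. Here's a deliberately crude sketch that uses a constant-power stereo pan in place of a real HRTF spatializer; the function name and gain law are illustrative only.

```python
import math

def world_locked_gains(source_bearing, head_yaw):
    """Crude stereo rendering of a world-anchored sound.

    source_bearing: direction of the sound in world coordinates (degrees).
    head_yaw: listener's compass heading from the Frames (degrees).
    Returns (left_gain, right_gain) from a constant-power pan. Real
    spatializers use HRTFs, but the world-locking idea is the same:
    re-render from the *relative* angle every time the head moves.
    """
    relative = math.radians((source_bearing - head_yaw + 180) % 360 - 180)
    pan = math.sin(relative)         # -1 = hard left, +1 = hard right
    angle = (pan + 1) * math.pi / 4  # map pan to [0, pi/2]
    return math.cos(angle), math.sin(angle)

# The source sits due east (90 deg). Facing north, it's on the right;
# turn to face east and it centers; keep turning and it moves left.
for yaw in (0, 90, 180):
    left, right = world_locked_gains(90, yaw)
    print(f"yaw={yaw:3d}  L={left:.2f}  R={right:.2f}")
```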

[00:11:21.130] Kent Bye: And can you talk a bit about the price points of these different pieces of hardware that people would need to buy in order to start to have these spatialized experiences with Bose AR?

[00:11:28.858] Michael Ludden: So this is the cool thing. Frames are $199. They're $200, basically. But the QC35s, they're $350. That's the normal price for the QuietComforts that everybody already has. And in fact, if anybody's bought them after July of last year, there's a good chance they already have Bose AR in them, and maybe they didn't know it. And so we've just pushed out a firmware update to coincide with South by Southwest, so that if you update, you can actually use the Bose AR apps that are on iOS already.

[00:11:51.612] Kent Bye: And so, yeah, maybe talk a bit about some of the other apps that are being launched here.

[00:11:55.963] Michael Ludden: Yeah, so also I should say we are going to launch other form factors. We're going to be putting Bose AR on a bunch of devices, and we've committed to a target developer platform of over a million devices in the hands of consumers by the end of 2019. So it's going to pretty quickly become one of the largest XR target platforms out there. And in terms of the other apps that are available, let me see, we have an app called Comrade AR, which is kind of like a spy espionage interactive story experience. That's available in the App Store now. There is the tour app I mentioned, Otocast. O-T-O-C-A-S-T. That is a walking tour solution that has Bose AR in it. That's available now. Earplay is another fun app to try. It's easy to demo if somebody wanted to download it. That's probably one of the quicker ones for getting the aha moment. Earplay, E-A-R, play. And that's available in the App Store now. Let's see what else. Audiojack. Audiojack is available, thank you. And also NaviGuide is available, so that's what I think you were referring to, where you double tap and it pulls in information about the restaurant. That one starts to show you a little bit of the utility, like something I might actually want to do if I just want to walk around a street during South by Southwest, leave my phone in my pocket, and experience all the madness, you know? So those are a few that are available now.

[00:13:04.632] Kent Bye: Are the Bose AR glasses something that you've been able to wear around or use in your day-to-day life? I'm just curious how you've been using them.

[00:13:10.888] Michael Ludden: Yeah, so I actually moved to Austin about six months ago from the Bay Area, so I'm a local here, and I scooted here on a Lime, and it was sunny, so I just put on my Bose Frames, and I was listening to music. If somebody had called me saying they needed me to be somewhere or something, I could have taken the call just by double tapping. And that was, I mean, that's a really great experience. I was hearing the sound, you know, it doesn't have a lot of sound leak, so I'm just scooting around on my little scooter with my sunglasses, listening to sound. It's great.

[00:13:37.842] Kent Bye: Well, that was the thing that really surprised me, because it seems like there's some sort of directional audio that's almost projecting the audio into my ear, but other people can't really hear it so much, while I can hear it really well. How does that work?

[00:13:50.273] Michael Ludden: It's pretty ingenious. So it's some new technology we're going to be putting in some other devices. We're thinking of it as a separate use case and category from the noise-canceling stuff. So it's open-ear, and it is directional sound. It basically directs the sound right into your ear canal, right? And there's very little sound leak. They've done a lot of smart engineering to shield the sound and try to reduce that. But the point is that I could still hear if somebody was yelling at me out of the corner of my ear, or if I needed to have a conversation. And it's kind of cool tech, yeah. You'll start to see it appear in more devices.

[00:14:18.810] Kent Bye: So if I was on a phone call and someone was standing right next to me, would they be able to stick their ear up to my ear and hear it? Or what's the limits of how other people can hear what's happening with what I'm listening to?

[00:14:31.663] Michael Ludden: So music, I would say, if it's a very quiet environment, yes. If it's an ambient environment like this or anywhere like a restaurant, not so much. If it's a phone conversation, if somebody is talking, unless you're in just like a dead zone where there's no other sound, you really have to be creepy and get right up to somebody's ear to understand what they're saying. You might hear a little, you know, if you're very close to them, but it's actually not bad. And it's designed for sort of social environments as well, or environments where there's noise, you know, you're at the beach or whatever.

[00:14:56.178] Kent Bye: So if you can take phone calls, it must have a microphone then as well, right?

[00:14:58.739] Michael Ludden: It has a microphone, yep.

[00:15:00.407] Kent Bye: So what kind of social experiences could you imagine that this might be able to enable?

[00:15:04.349] Michael Ludden: I mean, this is why we need developers. The sky's the limit. We've got a lot of building blocks. I mean, on the one hand, the frames are a Bluetooth headset. So you can do whatever you can do with a Bluetooth headset that has a microphone. On the other hand, we have all these cool sensors. So you can do gesture recognition. You can understand what direction a person's looking. You can build a mobile app that takes advantage of GPS, like we talked about. And really, honestly, we need developers to come up with really cool stuff and the killer use cases that are going to be useful to consumers.

[00:15:29.634] Kent Bye: So when I was doing the Earplay demo, they had a number of these gestures. One was like, look to the left. Another was, don't move, because I was moving, and they were able to detect that I was moving. A 360-degree spin, or ducking. What are the other embodied gestures that you could use as input?

[00:15:44.383] Michael Ludden: So here's how I can say it. I can tell you what I've seen developers build already. This isn't what you're limited to. I'm sure there are some we haven't thought of yet, but I've seen push-ups. I've seen squats. I've seen sup, which is just like a, you know, a bro-dude kind of lift-your-chin-briefly thing, right? Nod, shake, double tap, right? Like you said, turn around, look up, look down, you know, these sort of building blocks. I'd love to see some more interesting ones. The cool thing I would say, if there are any developers listening to this, is that people who buy the devices don't always know what the capabilities are and aren't. So just because you're saying do a jumping jack, and you know that we don't have a tracker on your arm so we don't know if you raised your arm, they don't know that. So to a certain extent it can be kind of magic for a user, right? You can just use simple heuristics to ask them to do things that are more involved, and they won't know the difference. And also, if you're using machine learning algorithms with neural networks to do gesture detection, you might even be able to get granular enough to where you can understand when a person is doing a jumping jack, based on a slight variation in movement, versus just jumping up and down. So there's really just a lot of green field, and I'm excited to see what people build.
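
As a flavor of the simple heuristics he mentions, here's a toy nod detector over a stream of head-pitch samples. The thresholds and window length are invented for illustration; a production recognizer would be tuned on real sensor data or learned, as he suggests.

```python
def detect_nod(pitch_samples, down_thresh=-15.0, up_thresh=-5.0, max_len=20):
    """Return True if the pitch trace (degrees, negative = head down)
    dips below down_thresh and recovers above up_thresh within
    max_len samples: a quick down-and-up nod, not a sustained look-down."""
    for i, p in enumerate(pitch_samples):
        if p < down_thresh:                         # head dipped down
            window = pitch_samples[i:i + max_len]
            if any(q > up_thresh for q in window):  # ...and came back up
                return True
    return False

nod = [0, -4, -12, -22, -18, -8, -2, 0]        # quick dip and recover
look_down = [0, -6, -14, -25, -26, -25, -26]   # stays down: not a nod
print(detect_nod(nod), detect_nod(look_down))  # True False
```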

[00:16:50.051] Kent Bye: I think one would be look north, which would require people to know where they're oriented, but because it has a compass, you could start to have people look in very specific directions.

[00:16:58.961] Michael Ludden: Yeah, so a note on that. I don't want to oversell this. So it has a magnetometer, and you can derive magnetic north. Right now, every time the device starts up, there's a brief calibration period where you wave it around, and then it will snap to magnetic north. But magnetic north is not true north. So you have to account for magnetic declination. It's totally doable. We have a bunch of partners, as you've tried, who have already solved that problem. And we're looking to make more robust developer tools and documentation around that. But yeah, I don't want to just sell it like, oh, it's a compass. You know where somebody is looking instantly. But you can definitely do that.
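
The declination correction itself is one addition; the hard part is knowing the declination for your location, which in practice comes from a geomagnetic model such as NOAA's World Magnetic Model, stubbed out here as a constant:

```python
def true_heading(magnetic_heading, declination):
    """Correct a magnetic-north heading to true north.

    declination: degrees east of true north at the user's location,
    which a real app would look up from a model like NOAA's World
    Magnetic Model using the phone's GPS fix (stubbed here)."""
    return (magnetic_heading + declination) % 360.0

# Austin's declination was roughly +3 to +4 degrees east around 2019
# (approximate, and it drifts over time, hence periodic model updates).
print(true_heading(87.0, 3.5))  # -> 90.5, i.e. just past due east
```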

[00:17:30.563] Kent Bye: Okay, so True North is relative to where you're located on the Earth, and I think there are all sorts of magnetic maps that you have to update every now and again just to make sure that you're up to speed. I think the VR headsets had that issue when they first came out; some of the magnetic mapping was off. And so, yeah, it can impact different things like that. But I was talking to the owner of Audiojack, and he was saying that what he's excited about is that in the future, you're going to have all sorts of other wearables, whether it's shoes or watches, that can feed other types of biometric data or your footsteps into these apps, which are then able to overlay these interactive audio experiences on top of that.

[00:18:09.916] Michael Ludden: Yeah, I'm excited about that too. I often give the example of, okay, you're wearing Bose frames or QC35s, but you also have a phone that's in your pocket that's connected via Bluetooth, so you have now two things on your body that can help derive what you're doing, like inverse kinematics and things like that. And yeah, there's all sorts of other wearables. If you own an Apple Watch, that's another thing, right? So yeah, I think that's exciting. I don't even think you need to have full coverage, like every limb decked out in some sort of tracker. But yeah, we're starting to put more and more wearables on our body, and we're going to be able to do more and more things, mix and matching those technologies.

[00:18:42.737] Kent Bye: So what do you personally want to experience within immersive experiences?

[00:18:46.447] Michael Ludden: I'm really interested in a few things. First of all, I actually really am on board with the mission of audio first, taking us out of the screen. I think we're reaching critical overload with our visual cortex. I mean, I have a phone, I have a tablet, everyone's always showing me their phones, I have my PC, I'm looking at a TV right now, there are probably 50,000 other screens around here, and then we have wearables on your face that are going to give me metadata on what we see, and I love all that stuff. But I think there's also value in pulling people out of that and saying, there's no screen, this is audio. And I also just love the concept of moving beyond one specific sense, which is our most important sense that we dedicate a lot of brainpower to, and starting to combine and expand what augmented reality means. And I am specifically interested in the area of accessibility. I think there's something really unique here that we're doing that could enable scenarios like this. Imagine you're a senior citizen and you want to go to a movie, but the loud noises always make it hard for you to hear the dialogue. A company like Bose can't push a special version of the QC35 for that specific scenario, but an app developer can certainly build an app that maybe noise-cancels just the loud sounds and enhances the audio, or even, in combination with a screen reader, reads off what it is. Things like that, where I can now, as a senior citizen, go to a movie theater, open that app, turn it on, and I love my Bose because it can give me that functionality. The same goes for people who are hard of hearing, people who've lost eyesight, or people who were born blind. There are a lot of potential use cases for the Bose AR technologies, and I hope developers will target that so we can kind of flesh that category out.

[00:20:15.119] Kent Bye: Well, I'm curious to hear how Bose is thinking about immersion and immersive audio. Because without the head pose information, you've only been able to deliver things that are pretty linear and not necessarily spatialized. What is the spatialization giving you that you weren't able to do before? And how do you make sense of this level of immersion that's now possible?

[00:20:36.267] Michael Ludden: So, I mean, at its core, binaural audio, spatialized audio, these things can be done on all sorts of headsets, even non-Bose-AR-enabled Bose headsets. But I think where it starts to get interesting is when you can root the source of a sound in a particular location, and you can move your head and have it stay there. That's the kind of spatialization experience it can potentially enable. And if you close your eyes, like with the bunch of demos we have right to the left here, I don't know if you were able to try them with the Creator tool, but it's just incredibly immersive. If you close your eyes and you turn around, you're hearing the waves, and then you turn here and you hear the breeze and maybe some birds. There are all sorts of meditation applications for this sort of thing. But I think that is one of the killer features of it. You can have, like we're showing off here, a more immersive version of Headspace, right? Where it's a little more active, but also the sound can be spatialized in a way that requires an accelerometer or sensors like that.

[00:21:22.142] Kent Bye: Yeah. And Headspace is a meditation app?

[00:21:24.263] Michael Ludden: Yes, it is. I use it all the time. That's why I keep mentioning it.

[00:21:27.293] Kent Bye: So what can you do with Headspace with this spatialization?

[00:21:30.795] Michael Ludden: So, specifically for this demo, they're doing some gesture recognition. So they say, roll your head around. Oh, that's another one. I actually forgot about that. So they actually have a gesture recognizer they built that can detect when you've rolled your head all the way around. And so it just sort of preps you. So it's not really about the spatialized sound right now in this demo that we're showing today, but you could imagine, you know, you could pick your scene. Headspace has these nice things where it helps you go to sleep and it simulates a fire in like a desert night sky, right? So you hear crickets in the background and you hear the crackling of the fire. You know, if you spatialize that and you place the fire in front of you and you move your head a little bit, that'll just root you in that scene a little more, maybe pull you in and relax you a little bit faster. So that's the sort of thing I'm thinking.

[00:22:14.507] Kent Bye: Great. And so for you, what are some of the either biggest open questions that you're trying to answer or open problems that you're trying to solve?

[00:22:22.653] Michael Ludden: I think we're trying to partially solve the screen problem. I know for Bose, what we want to get at, first of all, it's a free developer platform. We're just giving away the SDKs. You just have to sign up. For us, it's that we want people who have Bose devices to be able to do things that they can't do anywhere else, things in the audio world, things that take them out of their screen. And I think that's part of the mission of why Bose AR.

[00:22:44.834] Kent Bye: Great. And finally, what do you think the ultimate potential of immersive media is and what it might be able to enable?

[00:22:52.756] Michael Ludden: Well, I'm hoping it can enable new senses. So this is not specific to Bose, but there's an example I like to give. There was a UK researcher in 2015 who created a magnetic belt that would rumble in the direction of magnetic north. Have you heard of this? You probably have. People would wear it for a while, and it would always be buzzing in that direction. Then they'd take it off and they'd have a little tingle. And it was interesting to me because people who were born sighted, had good eyesight, and wayfound using like a virtual map in their head had a better experience, and it really helped them orient themselves faster going forward, so they basically gained a sixth sense. If you were born without sight or you lost your sight, it was a little less effective, because your wayfinding is more based on landmarks, so, you know, turn left after two streets, or at the big tree, et cetera. But I think that sort of thing's fascinating, and I would love to see immersive technologies not just suck our attention and give us information, but give us abilities we couldn't have before. Like, we're kind of creating magic in a sense, right? With this new layer, this augmented layer.

[00:23:47.023] Kent Bye: Yeah, I think it goes back to 1969, this concept of sensory addition and sensory substitution. I did an interview with David Eagleman of Neosensory, where for people who are deaf, they're able to take information and, instead of it going through your ear, send it through your body. So it's rewiring information into your brain. But that's sensory substitution. Sensory addition is what you're just talking about. And there's another thing called the Northpaw, which goes around your ankle. It's a very similar concept, but it's a consumer thing that you can buy and actually do that. But yeah, there are just these capabilities of adding all these different layers of reality and being able to put them into your body. For me, my question is, what are the things that are going to actually make you feel more connected to what's happening around you? Because, you know, David Eagleman was like, I'm going to put the stock market in my body. And I'm like, no, I don't want to do that at all. I want something that's going to make me feel more connected to the world around me. And I think that's the thing that I'm really excited about: to see how you can create experiences that actually make you feel more connected to what's happening around you.

[00:24:48.680] Michael Ludden: Yeah, and that's a much more elegant way of putting what Bose AR is kind of all about, right? We want to be heads up, hands free, in the world, experiencing it around you, not having to stare at a screen and be a little bit disconnected. So yeah, absolutely. And that's cool. I'm probably going to buy a Northpaw. Yeah, that's cool.

[00:25:03.626] Kent Bye: Great. And is there anything else that's left unsaid that you'd like to say to the immersive community?

[00:25:08.312] Michael Ludden: Well, first of all, I love immersive technologies. All about XR. I would just say, yeah, give Bose AR a chance. This is a really big initiative for the company. We're really invested in it. We need great developers. We're willing to learn. We're iterating. This is the first time the company's done a big developer platform like this. We want to do it right. We think there's a lane for audio. And we really need help building that. So please check out developer.bose.com.

[00:25:30.828] Kent Bye: Awesome. Great. Thank you so much. Thanks, Kent. Yeah, thank you. So that was Michael Ludden. He's on the developer relations team at Bose, focusing on augmented reality. So I have a number of different takeaways from this interview. First of all, just the fact that there are those three sensors, the accelerometer, the gyroscope, and the magnetometer, means that you can detect what you're looking at, you can get the head pose, and you can get a bit of your movements. They can detect different gestures like push-ups and squats, a what's-up nod, a shake, a double tap, looking up, looking down, spinning around, and rolling your head around. And then eventually, using machine learning, there may be all sorts of other gestures that are detectable. Given all the subtle nuances of how the different combinations of all those sensors move as you move your body, they might be able to come up with all sorts of other gestures that we don't even know can be detected. What that's enabling is that, as people are walking around with these glasses on, they're able to use their body as a controller to interact with these different layers of immersive media that are going to be in these mirror worlds overlaid on our actual reality. The GPS is coming from the phone, and that's connected to your Bose AR Frames, but there are going to be so many other wearables out there, whether it's your Apple Watch, so you could start to stream in heart rate variability, or Nike or Under Armour shoes that have sensors in them. And so, to actually get the footsteps and be able to correlate different feedback, you could start to stream all sorts of other content into this platform. So I like to think of it as this decentralization of immersive media. There are going to be all these little incremental things sending data to your phone, which synthesizes all this information together to give you these really deep and immersive experiences. And at South by Southwest this year, we saw some of the early indications of where that's going to go. You're able to walk down the street and look in a specific direction, and they'll have a pretty good sense as to whether or not there's a restaurant there. And when you double tap, at the beginning it's just like, oh, this is the star rating this Yelp restaurant has. It's very simple, but you can imagine that eventually you're going to be able to geolocate people to very specific locations, and at that point start to overlay all sorts of deeper context with these audio soundscapes that are going to give people all sorts of different levels of immersion. And so it's this building of that world and that context that lets people move their body through space and experience these different aspects of the spatialized narratives that you're going to be able to put together. So I'm personally super excited to see where this goes, because this is just the very beginning of, I think, a lot of innovation for what's going to be possible. I'm excited to see what happens with their Radar platform, which seems to be a pretty easy way to start to take these audio files and, through kind of a web interface, pin them out into the world. And so they're going to be working one-on-one with these different companies and brands.
Just, you can imagine different big companies are going to be integrating these different aspects out in the world, very similar to something like Pokemon Go, where they were able to pick these very specific landmarks and then create a whole game around that. I think the Bose AR platform is going to be curating and working with a lot of different companies to figure out what mix of things people would possibly want to do. I don't imagine that you'd be able to opt in or opt out of specific things, because Michael said that you'd likely only have one application running at a time, whether that's something specific like Radar, or another application where you're doing walking tours or being guided around the city to see all the different hotspots. And so I can imagine that there's going to be a huge role for people curating those locations, capturing these different aspects of those stories, and becoming the docents who guide people on tours through these different locations. There's a certain amount of, when you go to a city, trying to figure out what's the vibe of the city, what's happening in the city, what's the story of the city. And I think people who are residents there are going to be able to remix and create different journeys for people to go on. And I see that being a huge application. Since I do a lot of traveling, I would love to go into these different cities, experience these different curators, and have these different immersive experiences. It could be this light way where I still have my full agency to make decisions and go see things, but maybe it would just help guide and direct me towards these different hotspots, maybe take me on a walking tour to see some of the different landmarks within the city, but also overlay different aspects of the story of that city that could be connected to these geographic locations. So it sounds like Bose is really trying to focus on this audio-first approach, to have us get out of the screen and be heads up, hands free, and in the world. And I do see that this has the capability to actually connect us more deeply to the world, because that's a challenge that I see, at least in the short term. There's something like Pokemon Go, where you are actually engaged in the world more, but at the same time, it's arguable that when people are immersed within those experiences, they could actually be more disconnected from the world around them, because they're just so immersed in the game that they're not really paying attention to what's happening around them. And I think the thing that I see about the Bose AR Frames is that it's much more about you being immersed in the world as the primary baseline, and then you just have all these different layers of context that you can start to overlay on the world, which can maybe allow you to connect to different aspects of that world. And not only will you be able to map different places and context onto your physical location, but you'd be able to do this time-travel thing where you're walking around a very specific location but pulling in information that may give you more context and story for things that happened in that physical location. So I'm super excited to see where this all goes, as somebody who's a believer in the audio medium.
Obviously, as I do a podcast, my mind just got spurred up with all these different possibilities for where this could go. So that's all that I have for today. And I just wanted to thank you for listening to the Voices of VR podcast. And this podcast is a listener-supported podcast, which means that I do rely upon the support and donations from my listeners to be able to continue to grow and sustain this type of journalistic coverage. I'm an independent journalist who relies upon the donations of my listeners to be able to support the coverage that I'm doing. If you've been a longtime listener and you've been wondering if now would be a good time to become a donor, yeah, now would be a really great time to join and to support all the work that I'm doing here on the Voices of VR podcast. So just do it, just become a member. You'll feel better about yourself, and I think you'll get rid of that guilt of saying, oh, I've been wanting to become a member for a long time. Just do it and become a member, and yeah, just support the work that I'm doing here at the Voices of VR podcast. So you can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.
