The new business models of the Metaverse have yet to be fully forged. Will the existing structures of surveillance capitalism be extended into the new frontiers of biometric & physiological data? Or will there be new emergent & disruptive models for value exchange? Seedbolt Studios' Mike Middleton finds the rules of finance and business to be like a game, which helps him make sense of the motivations of the Big Tech companies who are driving so many aspects of our digital future. He was the lead author of the white paper on Business, Finance, & Economics as a part of the IEEE Global Initiative on the Ethics of Extended Reality.
This paper is split into two parts, with the first section covering Business and Industry Models, Governance and Structures, with subsections on ways to shape and influence industry behavior via Business Governance and Operations, Customer Actions, and Consumer Advocacy Groups. The second part covers the shifting dynamics of the Economy, Financial Services, Banking, & Cryptocurrency. The paper spans an impressive amount of ground across these different areas, and helped me better understand other vectors of influence for shaping the values of major tech companies, such as the successful Environmental, Social, Governance (ESG) movement in the Financial Services industry.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
https://twitter.com/kentbye/status/1533861601681580033
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to The Voices of VR Podcast. So, continuing on in my series on XR ethics in collaboration with the IEEE Global Initiative on the Ethics of Extended Reality, today's episode is around business, finance, and economics, looking at all the different frontier business models and the big tech companies and the ways that we'll be exchanging value in the metaverse, whether that's through cryptocurrency or through virtual currencies, but also all the different rules and regulations around anti-money laundering, and lots of different implications here. We're mostly focusing first on the big tech companies and the businesses, in terms of how they're interfacing with XR as an industry, and what are the different things that you can do from within those companies to be able to put forth a set of ethical principles, but also the boards of directors and different advocacy groups and investment firms, lots of different vectors to be able to bring about change, to move things in a certain direction. The second half is around the economy, financial services, banking, and cryptocurrency: all the ways in which these financial infrastructures are potentially being recreated or supplanted or undermined, what this emerging field is and how to make sense of it, and all the different risks. But also just banking in general. To be able to actually engage with these technologies, you need to have established relationships with a credit card and everything else, and so as we move forward into the future of these computing technologies, are there going to be ways to have access to these technologies that don't require being tied to a specific financial account? So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Mike happened on Thursday, April 7th, 2022. So with that, let's go ahead and dive right in.
[00:01:50.703] Michael Middleton: So my name is Mike Middleton, and I'm the managing director of Seedbolt Studios. And we are a company that does things both in the traditional video game world and in the broader interactive experience and activation world. So, you know, your classic, whether it's AR activations or VR. We've ranged between consulting for, you know, large corporations that everyone knows are in this space, or smaller corporations that are trying to get in, or the really fun collaborations where you're working with a small team that's doing something interesting. We've been around for five years, and it came about because a gentleman that used to work for me at another company was already working in the space and reached out, and we formed a group, and so far so good.
[00:02:36.509] Kent Bye: Great. And maybe you could give me a bit more context as to your background and your journey into XR.
[00:02:41.295] Michael Middleton: Yeah, sure. So I got out of school right in the early internet bubble and ended up in market research for a while, and learned through that process that I did not enjoy market research, and through various adventures ended up in financial services at an early fintech firm that's still around. In fact, I'm actually back there. I was there for 13 years, left, and then came back. It does quantitative analysis and reporting, software, does some really interesting research, and has just been out there for a long time helping companies. But over the years, I became personally interested in the broader XR space through entertainment. I mean, let's be honest: entertainment, science fiction, games, everything from seeing applications of it in movies like Jurassic Park that now look pretty basic, to even just the certain philosophical questions that arise as you become a user of new experiences. So, I mean, I was thinking just the other day about the early Windows screensavers, where I'd sit there and watch a maze that looked like it was from an 8-bit game. I'd sit there and watch myself go through a maze I wasn't even controlling. It was just a screensaver, right? Just those new moments. And I think for me, one of the tipping points was that I was born with some hearing problems. I have no real balance. And so as a kid, my father was, among other things, an outdoorsman and a writer, and I spent a lot of time on boats getting seasick. And so I started to learn that some of the most interesting technology that has come along in my lifetime also causes people to get seasick. And I was thinking about that and wondering what the technological challenges for that are going to be. And so that was my personal investment. And so through just the meandering process that comes along with developing technology.
It just always fascinated me, the idea of, you know, being immersed, and the idea of having that level of interactive capability with data, which I'm a big fan of, and, you know, finding out the answers to far more rich questions. And there are lots of areas that, from a hobbyist point of view, I enjoy. For example, space matters, for lack of a better word. And, you know, when you follow what's gonna be possible with the data we're getting back and with our ability to analyze that data now, it all just comes together, I think, into a number of points of fascination for me.
[00:04:54.392] Kent Bye: Yeah, I think, you know, the data that we're getting from XR technologies is going to enable all sorts of amazing new applications, but the contextual integrity of that data, and where it goes, and who's using it, and how it may be used against us, I think is a big issue that comes up pretty consistently across all these different IEEE XR ethics papers. There's one specifically on privacy that does a deep dive just on the privacy aspect. And then, you know, from my perspective, as I looked at this issue, I identified that the underlying economic business models of a lot of these companies, and the larger economic context of everything, have to also be a part of the conversation, because if we want to get rid of something like surveillance capitalism, there needs to be something else there that is going to be able to sustain what already exists within the open web and the internet across all of its corners, but also in the future of the metaverse, whatever that is going to potentially look like. And so given the opportunity to get involved with this IEEE XR ethics initiative, what was the catalyst for you to take this on, to start to dive into the different dimensions of business, finance, and economics?
[00:05:58.322] Michael Middleton: That's a good question. Years ago, in my ongoing effort to become an even bigger geek than I already was, I fell in love with financial compliance. I came to it, I think, too late in my career to focus on it. But I found it fascinating for the same reason that, if you're a sports fan, you find knowing the rules of sports fascinating: because on the one hand, it gives you an understanding of the system and its behavior, and it also gives you an understanding of how people might adjust that system or try to work with it, you know, to accomplish their needs. And I found it fascinating especially because, when I started working in financial services, I had no shortage of friends that would say to me, you know, you work in the most soulless industry there is. And, you know, over the years, I've tried to talk to them and say, listen, first of all, it may be that way, but it's not alone. And second of all, I've always been fascinated by human behavior. And one of the things I say to my friends is, I'll say to 100 people, do you trust Facebook? And they'll all say no. And I'll say, well, you still use Facebook, so you're acting as if you trust them. And in some ways, participation in the financial services system, participation in the business model, or tacit approval of the business models of companies that we use, has always sort of been a part of the system. I think right now, as XR is growing, and as those privacy dangers are already present and growing at the same time, there are so many industries that are trying to sort of reckon with this ethical question, and who knows what's a little too late and what's not.
But I think that there's a certain responsibility to get involved and to say, you know, do I want to live in a world where the people that I didn't trust, but at least I knew were down at the bank, now can read my mind or see what I'm doing, or can watch me play a game and make assumptions about me, about other areas of my life, based on how I play that game, that are going to affect my ability to get a loan? It's pretty fascinating. I think in a way, you know, obviously this group has all sorts of selection bias already; it was people that were interested in it. It's amazing to me how many people aren't particularly interested, you know.
[00:08:00.327] Kent Bye: Yeah, I think the way that you articulate that really reflects the importance of this as a topic. And even though FinTech and the economics and the finances, you know, may not be everybody's favorite topic to dive into, it actually is driving so much of everything that we're experiencing. So as you start to break down this as an issue, I noticed that you broke it down into two major chapters: first, the business and industry models and governance and structures, kind of the relationship between the industries and the users. And then you have the broader economy, financial services, banking, and cryptocurrencies, so the broader economic models of exchange, fintech investment, and all these other dimensions of the financial services industry. So maybe let's first dive into the introduction and you setting the broader context of these two major different areas, and then we'll start to dive into it.
[00:08:50.098] Michael Middleton: Sure. And let me make a quick note: as you know, this has been a volunteer-driven effort. And so one of the reasons that this is one chapter right now is that the initial scoping sort of lumped all these things together. I think as this process evolves, business and industry models, governance and structures is really a separate paper from economy, financial services, banking, cryptocurrency, all that kind of stuff. I think it'll play out that way. In this case, it was in one place and we happened to have a group that could tackle it, but I think it emerges that way. But there are certain commonalities right now. So, yeah, I think we can do that. So in terms of the introduction, you know, I used in the introduction one of the most basic hypothetical examples I've ever seen that can really reach people. It was based on something that the World Economic Forum published: this hypothetical account of a kid named Riley playing a VR game, and of data being captured about him while he played the game. And then years later, he wasn't able to get insurance of some kind because they detected that his movements were consistent with, let's say, early onset Parkinson's or some other situation like that. Right? It's a very easy-to-imagine situation, because it plays into three specific areas of overlap that I think people understand are problems, but they don't know what to do about them. I think people now understand that they're giving away more data than, left to their own devices, they would like to. I think they're also starting to realize they're giving it away for free or for such a nominal value, and it's an incredibly valuable thing. So I think that understanding is starting to slowly get baked into, you know, everyday people. At the same time, I think people know that there are companies out there that make decisions based on your data that aren't necessarily going to be in your favor.
And so when you add those two things together, and then you add this idea that that data can be shared with other companies in a way where you don't know who it's being shared with, that's become a common theme now. So this particular example, I think, is enough to haunt people into saying, wait a minute, what am I doing? Because no one wants to put on a VR headset at the mall (if malls exist in five years) and suddenly find out that they're not eligible for insurance years later. But, you know, as I say, the Internet's forever. And so that is one of the driving points I make, because, you know, I've said to so many people: when you were playing Pokemon Go, however many years ago that was, five or six years, did you think about the data that you were handing over to this application? Where you were going, how fast you could do it, who you were with, your work schedule, your regular schedule, all this information. It just isn't evident to us. And I think that as XR is able to more closely approximate either real life or, far worse, the life we wish we had, it will become less evident to us, not more evident, what we're giving away, and less evident to us that there are now AI models out there that can interpret what we're giving away, because they have 5 billion test subjects to work with. I mean, even if it's just unstructured learning that's just grabbing information and using it, and all the black boxes out there that are doing God knows what, I mean, there's a lot of mayhem that's going on. So we sort of started there because we wanted to make sure that we set the problem, and we made it clear that businesses are going to profit, and therefore have an ethical responsibility to care, even if they're not necessarily in that industry itself. And in fact, I have to reach out, because the current IEEE abstract for the paper says it's XR business models, but it's not. It's actually business models for companies that are interacting with XR technologies.
And there's obviously a big difference there, because sooner or later, everyone's going to be involved in some way. And, you know, right now in the financial services industry, and other industries as well, there's this push for ESG, for this level of consciousness and action on environmental, social, and governance factors. And that's such a broad term it can mean anything, and that's one of the big criticisms of it. But you have corporate boards right now trying to figure out how do we bake in premises that, A, look good to people, B, hold up and have some actual meaning, and C, can affect our bottom line. That's often what gets people moving: when you say, oh, it's been shown companies do better when they also follow these principles. So we did try to set the problem there as an initial premise. And again, we had a lot of space constraints, but that was the idea behind our opening.
[00:13:05.279] Kent Bye: Yeah, I just wanted to provide a bit of context: that article from the World Economic Forum was actually a result of the VR Privacy Summit that I helped organize with Jessica Outlaw, Jeremy Bailenson, and Philip Rosedale in 2018 at Stanford. And one of the big outcomes from that gathering was that we probably should have some sort of institutional review board, just like the medical field has these oversight boards, just to be able to see if everything's in line and there are no transgressions of ethics. But also, when you are doing research in the context of a college, then you have some sort of review board so that you're living up to some sort of ethical standards. So having some sort of institutional review board for privacy to determine how data is flowing, and if it's being used within some sort of standard of contextual integrity, relative to the intent for how it was described that it was being gathered and how it's actually being used. And so that blog post by Jessica Outlaw and Susan Persky was one of the outputs from that VR Privacy Summit, which, you know, has continued to be an ongoing issue in terms of trying to figure out the broader approaches towards ethics and privacy, and that has sent me personally on a whole journey. But I just wanted to add that little additional context.
[00:14:14.545] Michael Middleton: No, that's important. And I knew that, and I think I'd forgotten about it a little bit, because you lose track of how these things flow over time. Obviously, you planted a good seed there, the root there, because it really is a very articulate exploration of a very simple but scary problem. And one of the reasons I think the privacy push is so important for anyone that cares about this topic is because there are certain types of laws that you can put in place that affect all businesses and allow you to get ahead of technology in ways that niche legislation simply can't. In the financial industry, for obvious reasons, regulation always follows bad behavior, because legislation and regulation are slow processes. That's one of the reasons why in many countries, and even in various multi-country networks, there's nothing you can do to make yourself exempt, for example, from fraud or anti-money laundering or terrorist financing rules, right? There's no magic ceiling you can pass where you say, oh, I've invented this new application of blockchain and it's not covered by this. Those laws are very broad for a reason. And a push from people about privacy may be one of the ways that people that really do care about the ethics can get ahead of the increasingly niche-specific nature of lawmaking in a lot of countries, including the US, where you have some law that's specifically for medical device makers that really breaks things down, but doesn't apply to other industries. So I think that that is one of the great battles to care about fighting, in my opinion.
[00:15:48.950] Kent Bye: Yeah. And before we start to dive into some of the recommendations in the meat of the paper, I wanted to go back to the introduction. There's a very interesting little segment in a footnote that I noted, where you talk about business models, and you say the main function from the company perspective, as described by Novacola et al., is creating and capturing value. Specifically, three questions are asked: Who is your customer? What does your customer value? And how do you make money in this business? That seems to be a pretty core principle that I think is maybe worth calling out when we talk about this, in terms of both what value is being captured, who the customers are and what the customers value, and then how the business is actually making money.
[00:16:27.128] Michael Middleton: Yeah. And a quick note: that section was a contribution from Samira, who was my rock during this process. Even though I wrote the bulk of the paper, she really brought in the idea to focus on what a business model is and what it really covers, because I think it was important to understand that that's obviously the academic definition of the business model. And Lord knows the term business model has got so many different approaches right now, but those questions of who is your customer, what do they value, these are very serious questions. And when you think about major corporations, it's really fascinating to look through how a large corporation defines itself, how it defines who we are and how we make money and how we operate. And I'm talking about for-profit corporations here, because you learn a lot about the company. You learn a lot about what they think. And sometimes it's nonsense and you end up reading a bunch of marketing lingo. But other companies are pretty straightforward to the point of being boring, specifically financial services firms. If you open up their disclosures, for example, to the SEC, and you read their sections on who are you and what do you do, they really spell it out and they really say, here's what we do and here's how we make money. And from those basics, you can see that, absent industry pressure and a general sense of morality, it's not necessarily always going to be their responsibility to worry about their customer outside of the services they provide. And as XR brings us more closely together and we're all stuck in the metaverse together, those overlaps are going to be a lot more meaningful. And if you've got 10 companies providing 10 niche services that you feel are one experience, and none of them are concerned about your welfare beyond their service, then you could end up with a really nice-looking life jacket that doesn't float. For lack of a better metaphor.
[00:18:14.773] Kent Bye: Yeah, I think as I was reading through your paper, I was thinking about who might be reading this and all these different recommendations. And when I take a step back and look at the XR industry and the dynamics of big tech, there's really like a handful of really huge technology companies, and whatever they decide to do with their business models is actually going to drive so much of the other parts of the industry. And with Meta being so based upon having an existing model of surveillance capitalism, and then expanding that out into XR, it feels like, at least at this point, they have an app store model, but they've kind of already signaled that they're not really all that interested in maintaining that, that they really want to move to an ad-driven model. But to really get to the scale that they need to be at in order to really have that be a viable model, then they have to make it accessible. So whenever I hear Meta talk about ethics, the number one thing that they'll always emphasize is that they're trying to make the technology accessible, because they're subsidizing the technology that is $299 for a Quest. It costs a lot more for them to make that. We don't know exactly what it is, but they're selling it at a loss because they're trying to build up the ecosystem and eventually have more money that comes from these first-party apps and different aspects of being able to target ads to people. So because of that, when you look at the different business models, there is a bit of an ethical discussion in terms of making the technology cheap enough and accessible enough for people to be able to have access to and to use, versus to what degree we are mortgaging our privacy into these invisible harms of the types of biometric and physiological data that are going to be made available, and then making the decision to sign an adhesion contract to give that away.
But my thing is, we don't know what the unintended consequences or harms of that might be, to kind of sleepwalk into a world where all of this data is being made available and profiling us in different ways. So when I read the first recommendations that you have, that's what I'm thinking about. I'm thinking about the biggest players that are out there and what they're thinking about in terms of what would be the best way to approach this, but also the trade-offs of making the technology more accessible for people through this business model that is capturing data and using that to subsidize everything.
[00:20:34.188] Michael Middleton: No, it's a good question. And it's funny, because we definitely thought about that. And in fact, it drove specifically one of the recommendations midway through the paper. Our 11th recommendation specifically was that a major XR technology and experience provider with first-mover advantage should publicly take a position of transparent and ethical XR standards to create public education and set a high bar for future entrants. And that really is our way of saying: we have seen firsthand what first-mover advantage means for platforms in the last 20 years when it comes to the internet, right? And we've seen how many people would say, oh, I am not going to leave Facebook because I use it for Marketplace, or I talk to someone on it, or, you know, in the metaverse, I'm not really going to get rid of applications because I want to be connected to people. Right? When connection is the most important thing, then you're going to use the platform everybody else is on. You're going to use the services everyone else has. And if we've learned something, it's that people will sleepwalk, or in some cases walk with their eyes open. You know, I have an aunt who used to say, I have nothing to hide, so what do I have to worry about? Right? And I'm like, well, you do have something to hide. You have to hide the fact that your hand's going to shake when you use the controller, and they're going to say, oh, you have early onset Parkinson's, and everything that you do is going to be affected by that. Right? So I think that what you outline there is why this is such an important topic right now, because we have an opportunity right now to educate the public and educate companies and to put pressure on. And I don't know that that opportunity is going to be a recurring opportunity once you have widespread platform adoption.
[00:22:13.166] Kent Bye: Yeah. And in response to your number 11, I'd say that the different types of ethical frameworks that Meta has been putting forth are almost there to justify their already existing behaviors, rather than to be a standard that everybody would be able to follow, because they're taking the rhetoric of responsible innovation but paring it down to be very simple, almost ethics washing, as a way to show the public that they're thinking about it, but not actually being a viable ethical framework that can really navigate the nuances of a lot of the complicated issues that we're navigating here. And Applin and Flick detailed that in a paper that they did, where they made the same assertion of ethics washing, comparing the responsible innovation principles of Meta with some of the best practices of the academic approach to responsible innovation, which aren't necessarily represented in Meta's public representation of what their ethical framework is.
[00:23:04.240] Michael Middleton: Yeah, no. And, you know, it's a very good point. And I'm glad other large players are making it, because they're also, of course, inviting the same scrutiny of themselves; whether or not that'll stop them is another matter entirely, right? But, you know, unfortunately, that's the nature of ethics washing right now. To switch industries for a second: you know, I recently started staying in hotels again after a long time not being in one. And all the hotels I'm staying in say, in order to protect the environment, we have reduced our housekeeping. We won't do housekeeping during the day because it saves this much detergent, whatever, and so on. Now, you and I both know that this is a major financial saver for them. It allows them to eliminate staff. It allows them to eliminate the cost of all the bottles being put in. It allows them to turn rooms over quickly. But of course, those weren't the bullet points they put in place. What they did was they put in place an ethics-first presentation of why they were doing something that they were going to do anyway. And, you know, it certainly resonates when you see certain corporations taking that approach.
[00:24:06.056] Kent Bye: Well, then I guess I think it's worth unpacking these first four recommendations because it is all about what are some of the different best practices and stakeholders that we would like to see with the caveat that some of the biggest players may look at this and shrug it off and do what they're going to do anyway. But in terms of some of these first principles, maybe we could just walk through these first four.
[00:24:24.980] Michael Middleton: Absolutely. And so our first recommendation was made with the very same argument that is being used right now, which is accessibility. And it was specifically that stakeholders of all types should study how to make available free, publicly available XR tools and spaces that do not require identification or collect personal data. These would be the equivalent of public parks. You know, you can walk into a public park; you don't have to show papers and say, I live in this town, right? And we did that to really encourage the idea that the ability to participate in the benefits of the XR world should have structures in place that don't require you to make that trade-off.
[00:25:06.553] Kent Bye: So yeah, it reminds me of public broadcasting: there are cable access channels and different ways in which these broadcast media were required to invest some money into local communities for them to be able to produce their own television shows. And the whole idea of cable access, with the advent of the internet, doesn't quite exist like it did when the government was trying to create these public park spaces. So I guess, as I read this and hear that broader context, I think about the ways in which these companies could create more public access points for these different technologies, whether it's at libraries or through the cable access entities, which, because of the decline of cable over the years, have lost a lot of their funding. And so these community spaces around the country don't have a lot of funding or resources. And so that's one thing that comes to mind when I read this first one.
[00:25:56.115] Michael Middleton: Well, absolutely. And also, let's be honest, there are communities in this country that have not had access early on to computers. to the internet, to the type of training that you need to get really good high paying jobs that you don't need a degree for now, things that would really benefit the community. And we're in a situation where if you need to buy a $300 headset and you have to have also a space to use it in and a computer and all the other stuff, you're going to be priced out if you're a member of so many communities. And so our first point was to make this everybody's responsibility. that if we're going to build this new world, there has to be a way to participate in it, where you have certain fundamental levels of access, you have certain rights, and you don't have to give any company, you know, I don't want to pick on any one company, every company has got their ups and downs, any company data that you don't want to give away, you don't have to give away the farm, right? And on that point, the second recommendation that we sort of flowed into was that stakeholders should define and defend these experiences and that grants should actually be made available to those seeking to provide these experiences for the public good. And we decided to merge this. First of all, we had a lot of recommendations in the paper, but also there has to be an understanding that as much as companies need to be well behaved, The XR experience and the XR privacy issues are stakeholder problems, and every stakeholder involved has a problem, and users are stakeholders. And so me or someone else that I know needs to care. I need to care about how to act in this environment, and I need to care about my data being at risk, and I need to care about my neighbor's data being at risk. 
And so we flowed that into this idea of public parks again to reinforce this idea and to sort of say grants should be tied to it because we felt it was a useful structure to get people to understand that there are past ways we've tackled this problem. We have made available funds for art to be accessible for parks, for swimming pools, for recreation spaces or places kids could go after school. All these different types of infrastructure. We have models we can use. And although we're entering a new world, it doesn't have to be a huge leap of imagination to sort of see what kind of problems we're going to need to solve in this new space.
[00:28:14.429] Kent Bye: Yeah. And the cable access is certainly one of those as well, but yeah, those are all good points. And yeah, maybe we could go to the next ones.
[00:28:21.544] Michael Middleton: Sure. So the third one had to do with stakeholders promoting technology standards that would facilitate the widest possible participation and transparency in XR. And so that really comes down to the importance of what I think the IEEE is doing, which is, you know, let's really find standards, let's find protocols, let's find platforms that make this as widely available and as transparent as possible. So if there is a platform available that you can get more services out of and use more tools and applications from, that requires you to hand over less data, that is inherently a better platform from this perspective, in our opinion. And as we all know from any example, the best platform doesn't always work. It doesn't always win. The best protocol doesn't always win. And so we really want to put that up front as one of our main areas. And our fourth recommendation was that these and any other broad-scale programs should employ differential privacy or a similarly protective stance to ensure the privacy rights of those who participate. So we really wanted to call for a higher level of interest in privacy specifically, making sure that for those who are participating, there aren't just basic safeguards, that they're really trying to think at the forefront of privacy and where it's going, how to protect privacy from, you know, the kind of models that are being developed now.
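For readers unfamiliar with the term, "differential privacy" has a precise technical meaning beyond a general protective stance. As a hedged illustration only, not anything from the white paper or any vendor's actual implementation, a simple counting query over user data can be released with Laplace noise calibrated to the query's sensitivity and a privacy budget epsilon:

```python
import math
import random

def dp_count(values, predicate, epsilon, rng):
    """Differentially private count of how many entries satisfy `predicate`.

    A counting query has sensitivity 1 (adding or removing one person
    changes the true count by at most 1), so Laplace noise with scale
    1/epsilon gives epsilon-differential privacy. Illustrative sketch only;
    a real deployment should use a vetted library.
    """
    true_count = sum(1 for v in values if predicate(v))
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) via the inverse-CDF method.
    u = rng.random() - 0.5  # uniform in [-0.5, 0.5)
    noise = -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
    return true_count + noise

# Hypothetical example: publish how many users of a public XR program
# are adults, without exposing any individual's record.
rng = random.Random(42)
ages = [15, 22, 34, 17, 41, 19]
noisy_adults = dp_count(ages, lambda a: a >= 18, epsilon=0.5, rng=rng)
```

A smaller epsilon means more noise and stronger privacy; aggregate usage statistics that a publicly funded XR program might publish would be one natural place for this kind of mechanism.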
[00:29:35.993] Kent Bye: I really like these ideas. I guess one question I have is, do you feel like it would be the responsibility of the companies making it? Or do you feel like there should be taxpayer money dedicated towards creating these different places? Not only is a lot of this based in the US, but this is really a global or international issue. And the caveat would be that not collecting data is in contradiction to the entire philosophy of a lot of companies who want to tie every use of every technology to a user account that could then be tied to the data, but also, for other reasons of safety and security, participation in some of these different worlds has to be tied to a user account and a headset. So some of this goes into direct contradiction with some of the fundamental ideas of those companies. So I'm just wondering, on funding it, if you think this should be something where the companies create a little carve-out for these different public good use cases, or, if it's paid by taxpayer dollars and from the government, whether there's the equivalent of the XR for Business caveats, with different terms of service that have different uses for this technology.
[00:30:41.395] Michael Middleton: Well, I think it should be both. I think that everyone acting on behalf of the public good has a vested interest in trying to make sure that these programs get attention, get funding. And as for waiting for corporations to do this by themselves, we've sort of seen what's going to happen there, because we have pretty similar models developing to how other elements of the Internet have played out. So I think it's got to start with the everyday user level. But I think corporations should be motivated by entities that are positioned to offer grants, to offer tax breaks, et cetera, to participate in a way that is to the benefit of the taxpayer, of an everyday citizen that's using these features. Why reward a company that's going to come in and get a big break to build their headquarters and then is going to take data from the people that live in that town and make their services more expensive? It just doesn't make any sense, right? So I think both those need to exist, and I think that everyday behavior matters. And one of the benefits of government, and how strong a hand government should have in things is a whole separate matter, but one of the benefits is when it acts on behalf of the good interests of everyday people to, for example, protect them from money laundering or fraud, right? That's an area where it makes sense for everyone to band together and to perform that function. So I think that's one of the reasons why we really kept those first four recommendations as being out of the hands of companies specifically and being in the hands of anyone who's got a stake in this as a shared ethical responsibility.
[00:32:17.612] Kent Bye: Okay, yeah, that makes sense. And I think that helps set initial context. And as we move into the next section where you start to talk about industry behavior, you're talking about self-regulating organizations that are kind of like the industry trade groups or the lobbyist groups that are designed in a lot of ways to allow unfettered innovation to happen without the government stepping in. It reminds me of the Collingridge dilemma, where you can have a lot of regulation at the very beginning, but you don't want to stifle innovation, and so you take a step back. And then all of a sudden it's too late to do any meaningful regulation because it's already diffused out and you have to change the behaviors of too many people, and there's not going to be as much political will once it's been adopted past a certain point. So there's this kind of middle point where it's evolved and matured enough to do some regulation, but not so much as to be out of control. And so because of that, there's a lot of industry trade groups that are pressuring governments to not do any regulation at all, but yet there's not the countervailing force to come in at some point and have some type of regulation. And so I feel like these self-regulating bodies help facilitate this technology pacing gap, meaning that they're just blazing forth at an exponential rate, five to 10 to 20 years ahead of where the regulatory bodies are. But yet you have some recommendations within that context to say how some of these different organizations should be behaving.
[00:33:41.423] Michael Middleton: Yeah, no, totally. And, you know, self-regulatory organizations also give an industry a way to express their views, their views of what they think they should be asked to do. It gives them a chance to unite together, not make it one member of the industry only. They can all say, hey, we all did this right. So there is tremendous benefit. They're sort of like those buses with the accordion element in the middle that allows them to flex and turn corners and to behave in a way where the industry can compete and participate in a very capitalist market, but also have some sort of brakes in place, in part to keep government from interfering too much, right? And that's one of the reasons these are so popular now. As to whether they're always effective? No, of course not. But they do give that important breathing apparatus between the industry and the government. And so I think that placing responsibility on those groups to really define and make a statement: what does it mean to our self-regulatory organization that privacy in XR is a big problem? What does it mean for us as a responsibility? Do we want to put a time window on when we're going to issue a statement? Right. Even those basic questions force the conversation to start happening. So when we start talking about companies, we decided to start at that top layer to say, you know, if an industry wants to be able to act in a pretty unchecked way, that's fine, but tell us what your thesis is.
[00:35:08.198] Kent Bye: Yeah, so maybe we could go through the different recommendations that you have in this section.
[00:35:13.239] Michael Middleton: Yeah, so that was its own recommendation, but then we sort of stressed that guidelines and interpretive statements should start being issued by regulators and lawmaking bodies, global cooperatives, to sort of guide this process, right? So often there's a lack of, whether it's someone saying, here's a model legislation we'd be interested in, or a model standard. Even this kind of basic document that doesn't have a lot of teeth but sort of sets a starting element can become surprisingly foundational and give structure to companies as they're entering the space, where they're saying, oh, you know what? Everyone's thinking about this. This might be a thing that's on lawmakers' path in the future. At the same time, we recommended, it was our seventh recommendation, that governments, foundations, and other various tax-levying and grant-making bodies should integrate XR ethical standards into requirements for potential tax breaks and bonds, et cetera, and that certain privacy standards include differential privacy. This is sort of a repeat of one of our earlier recommendations, but we decided to stress it again because we wanted to make it very clear that there needs to be a link between access to public capital and some level of standard, and that the self-regulatory organizations, in my opinion, if they're smart, find ways to issue standards that they think should be applicable there and encourage governments and all these other organizations along and say, hey, listen, here's the standard you should ask for, because at least everyone knows what the rules are and there's something in place. Right. So that was sort of a general admonition on the broader question of industry. And then we really went into what we call governance, sort of board-of-directors-level operations, specifically passing resolutions on what they think should happen. Right. And now a board of directors resolution can mean anything.
How many greenwashing or ethics-washing resolutions have come out of companies in the last few years? It's really quite significant. But it still gives you something to hang your hat on to say, hey, listen, you know, Company X, you said that you were resolved to adopt standards for XR privacy. You said that three years ago. What have you done? Right. What can you point to? It does leave a trail for incremental change and for incremental accountability. And to reinforce that, our ninth recommendation was that stockholders and limited partners, other company owners outside of that governance structure, should push for that same action. You know, a public company does have mechanisms for pressure. It's got stockholder meetings. And, you know, if you have a partnership where you've got limited partners, you can talk to the general partner and you can say, you know, it's our desire that the company that we co-own or that we're affiliated with is taking this action, what are you doing about it? And that's where a lot of the pressure in the financial services industry has come with ESG. It's come from people saying, hey, we're part of this company. What is, just to pick on one company, Fidelity doing to show that they're at least taking the problem seriously? And so often, I think, when a company shows they're at least taking the problem seriously, people will be willing to put up with a lot of different conclusions and a lot of different outcomes, but they want to know that it was treated well. And as part of that, sort of as a trickle-down, we also recommended that businesses should integrate those same guidelines into their vendor selection process.
The vendor and supplier due diligence process is such a part of the oxygen of how companies make money from each other that if you're holding your vendors to have some sort of a position on XR privacy, on the use, collection, and disclosure of XR data, on security for your users, it's going to bleed into the process pretty quickly. Because if you need to have something in place along those lines to get business from a Fortune 500 company, you're going to have to do it to stay in business.
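As a rough sketch of how such a gate can operate in practice (all names and criteria here are hypothetical, invented for illustration, not taken from the white paper), a due diligence questionnaire reduces to a set of attestations a vendor must satisfy to stay on the candidate list:

```python
from dataclasses import dataclass

@dataclass
class VendorAttestation:
    # Hypothetical yes/no attestations an XR vendor might be asked for
    # during due diligence; real questionnaires would be far more detailed.
    name: str
    has_xr_privacy_policy: bool
    discloses_xr_data_collection: bool
    has_user_security_standards: bool

def passes_xr_screen(vendor: VendorAttestation) -> bool:
    """Gate: every attestation must be affirmative to stay in the running."""
    return (vendor.has_xr_privacy_policy
            and vendor.discloses_xr_data_collection
            and vendor.has_user_security_standards)

def shortlist(vendors):
    """Filter a candidate list down to vendors that pass the screen."""
    return [v.name for v in vendors if passes_xr_screen(v)]
```

The point Middleton makes is exactly this mechanical quality: once a yes/no screen like this sits in a Fortune 500 procurement pipeline, vendors adapt their behavior to be able to answer yes.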
[00:38:57.397] Kent Bye: Yeah, as I hear you talk about all these initiatives and efforts from these self-regulating organizations, the thing that comes to mind is the XR Association, which is an industry trade group that represents Meta, Microsoft, Sony, and HTC, as well as a number of other immersive companies in the XR space. And a lot of the work that they've been doing has been on different issues around accessibility and access, making sure that there's a baseline in terms of these experiences meeting a certain experiential quality and being as accessible as possible. So at least some of the early movement that I've seen already starting to happen is around accessibility concerns, less so around fundamental economic business models around surveillance capitalism. You won't hear much talk about that at all, especially because the XR Association represents a wide range of companies beyond just Meta, companies that employ different business models altogether. So they don't tend to make strong stances there. And so if there's anything that's going to come on that front, I feel like it would need to be coming from the government. But at least some of the early movements that I've seen, even Meta, in their process of accepting experiences or not, they have certain guidelines for accessibility and certainly around harassment and trolling as well, in terms of their own internal systems that they're building to be able to maintain some of these different...
[00:40:17.875] Michael Middleton: ...and ask questions about what's age appropriate and so on. Yeah, they've identified some major areas that we know we have to do something about. And it really is the classic case: we know it's coming, we know it's being pushed very hard by some parties, and it really will be a question of what gets decided, what the standards are that become, you know, are we going to live in a world... You know, I'm old enough to remember there was a time when there was no such thing as PG-13. It was a pretty broad brush. In fact, I was talking to someone earlier today, and the movie Poltergeist was rated PG. That's not a PG movie, in my opinion, or at least maybe it wasn't at the time. Nowadays, movies can get a lot more intense, right? But it could be that as our experiences in the metaverse change and as we realize how it's affecting people, there will come broad social pressure to make changes, in things that we all can't anticipate right now. But it is nice to see the XR Association giving some of that structure, at least, and defining some of those broad categories, for sure.
[00:41:12.630] Kent Bye: Yeah, the ESRB is what does a lot of the ratings, and they certainly have that within the store, within Meta. The caveat that I would give is that sometimes there are applications that are really quite large, like VRChat, that have things that are 18-plus type of environments. And there's not a really good way at this point, in the absence of age verification, to be able to control that once you're on the app. Even Meta's Horizon Worlds is rated by the ESRB as 13-plus, but they have another policy that you have to be older than 18. So they have no way of actually enforcing that unless they are looking at the user age that's put in there. So yeah, the age appropriateness and the rating systems and getting that all in line, a lot of that has been in that realm of self-regulating, but it needs some updates in terms of really getting up to speed with all the different various concerns and the moderation and trolling and harassment issues. I think there's probably another area here where there are certainly some technological things that each of the companies can do. And that would be something that may be better in the hands of the companies rather than the government coming in and mandating specific protocols for how this technological architecture is playing out. So...
[00:42:23.973] Michael Middleton: Yeah, no, and I think that that kind of broad category, let's be honest, that's a major issue. It's an issue that has not yet been solved well for computer applications outside of XR, you know, cyberbullying. I mean, you can make an application suitable for kids, but the moment you let people talk to each other, a kid can say something that's not suitable for the kids, right? So this is an evolving area of human experience that only so much control can be put on. It's not going to have a perfect record in dealing with, you know, personal decisions or random moments or bad actors. And there has to be some recognition of that, you know.
[00:42:59.061] Kent Bye: Unfortunately, there's a whole paper on the issue of harassment and trolling, so I'll be digging into that specific issue later. But I just want one last pass over the business governance and operations, where we're talking about the different recommendations for not only stockholders but the board of directors, and then businesses generally putting forth ethical guidelines to be able to do aspects of self-regulation, but more through the mechanisms internally, through the stockholders and the board of directors, and other ways in which they're trying to look at some of these different issues. There may be opportunities for pressure to be put onto the actual behaviors of these companies through these economic mechanisms of the shareholders that are asking for things. Is there anything else that you wanted to point out in terms of recommendations eight through 10?
[00:43:44.695] Michael Middleton: Well, the thing is, I'd really reinforce that one of the reasons we did the 10th recommendation, which was that one about putting guidelines or requirements into the vendor due diligence area, is that it's very easy for recommendations for this kind of a project to be very general and to be the start of a conversation. But one of the things that really has driven company behavior time and time again, quickly, has been this process. I mean, we've had companies reach out to us and say, we're doing our annual risk assessment. As risk has grown in its depth and its vocabulary within businesses, how many companies have reached out and said to their software companies, you know: do you have programmers in Russia? Do you have programmers in Ukraine? It's a question we've received. It's a question a lot of my colleagues across the industry have received. It's a very quick way to screen. And it's a way for a company to also protect itself by saying, we asked our vendors, and they said they didn't. We asked our vendors if they abused XR privacy, and they said they had standards in place. Right. So in the litigious world that we're in, it really is a very practical focus point that we wanted to bring special attention to, because it's a quick, easy way to try to get a natural flow of behavior change.
[00:44:59.802] Kent Bye: Nice. Well, I think that covers those sections. Maybe we'll move on to customers and different aspects of consumer data privacy and ESG, the environmental, social, and governance initiatives, as well as the role of consumer advocacy groups.
[00:45:14.432] Michael Middleton: Yeah, so our customer section was a little bit of a catch-all, because on the one hand, we wanted to emphasize at the start of the paper that every living person that uses XR technology has a responsibility to try to think about these questions, though we know it's not all going to happen. But we wanted to come back to the role of a customer specifically and to talk about a few examples of that. You know, what we say in the paper specifically, you know, the General Data Protection Regulation in the EU is a pretty significant expression of law and is the kind of thing that, you know, happens, for example, in the EU faster than it happens in some other parts of the world. Right. We're going to see, I believe, a continued reaction to XR expressing itself through currently existing legislative and political and decision-making bodies. And they're going to express themselves in their areas of expertise, in their domain. And one of the challenges we've already seen happen in financial services, now in decentralized financial services, and in the broader scope of the Internet is that we're entering a realm of interconnected companies, human behavior, and experiences that sort of moots some of the traditional ways that we're organized. You can be a big player like the US and decide that your money laundering laws simply apply to anyone who does a transaction in or out of the US, and because of the nature of the volume that you do, you're going to have a lot of companies fall in line, right? And so that's a big benefit of these large political entities out there. But having something that speaks to data in general and speaks to the rights of people to their data is a good way, again, to create that layer above any company's domain to say, no matter what you're doing, personal private data is the domain of the people that express it, right? That hold the qualities and possess the life that's generating that data.
And so consumer data privacy, to us, if the consumer data privacy push is not successful, then XR is going to be a pretty interesting place to live. So we wanted to give it its own attention there, and that was the reason we put it in. We called out ESG in financial services as an example where a lot of consumer consciousness has been pressuring companies to act on this behavior, especially because right now, if you're in financial services, you know, we're early into one of the greatest wealth transfers in history. A lot of people that are getting very old, very sick, are going to be leaving a lot of money to young people, right? It's a major generational problem. And, you know, anyone who's a financial advisor can tell you that most of the time when you inherit money, you dump your parents' financial advisor and get your own, right? For all sorts of reasons. So there's a lot of pressure on companies now to appeal to people that are more conscious of ESG problems. And of course, the thing is that there is a rising level of real consciousness about environmental issues, and real consciousness about data, that may make the era of issuing a blanket statement, one you can see through and say, OK, that's just greenwashing, a thing of the past, and may make it tough to compete in that environment. And so we wanted to call this out as an example where fundamental pressure from users, in this case investors, has actually caused years and years of industry contemplation on what kind of changes they have to make, can make, what kind of changes are beneficial for them to make. Some environmental and governance standards, you can say, actually turn out to be to your advantage. An example I've always used: during the 2007 and 2008 financial crisis, there was a fund family that adhered to Islamic investing principles. They couldn't invest in companies that charge interest. They did great,
you know, because they had an advantage in terms of the expression of their ethics in their investments. And so there's a great study being done right now, and we want people who are in the industry to think in terms of: what benefits can we get from working with consumers to find positives from data privacy, positives from having ethical standards? And so we really wanted to reinforce that, to tie into that sort of internal introspection of companies right now about their behavior, to sort of sneak this in the door and make it part of your governance strategy.
[00:49:38.731] Kent Bye: Okay. And so maybe you could go into recommendation number 11. Yeah.
[00:49:42.573] Michael Middleton: So number 11 was, and again, this is what I highlighted earlier, that we want a major XR technology or experience provider to publicly take a position, because if people imagine a certain standard before they adopt, they're much more likely to demand it later. If you are getting on a boat for a cruise these days, you are going to hold that boat to certain levels of standards. You're going to say, are there lifeboats? Is anybody trained? Is there a large hole in the side of the boat, right? Yet people will go on vacation and they'll strap one of these things that allows you to basically jetpack above the water to themselves, and it's being operated by someone who doesn't even have a license, because it's a new experience. And so our hope is that if one of these major companies with the voice and the power to drive change is able to figure out a way, and it's got to be beneficial to them, let's be honest, a way to drive a certain level of standards, it's going to put a lot of industry pressure on the other companies to match that while also meeting them from a competitive point of view. And it may be a pipe dream, but we wanted to specifically recommend it and focus on that point. You know, so many of these recommendations are about letting people know areas where you might start focusing your conversations with these companies, with stakeholders, et cetera, pressure points that might be available. And I would certainly think that would be one of them.
[00:50:59.135] Kent Bye: Yeah, the thing that comes to mind here is this battle between Meta's Facebook and TikTok. So the social media platforms of both Instagram and Facebook that are now part of Meta, against something like TikTok, which is a Chinese company, and the implications of the data, of where the data are going, especially because ByteDance, the parent company of TikTok, purchased Pico VR, which is a VR company. So they have their own potential aspirations in VR. A lot of it's been focused within China, but you can also get Pico VR headsets all over the world. So what happens if Pico VR comes into the consumer XR space and it's a Chinese-owned company, and you have the same type of issues of the data, where the data are going? And if Meta's approach to this is to just smear TikTok as a company and to create a lot of fearmongering, to create specific legislation that is only addressed to that one company, that doesn't address the larger issues of what happens with this biometric and physiological data that's being collected by these XR technologies. That it's going to Facebook and Meta may be comfortable enough for a lot of Americans, but for people overseas, there's also this uncertainty of, like, where is that data going? Which is what led to Schrems II and other dimensions of data sharing getting disrupted, because of the amount of information that could get into the national security apparatus of the intelligence organizations. That lack of transparency of data going from a consumer context into, like, a national security context within the United States, because of the third-party doctrine, is what led to Schrems II and other issues. So to me, as I look at this as an issue, there could be approaches that really holistically address a lot of these issues, especially when it comes to potential concerns about biometric and physiological data, whether it's going to Meta or even to the Chinese government, and how that data could be used.
I think this is a good example of how we actually need something a lot more than the smear campaign against TikTok that was reported by Taylor Lorenz and another Washington Post journalist, something that's actually getting to the root of the problem, which is a fundamental privacy law that addresses the core issue rather than the superficial one.
[00:53:10.712] Michael Middleton: Yeah, no. And, you know, you've really put your finger on an important point, which is that some of our recommendations have a very dangerous inverse capability, which is that if a major XR technology or experience provider puts out, with its first-mover advantage, a very self-serving and very powerless set of what looks like quality XR standards, they can do a lot of very quick damage, especially in our current echo chamber. And to your point, a lot of people do say to me, oh, you know, what do I care? I'm just using it, I'm sitting in my house. And I think that sometimes you can't imagine the implications of things until years later. You can't imagine the impact that having 20 years of emails available in your inbox with Google may have, if someone looks at it from an outside perspective and says, oh, I can tell from this that you stopped being friends with people in this area, this area, this area, because you stopped emailing them, and that must mean you have certain political leanings. We're not used to this kind of data being available, and so someone now doesn't understand the scrutiny that might be available later. And a very good example I use, again, I always try to find examples that have nothing to do with XR to get people to understand what it could be like, is if you go on YouTube, you can barely throw a rock without hitting some video that's like, 10 things you never noticed about some childhood movie you loved, where someone has picked it apart frame by frame and they notice something appalling in one frame, or they notice that someone's hair changed twice, or whatever. Things that we didn't catch for 30 or 40 years because no one had time, no one cared, and no one was sitting there watching it frame by frame.
And what if you play a game for an hour now and you're using eye tracking in the game, and 15 years from now they say, oh, based on the data, we noticed that your eyes narrow slightly when you see someone of another ethnicity? Well, how's that going to feel then, right? I mean, what does that data really mean? So I think that it's an interesting position, and it's why I keep reinforcing the user, the user, the user. People need to very quickly learn from what we've already seen happen in the last 15, 20 years and really broadcast out the reaches of their imagination: where could this go? And this is why we're trying to get them to think in terms of, can we get a technology provider to put some sort of stake in the ground? Because for better or worse, that might be one of the best, God help us, opportunities we have for a structural series of boundaries around this.
[00:55:41.728] Kent Bye: Yeah. And as we talk about these different consumer groups, you have a whole section on consumer advocacy groups. Maybe you can get into recommendations 12 and 13.
[00:55:51.427] Michael Middleton: Yeah, so consumer advocacy groups can be far more powerful than people sometimes realize. You know, advocacy groups in general wield enormous swords. AARP in the United States of America is powerful, and as people age it's only going to get more powerful, as an organization that taps you into, let's be honest, a bunch of people with large amounts of capital and influence, right? And so if consumer groups are saying to their members, hey, we've got your back and we're watching this... In recommendation 12, we actually said consumer, international, and other advocacy groups should demonstrate their commitment to the space by championing high privacy standards while declaring XR ethics teams and so on. If AARP just said, once a year, this is XR Privacy Month, and you should be thinking about this, it becomes one of those ideas, like financial wellness, that has been slowly making its way into the populace. People realize, oh, there are levels of financial knowledge I really need to have for myself, maybe I wasn't taught them. This kind of slow-drip educational campaign can be so important because it's coming from a group that actually is on your side. And I'm not saying that there are companies out there that don't care about their consumers. I'm saying that the companies out there have a very specific set of things they're supposed to do. They have business models and they have corporate standards: here's what we do, here's why we exist. But consumer groups are advocacy groups. And so if advocacy groups take a stand and they say, if we feel that you are going to be acting in the worst interest of our group, our members, we're going to tell them, you know, and it's going to be a thing, right? Then that is a powerful moment. And that's why also in our recommendation 13, we said that individual stakeholders in the XR ethics space should target the relevant advocacy groups, submit articles, participate in boards.
If you're a member of an advocacy group and you notice there is not an XR board for it, maybe it's worth starting one. Say, are you intending to address this situation? Because that kind of groundswell, that kind of grassroots pressure, I think, is one of the quickest pathways you have, as an everyday user of XR, to the potential for change, coming from a group that's actually acting in your best interests.
[00:58:01.870] Kent Bye: Yeah, I think that's a really interesting point. I hadn't really thought about the broader ecosystem of these different groups, of building coalitions and awareness around issues that are still, frankly, really technical, wonky, and confusing. Even for someone like myself, who's dove in pretty deep over a number of years, there are still so many aspects of this as a legal issue; it's so big and confusing. Timothy Morton has this concept of a hyperobject, something so sprawling across time and philosophical implication that it's hard to grasp, and the philosophy of privacy is like that. At the end of the day, I think it's contextual integrity, around information being used appropriately, but also, to what degree do we own our data and have the control and agency to make those decisions? Or Dr. Anita Allen, who talks about how aspects of our privacy maybe should be treated like our organs: you don't sell your organs, so just the same, you shouldn't be able to sell different aspects of your privacy away. So, having a constellation of all these philosophical questions, I feel like, again, it's a really complicated issue where the harms are often invisible, and it's not really clear how to draw those lines. But we kind of know the far reaches of something like neurorights: our right to mental privacy, our right to identity, and our right to agency. That, to me, starts to spell out where the threshold gets crossed, when you start to encroach on our thoughts, actions, behaviors, emotional reactions, or physiological reactions, building up a very complicated digital twin modeling our identity that can then target us with psychographic profiles. You mentioned that, as of 2018, Facebook was tracking something like 52,000 of these traits, tens of thousands of data points about what we're doing on their platform.
And then on top of that, there's the additional psychographic and biometric information that's contextually dependent. And then on top of that, can you create a model of someone that starts to nudge their behaviors in ways that undermine their fundamental rights to autonomous action, intentional action, free will, and agency? These are all deeply philosophical concepts that need to be defined and put into law. So anyway, it's a big, challenging issue. I appreciate the awareness, but even as I've gone down the rabbit hole, I don't have a clear idea for how to specify those things and draw those lines.
[01:00:19.402] Michael Middleton: Well, it's funny, you touched on a topic where, as you can see, I make a lot of broad, imagine-this kinds of arguments. I try to take some of these conversations out of the XR-specific space so I can talk to everyday people. But you nailed one, in terms of those moments of digital twins and having all this data. I say to someone, imagine the person in your life you trust most to talk about finances. Let's say it's a parent of yours. Now imagine that that parent has sold away the rights to their digital image as part of something they did without realizing they were doing it. So now imagine, you know, I lost my father in January, as an example. If I were in a very immersive metaverse, there could be a time when I'm walking down the street and my father comes up to me and offers me financial advice that's packaged by Wells Fargo, and it doesn't even disclose it. It may sound silly to say, but there's no ocean left between us and that world, except for the will to do it, the data to do it, and the uncanny valley. And so really thinking about this as a consumer advocacy problem is so important to me, to sum up the section, because affinity groups do work. There's a reason you can't just sell financial products on military bases: if people think it's part of the program, they assume it's trusted, right? If affinity groups say, as of two years from now, we're not going to offer our members insurance from any company that hasn't convinced us they have our members' best interests in the XR space in mind, you're going to see insurance companies begin to catch up on technology, which they're historically laggards on, pretty fast. So.
[01:02:04.798] Kent Bye: Yeah, I think that's a good segue into this topic of data issues, which we've been kind of dancing around the entire discussion. Maybe we could dive into some of the specific types of data and the recommendations you have around that data.
[01:02:17.671] Michael Middleton: Oh yeah. So data, obviously, is the real key to this whole conversation. And location data we wanted to call out specifically, because it's an area of information that is now inherent in using this technology. Right. And our 14th recommendation was: a specific focus must be made on the ethics of XR location data collection, evaluation, and sharing, specifically in cases where XR is experienced through core technology such as a smartphone. So whenever I take out my phone and I put some filter on myself to make me look like a dog... Actually, to this day, one of my favorite uses of AR ever was an app where you could just hold up your phone, and in the camera view it would show you in which direction there was a place that sold Stella, the beer, and how far away you were. And I was like, if I had that for Guinness, that'd be my new GPS, right? But I'm also giving tremendous information. I'm giving information about where I am, what time of day it is, who I'm affiliating with, who I'm close to, through the ability to see these other phones. I'm giving away the store in a lot of ways. And this, in the hands of the wrong folks, is bad news. And Lord knows, we're in a position now where the events of the last couple of months in Russia, the questions about sides forming, worldviews, data and access to information, the possibility of the splinternet, all of that really makes clear just how important access, connectivity, and data control are, in a way that is now visible, fundamentally, in an old trope, which is a physical war. Right. So imagine I'm handing over this information to someone who could say, hey, you know, two years ago, you played a game on a street corner and you were with 15 other people. They've all been arrested for treason. We'd like to interview you.
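The bundle of signals Middleton lists, position, time of day, and nearby devices, can be made concrete with a short sketch. Everything here is hypothetical (the field names, the idea of logging nearby Bluetooth identifiers, the sample values); the point is only how much a single AR session event can carry.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ARLocationEvent:
    """Hypothetical record a smartphone AR app could emit per session event."""
    lat: float                          # where I am
    lng: float
    timestamp: datetime                 # what time of day it is
    nearby_device_ids: list = field(default_factory=list)  # who I'm close to
    filter_name: str = ""               # what I was doing in-app

# One invented event: a face filter used at a known place and time,
# with two other phones detected nearby.
event = ARLocationEvent(
    lat=40.7580, lng=-73.9855,
    timestamp=datetime(2022, 4, 7, 18, 30, tzinfo=timezone.utc),
    nearby_device_ids=["bt-3f9a", "bt-77c1"],
    filter_name="dog_ears",
)
```

Each such record, harmless on its own, is exactly the co-presence data the arrest hypothetical above depends on.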
Well, I shouldn't have played the game, first of all, but also, I may not have even realized I was handing over that information. Now, that's a pretty extreme example, although it's definitely a real-life kind of example. But what if it's: you are hanging out with 15 other people, and they've all got really low credit scores and bad histories? You're from the so-called wrong neighborhood, as finance sees it, and Lord knows financial services and the mortgage industry have terrible track records on all sorts of things that have to do with racism and inequality. And if we notice that you hang out with people with bad credit scores, that data could follow you the rest of your life. And you never even knew it was there, because it was collected by someone else who was in the same place you were. Right. And so this idea of location became an issue for me. I don't know if I would have thought about it this much without Pokemon Go, when it first came out, when I was hearing that people would get hurt, or they'd go on private property playing the game. They thought they were allowed to do it because the app told them they could. I had people walking basically through the front area of my parents' house, because, well, it wasn't their fault; the app said there was something there. And so then there was this example we used, OVR, which I don't know if everyone's familiar with, so I'll quickly define it. They divided the world into 1.6 trillion hexagons that you can buy and rent and trade and so on. But they're not the only company that's done this, right? There are lots of companies that have taken the physical earth and said, you can buy a piece and you can own it in the metaverse, right? And there's all sorts of elements to it.
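The arithmetic behind "1.6 trillion parcels" is easy to sketch. This is a hypothetical square degree grid, not OVR's actual scheme (OVR uses hexagonal cells); the 0.0002-degree cell size is an assumption chosen only because it yields roughly 1.6 trillion cells.

```python
import math

# Hypothetical cell size: 0.0002 degrees per side (~22 m at the equator),
# chosen only because it yields on the order of 1.6 trillion cells.
CELL_DEG = 0.0002

def latlng_to_parcel(lat: float, lng: float) -> str:
    """Map a coordinate to a discrete, tradable parcel ID on a square grid."""
    row = math.floor((lat + 90.0) / CELL_DEG)
    col = math.floor((lng + 180.0) / CELL_DEG)
    return f"{row}:{col}"

# Total parcels: (180 / 0.0002) rows x (360 / 0.0002) columns = 1.62 trillion
total = round(180 / CELL_DEG) * round(360 / CELL_DEG)

# Two points less than a metre apart fall into the same parcel,
# so owning one cell means owning a specific street corner.
a = latlng_to_parcel(40.758010, -73.985510)
b = latlng_to_parcel(40.758015, -73.985505)
```

The key property, shared by any such scheme, is that every physical location resolves deterministically to one ownable identifier.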
We need to understand, and this was our 15th recommendation: we need to interrogate our technologies and our experiences to understand the real-world effects of ownership, investment, personal agency, and the law in terms of the XR equivalency of real-world private assets, unique locations, et cetera. If there is a part of the world that just doesn't have access to the kind of connectivity I have, maybe they're also underbanked. If I can just go in and buy entire swaths of that land, what does that mean? What does that mean for those people in the metaverse, for their ability to express themselves? What does it mean for their sovereignty? So we really wanted to bring this up to give people a sense that the data question matters, because it's going to move quickly, and there are different protocols and paradigms pushing against each other right now. And by the time some people catch up, it may be too late. It may be too late for someone who's underbanked: by the time they have access to a bank, people may know all about them because they played games on these systems, and they may not have access to capital, to small business opportunities and development, because they gave that data away before the institutions that offer those things ever even existed in their space.
[01:06:35.950] Kent Bye: Yeah, a couple of quick points on that. Number one: even within the adhesion contract that you sign for using any technology from Meta, it's standard. It's even in their high-level Facebook terms of service, which is probably now the Meta terms of service, that you have to identify your location whenever you use an XR device, whether that's VR or, potentially moving forward, AR. It's kind of already there in the legal framework: whenever you use these devices, these companies are putting it in their contracts that they automatically will know where you are, which, in terms of the third-party doctrine, is basically going to be made available to governments. But also, to what degree do they really need that data? Well, again, we're getting back into how that data is being fed into this larger ecosystem of surveillance capitalism, so it's very valuable in that sense. But the other point I wanted to make is about the OVR lands: this aspect of micro-targeting, even finer than GPS, creating unique identifiers for these hexagons of land that you can buy, sell, and trade. It reminds me of what was happening with Pokemon Go and an ethical issue where Pokemon Go was being played at the Holocaust Museum, as an example. So, to what degree are you on private property, and yet people are playing this game? As the property owner, do you have the right to say, no, I don't want this virtual context that's disrupting what's happening in the physical context, where they're trying to honor this horrific event and bear witness to the pain and trauma, while people are playing video games, distracted in a way that disrupts that? And so that's another aspect: to what degree do property owners have the right to declare what can and cannot be in these realms of virtual spaces?
But if the realms of virtual spaces are driving social dynamics that are beyond the control of that business owner, then you start to get into this issue of property rights: who has the ability to say, no, I don't want this level of augmentation happening on my property? Or is it more of a First Amendment issue, where it's a virtual space and anybody has the right to do whatever they want on any space, anywhere? These are the dialectics being brought up by some of the points you're raising here.
[01:08:49.070] Michael Middleton: No, and you know, you raise an interesting point there, which is: what if I am a platform provider for XR technology? Let's say it's smartphone-driven, for the purposes of our conversation. And suppose that there is a town I was trying to make some sort of deal with, and they decide not to work with me. I have the platform to redirect traffic through or around their town, to incentivize or de-incentivize user visits, to reduce the number of Pokemon in the town, to reward other towns. I mean, you can sit down and scope out so many areas where location data, in the hands of a company that has a vested interest in its own profit, its own technology, and its wide adoption, can really work to the great disservice of others. And it doesn't have to be because someone thinks they're a traitor or a protester. It could be as simple as: you didn't pay a fee to be listed, so games aimed at demographics with location-based preferences are going to de-emphasize your area. And you didn't even know it was happening. Right. It's really a wide, strange world we're walking into.
[01:09:59.000] Kent Bye: Yeah. And as we start to move into the next section here, you start to lay out the core of what we've been talking about: the data, but also the business models around that data. We've been talking a lot about the surveillance capitalism business model, where you're seizing and capturing this data and then monetizing it in some fashion through ads and psychographic profiling. That's the baseline. There are also other models that you mention here, like the subscription-based model, whether it's a Netflix or a PlayStation, or premium models. I'd say the donation model shows up less in the context of businesses and more with creators, where people are donating in more of a Patreon type of model. In terms of businesses that are having value exchange, you don't tend to see many donation-based businesses that are publicly traded. They all have some method of more explicit value exchange through economic means. Maybe something like Twitch is the closest, because they have a creator economy based on taking certain percentages of the support creators receive. So there are exceptions to that, and even Twitch has ad-based business models for people who aren't subscribing. So there's a complex of all these different models and approaches, and you have recommendation number 16. Maybe you could elaborate on what you were trying to point out in this section.
[01:11:10.223] Michael Middleton: Sure. So recommendation 16 was that the public should demand and reward business models that provide a clear understanding of rights, price, value, and ownership of XR data and experiences, in line with the contributions made by those funding them. You know, so much of XR is going to be driven by user data, user experiences, user input. And so if I've been a Twitch streamer for a long time, Twitch has grown in part because I was able to make a new platform interesting and entertaining. I was able to make the platform work, right? Now, let's say that Twitch decides to ban me, and there have been all sorts of issues with this, right? Or to de-emphasize my channel. I've helped make this business model work, the same way you might say that Netflix's early adopters helped put them in a position to finance the wide number of shows they're putting in place, right? So if my inputs, my experiences, are building XR experiences out there that a company is going to use, we have a natural interest in asking: what is the right of the creator here? With Netflix, at least, when I sit down, I am a passive participant in most of it. There are some interactive things there; Netflix has a trivia game that came out recently. But in general, I'm a passive person who is funding it through my interest in the content. With XR, we're all creators. Where we go and what we do is driving the intelligence and the behavior of the ecosystem. What are the rights of those creators to have access to the worlds that they built? That's really where we were trying to get. This could be its own separate paper altogether, I think.
[01:12:40.544] Kent Bye: Yeah, for sure. So in the next section, on social media, I had alluded to this citation from a 2018 statistic that Facebook had 52,000 analytics on each user, which looked like it was from some Fox News report or something. I don't know where that original number is coming from; I saw the article, but I didn't see exactly where they were getting it from. But anyway, there are tens of thousands of data points potentially already there within the context of social media. And so, yeah, maybe you could set the context for the social media section and the recommendation that you have.
[01:13:12.973] Michael Middleton: Sure. And so with social media, you know, this is an experience that, I will say, the vast majority of people who are likely to really be metaverse participants understand. Right. I mean, I do have friends, even young friends, who don't use social media at all, but in general, in one form or another, you've seen the effect, and you may even have gotten to the point where you understand, as it was famously said, you're not the client, you're the product, right? So this data awareness has been growing. And we should really learn from this, because the amount of data already out there about ourselves, and the fact that, for better or worse, the major social media companies are going to be very, very integrated into this push into the XR space, really needs to give us a moment to step back and think through how we feel about this. So our 17th recommendation was that XR ethics advocates should align very closely with privacy and ethics advocates in other spheres, for example social media, to improve the chances of a comprehensive and efficient movement toward real outcomes. Included in this is supporting efforts within those spheres that promote general user independence and control. Those of us who care about XR ethics and XR data privacy should have the backs of people who have been battling for social media privacy and for the broader claim: I am the person creating this data, and I should have control and a say over it.
[01:14:40.402] Kent Bye: Yeah, I feel like a lot of things happening in the realm of social media are going to be similar, but also things are going to be way different, in terms of the virality and network effects of how fast things can spread within text-based or video-based mediums versus experiential mediums. I feel like there's something different there that maybe changes how quickly information can spread; the dynamics of immersive virtual worlds may be different from social media. But I do think that, certainly in terms of harassment and trolling, it's going to be not only similar but maybe even more difficult, because it's real-time and action-based, and there's less of a clear cultural artifact you can point to, because it's happening in a moment and it's very contextual. Natural language processing, AI moderation, and whole other sets of technologies will have to be put into place, maybe going above and beyond the existing content ecosystems' moderation, because we're talking about real-time environments rather than things that have been captured in some sort of media artifact. So although I do think there are a lot of similarities, I also think there are a lot of differences. I agree with the point that there are enough similarities to have solidarity, but I also think there's going to be stuff that the existing social media realms have never even thought of, in terms of the challenges that will be faced within these immersive virtual worlds.
[01:16:02.088] Michael Middleton: It's crazy. And one of the things that, again, I think helps is to try to contextualize for people just how much the structure of social media, and the way it acts, influences the nature of the content. This was always somewhat evident. Obviously, certain types of information do better on Instagram: a quick image, you know, you're not going to go to Instagram for a long read, right? Twitter: a quick point or a back-and-forth argument. There are certain levels of control that the medium being used will impose. This was reinforced for me years ago, when TikTok was fairly new. A friend of mine who is an Irish dancer said TikTok was the best thing that ever happened to him, because in Irish dancing you basically stay where you are. You barely move left to right. In other words, you can stay in the frame the whole time. Whereas if you are someone who dances around the room, how do you do that on TikTok? You'll be out of the frame most of the time, right? And I use that as an example of how the platform can really affect what kind of content we post. We post what's going to be useful to the consumptive mechanism of the device. And so you can look very quickly and tell someone who really knows how to get the most out of Twitch. You can tell YouTube content; you can recognize it walking down the street, and mind you, some of it's gotten so old you get tired of seeing it. People develop ways to communicate that maximize the benefit of the algorithms in place. And as we're moving over into XR, those social media algorithms, which we know contain all kinds of bad behavior and some of the worst biases of people, expressed either consciously or unconsciously, all of that's going to be there waiting for us. And so I think that for XR ethics advocates not to take a vested, deep interest in where social media, with its promise and its faults, has put us is to walk blind into the metaverse.
[01:17:57.214] Kent Bye: Yeah, it's a really good point. Especially a lot of the algorithmic direction of attention that is still going to be recommending people or recommending events and places. So yeah, a lot of parallels there. Well, that was the first section. Like you said, that could have very well been an entire paper on its own, but maybe we could transition into the second part, which is more about the economy, financial services, banking, and cryptocurrency dimensions, which I think are each unique aspects that are part of the global picture that are worth unpacking a little bit here as well.
[01:18:27.401] Michael Middleton: Yeah, and so this was the area that originally drew me into the involvement. It's an area I'm very passionate about. I would say I found some of the passion for the former section through my conversations with Samira, and also through having to think about it since it was included. But what XR means for financial services, what it means for the economy, for the prospect of cryptocurrency and other financial applications of blockchain, or the distributed ledger, all that kind of stuff is really one of the areas I see myself spending a lot of time on in the years to come. In fact, my company Seabolt is already starting to move into that space. And so we started with the economy. As you can see with our paper, we started with very broad elements and tried to narrow down, to see how fundamental we could make our recommendations. And so we started right off with government and international organizations, and basically just said, in recommendation 18: working groups within these major global and regional economic organizations should be targeted for awareness about, and for taking positions on, XR ethics as it applies to their members and domains. There are these broad bodies out there, like the G20, as an example, where you do have groups trying to drive broad direction. And large multi-government organizations may say, we're going to band together and look out for the protection of our members. It could be coming from a good place, where they genuinely care. It could be coming from a political place, as we've seen many times now, where if you're a Chinese-based or Russian-based company, you may not have access to certain opportunities with the U.S. government to buy or sell technology, right? It could come from both of those. But it is a major area of focus, because that really is where a lot of legislators' attention goes.
[01:20:11.435] Kent Bye: Yeah, it's interesting that you're citing a 2019 report, Waking Up to a New Reality, a collaboration between Accenture and the G20 Young Entrepreneurs' Alliance, that puts out six specific risks. Number one, which we've been talking a lot about, the misuse of personal data, but also fake experiences and false information, cybersecurity and identity theft, technology addiction, antisocial behavior, and digitally divided worlds of the haves and have-nots. And when I have just general broad discussions about XR with the general public, a lot of these issues will inherently come up: privacy and data, but also fake news, false information, misinformation. So it seems like these economic groups, the G20 and Accenture, are identifying these, but then where does that go? Does that go into government regulation, or what kind of leverage do groups like the G20 have beyond just trying to push for legislation?
[01:21:05.587] Michael Middleton: It really depends on the group. And in many cases, the answer is: not much. Unfortunately, when it comes to a lot of broad groups, you know, 180 nations can get together and make some resolution, and 10 years later, what has that resolution even meant? Right. Climate change is a very easy place to look for some of those broad resolutions that haven't translated. That's one of the reasons why we tried to change scope constantly, to get all levels of zoom into our recommendations, because you don't want to leave anyone out. You want to say, let's go after every angle we can think of. Let's pressure from the ground level, and let's pressure these huge multinational organizations that don't always move quickly. The hope being that somewhere, water gets through the rock, you know, and really leads to meaningful public attention and either change or foundation-setting that's in the public interest.
[01:22:00.167] Kent Bye: Okay. Well, there's a couple of sections here about currencies. You talk about it here in section 3.1.2, and you come back to it later, diving into some of the virtual currencies. But what's the main point of the currencies in this context?
[01:22:12.057] Michael Middleton: Sure. So we listed currencies because we knew that right now we are at the dawn of possible fundamental changes in what currency means, specifically as a government-issued store of value. Right. And so we wanted to address it. Now, I think we actually stopped short of making recommendations in the currency section itself, but we wanted to highlight it and say, we've got to keep our eye on this, because currency, in so many countries, is access. It is participation. It is the ability to interact. I mean, you can't pay taxes online in many areas of the U.S. without paying a fee, right? And there should never be a fee, in my opinion, to file a government form. That's just a personal opinion. But when we get to the point where you might have digital currencies, where you might have a government-issued, government-backed fiat currency that is digital, we need to think about what that will mean in an XR space. How will those things interact? So we didn't issue specific guidance there; later on, when we got to cryptocurrencies, we actually did. We can jump to that section quickly if you want, or we can work up to it. It's up to you.
[01:23:17.649] Kent Bye: Well, maybe let's come back to it and wrap up this economic section first, because it is a much broader issue. But I guess here you're just flagging that there could be distributed currencies, and asking how you integrate that?
[01:23:29.334] Michael Middleton: Yeah, let's call attention to it, and let's also call attention to what happens if you have non-governmental entities start to present their own currencies that they have control over, whether it's through tokens or the many other ways you can go with that. We did want to acknowledge it. We also did a small section on raw materials, manufacturing, and trade. That's obviously a very broad area of economics. And we made a recommendation specifically that XR integration into those areas should be designed to ensure that the respective nations and their people have access to, and agency with regard to, the benefits that can be realized in their industries. So much of the labor involved in raw material gathering and extraction, manufacturing, and trade is labor that has not been given the advantage of fair trade incomes, or access to banking products, savings, loans, interest. And so if you are someone who, to feed your family, is using XR technologies as part of your job, to mine gold, let's say, you are selling, for lack of a better word, you're giving away tremendous amounts of data about yourself as a person in exchange for a low rate of return. But you have to do it, because you don't have any kind of agency to say, I'm not going to use this XR-guided experience to do my job, because then you won't have access to the job either. So we wanted to call that out specifically, because it's an area that can easily get overlooked, and it really plays into broad conversations about what it means to have access, agency, et cetera.
[01:25:12.014] Kent Bye: Yeah, and just making sure we're in right relationship with all the labor and the people involved: not having child labor, not exploiting people by denying them fair trade wages, even in building these devices. So.
[01:25:24.968] Michael Middleton: Absolutely. That's an excellent point I hadn't even thought about: the labor to build XR devices, and people doing what they haven't historically done, which is caring about how those people are being treated. Where is it coming from? Working conditions, child labor, et cetera.
[01:25:39.960] Kent Bye: Yeah, which is an ongoing issue that is hard to completely validate. There's actually a piece at Sundance this year that dove into that, which I'll be covering in a conversation in more detail. But the big section here is financial services and banking, which covers a big aspect of the rest of the conversation. So maybe you could set the context here.
[01:26:00.950] Michael Middleton: Yeah, so banking is one of those things that, if you have it, you take for granted. And it's one of the most fundamental ways there is of building wealth, stability, a sense of certainty, community pride. If you have access to the ability to keep your income someplace secure, to the idea that it may even be earning interest of some kind, that it has a value because it's being used by the bank to provide services, if you have access to even micro-loans, these help grow things. There's this area of altruism focused on direct giving, which says that someone in poor circumstances is in a better position to decide how to use five hundred dollars than perhaps a charity is, and there's a lot of evidence to support some of that. It's really a fascinating area that we don't have to dive into here. But having access to a bank means you have access to stability. Lord knows banks are conservative; they tend to go into areas where they can operate at a profit. And so we really wanted to focus on this. One of the pieces of data we call out is that by 2017, the number of unbanked individuals on earth had fallen to 1.7 billion, from 2.5 billion a few years earlier. Now, that's interesting from all kinds of points of view, because I don't think anyone sat all those people down and gave them a course in financial wellness, right? This is a fast-moving monster. The ability to have a mobile phone before you have any kind of financial account is taken for granted now. It's just simply out there, right? But having access to these accounts, again, means that you're handing over data, but also that you are in a race: if XR gets to you first and you give away lots of data, then you may not be able to partake in financial services once they're available in your area, for the right reasons or the wrong ones. And it'll all be over but the shouting. Banking was here before XR in the United States.
So I have a 25-year, at least, credit history that I can use, where if I got denied for something because of XR-driven data, I could say, but look at my behavior, right? Someone else is not going to have that data to point to. They're not going to have that advantage. And we wanted to call out as well alternative financial services and money systems. A lot of people in this country may not be familiar, but there are massive systems that do cash-based or even chip-based transactions. And all these things are going to be affected. So we could have people lose access to a system that exists in their area and then get denied by the new system that comes in, right? So we distilled this into Recommendation 20, which was that private firms, government advocacy groups, et cetera, should consider the ethical concerns and dangers of implementing products and services that may undermine those critical alternative financial systems, especially if there is a possibility that a new system will lack important features of the old one. So as people transition to having access to more products, and as the systems they relied on are maybe affected by this, if they suddenly find that they can't use those products because of things that resulted from poor XR data management or privacy, then they're really out of luck.
[01:29:16.723] Kent Bye: So I guess to kind of bring this point home: in order to have access to some of these XR technologies, you have to have a credit card account on file, and that implies that you have a bank account. And what you're saying is that these technologies, because of that, are inherently excluding lots of people who, if they are only working with cash and cash interactions, may not have access to these technologies, because there's no actual integration with these alternative financial services.
[01:29:44.183] Michael Middleton: Well, there is that, but there's also, you know, some of these alternative financial services are ones where actually no money changes hands. It could be that I pay someone here and that person knows someone over there who gives someone else money, right? The money never actually moves, per se. So I could be taking part in a very simple paper-based, or even some other form of tokenization-based, financial services system that could vanish as other people get up to speed and have access to more traditional banking. But that service was based on trust. It was based on knowing people locally, knowing you're down the street. I know where you are. If you don't pay, I know that you'll find a way to pay. That could all vanish. But you're right. In terms of the access element, the financial access to XR technologies is its own separate thing, because it basically requires either credit or a banking relationship. And as someone who once didn't have a credit card or a debit card, I learned firsthand how many worlds you don't have access to. For example, having an E-ZPass: you know, in the US you have to have one of those things, and if you don't, you're going to be sitting in the cash lane. And of course, now they're getting rid of those manual toll lanes and you have to have it, or they just, you know, track you on your license plate. But it really does become an access point. And so some of the access points to XR are actually outside of the XR ecosystem. And in many cases, that's in the financial and banking sector.
[01:31:09.549] Kent Bye: Yeah. And in terms of specific companies, it seems like the insurance companies we've mentioned at the top where, you know, there's an incentive to gather as much information as you possibly can. And because of that, there could be access to XR data that is driving decisions by insurance companies. And so maybe you could talk about recommendation 21.
[01:31:30.575] Michael Middleton: Yeah. So we said insurance providers and financial services companies should have clear limits on their ability to use data not collected by them for insurance underwriting and claims payment services, and transparent, open processes should exist for individuals to determine what data, and from what providers, was used to determine their viability. So let's just say I go to sign up for life insurance. I have to hand over certain data. Maybe I have to go through a medical examination. But if they then go in and look at my 500 hours of Mario Kart and how I was holding the controller, and they say, wow, this is indicative of you having a chronic injury and therefore you're not eligible for this, that's a whole other ball of wax, right? And so really making it clear that just because the data is out there doesn't mean you can use it for these issues is a massive one. And I'll tell you, a simple one would be car insurance. Imagine if car insurance providers really were able to tell who was texting while driving or who was in an experience. A lot of people's insurance is going to go up real quick. And from a safety point of view, since I don't text while I drive, I'd advocate for that. But let's be honest, right? Insurance governs so much of your life. It governs your peace of mind. That's what the insurance industry says when they sell life insurance. They say this is about peace of mind, knowing people will be protected when you pass away. And it's the same thing about having auto insurance, having the peace of mind that if you get in an accident, it's not going to be a $50,000 hit on your finances, outside of any legal issues. So playing games, and let's be honest, video gaming is such a massive industry, it's going to be a broad area of data production for AR and entertainment stuff in general. Having that affect those areas of your life? It hasn't counted historically.
Now, if I sign up for life insurance, they may ask, do you go hang gliding on a regular basis? Right. But they don't have the ability to go watch what I do on a mountain. And the other two recommendations sort of reiterate that. In Recommendation 22, we wanted to specifically call out workers: workers required to use and adapt to new devices need to have ergonomic considerations, and they should have coverage for XR-related injuries. You know, if I'm being required to use an XR technology at work, and then I get injured and I lose access to my paycheck because of something that arose from the use of it, I should have coverage for that, right? That needs to be accounted for, and it often isn't accounted for now. And then Recommendation 23: as human work and play and identity enter the metaverse, insurance coverage should protect for risk, damage, and loss of assets and elements in that metaverse. And we put that in as well, again, from the question of agency, which is that if you lose your avatar representation, or if you lose access to an environment that has a therapeutic nature to you, you should have some level of agency about that. You should have some ability to protect yourself from that. In this case, maybe insurance is the best option. So we were trying to do a wide reach here. It's a pretty broad scope to start conversations in the area.
[01:34:36.675] Kent Bye: In terms of personal investing, I know there's been different data visualizations, ways of maybe giving additional context to different trends. So maybe you could elaborate on the recommendation number 24 around personal investing.
[01:34:49.447] Michael Middleton: We looked at a lot of angles here because it's one that's near and dear to my heart. And ultimately we decided that the most important thing right now is that people offering financial advice or products need to have clear rules and limitations on what kinds of effects they can produce in XR to influence the decision of a customer. And they also need to understand extra-metaverse factors that may be affecting behavior without being apparent in the metaverse. And this is a tough one. And the example I will give is this. Say I'm a financial advisor and I am going to the home of an 80-year-old widow to talk to her about her finances. Maybe she's a client. When I'm there, if I see that she is clearly not at a certain level of attention, or she can't follow information, you know, I have responsibilities to act in a way that's suitable for her. So if I try then to sell her a bunch of complex products, we have elder protection for this reason, right? But if I'm on a Zoom call with her that, unbeknownst to me, her son set up, and I don't even know her son's in the room because I can only see her. Or maybe, even though I'm in a VR environment, her son is screened out. I don't realize he's there, maybe threatening her. What does that mean in terms of financial services? Right. So there's a responsibility, I think, for people that are interacting with clients through XR to understand how much decisions can be affected, that we're still biological creatures. And so if you say you can pick between five stocks, and when you're describing two of the stocks you're playing very pleasant music at a certain wavelength, and then during other stocks you're playing a very subtle negative noise, that could influence your recommendation, right? All these kinds of things need to be there. And we're trying to promote this understanding of just how truly influential an immersive environment can be.
And what rights people have to not have those things, without their knowledge, driving their decisions and their involvement in this.
[01:36:45.497] Kent Bye: Hmm. Yeah. The kind of environmental, experiential design aspects that start to play into, maybe there's a kickback that the financial advisor is getting or something that's not disclosed, which is a whole other ethical issue. But yeah, I can see how there could be ways that you design an immersive experience that leads clients towards decisions that may benefit the advisor as an individual. Yep. So the big one that I think has come up a lot in the context of virtual currencies is the different know-your-customer, anti-money-laundering, terrorist-financing, and fraud regulations that are trying to prevent the use of these virtual world currencies for money laundering. And I know there's a lot of existing laws on the books that these different virtual currencies have to follow, in terms of, if there's certain amounts and whatnot, they have to do their due diligence to make sure that it's identified back to a specific individual, and maybe dig into where that money came from if it's big cash transactions. But yeah, any other thoughts on the anti-money laundering dimensions of the virtual currency world?
[01:37:45.957] Michael Middleton: No, it's such a broad world. We didn't feel the need to draw too many distinctions. The only one we wanted to focus on is that, you know, know your customer is a very important topic in this area. And who knows what VR is going to do in terms of being able to not really know if you're talking to the customer and their real circumstances, but that's its own separate ball of wax that we didn't think this paper could really tackle.
[01:38:07.810] Kent Bye: Okay. Well, the last section here, at 3.3, is cryptocurrency, virtual assets, NFTs, and blockchain applications. I know broadly within the larger industry there's been a lot of hype around the potential, but also a lot of more skeptical and critical voices and critiques around the different aspects of technical debt and ecological impacts and how decentralized they really are. So, you know, as you start to dive into it, what were some of the big takeaways that you got when looking at the role of cryptocurrencies, NFTs, and the distributed ledger and blockchain technologies as applied to the virtual worlds and the metaverse?
[01:38:43.405] Michael Middleton: This was a rough one, partly because as much as people in DeFi can see how different the world can be, a lot of people still think of currency and of financial systems in very traditional terms. So a lot of our earlier recommendations really focused on this. We wanted to call it out because there will be an ability to participate with value fundamentally differently in the metaverse. And that ability will be something that can be expressed much more closely with our currency, as our currency maybe takes on new forms, as ways of transferring value take on new forms. So we wanted to call it out. And there were questions that we asked ourselves, and I'll call one of them out, which is: what if a bad actor is able to use your data to pinpoint your personal mobile device, even though you're not using it on your phone, and then to offer XR experiences involving you to other people? So if I see someone in the metaverse that I think is interesting, and I somehow have the ability to buy their home address or something like that. These thought experiments we sort of started going through, asking, are there recommendations we can make specifically about XR ethics in this space? And we decided it was too early in its evolution to decide, and the earlier stuff covered it. But we wanted to call it out in the paper as problems for contemplation, because we don't think it's settled into a new form. And we don't know how jagged the frontier is going to be between traditional finance, decentralized finance, and how people experience value in XR. And so we closed the chapter bringing it up, but without making any conclusions on it, as a chastisement to ourselves to issue another version of this in the future.
[01:40:31.362] Kent Bye: Yeah, I know that when I've looked at this issue, I've seen organizations like the Peer to Peer Foundation that draw distinctions between commons-based approaches to decentralized technology versus decentralized tech that's driven by libertarian values, which in a lot of ways is almost by design circumventing government regulations and control. Which means that you're really putting a lot of trust into these different mantras like "code is law," which originated with Lawrence Lessig, but, you know, is kind of being applied here in the sense of immutable transactions that can't be rolled back. But any software code that's being created has flaws and lapses of security, which is creating this environment for fraud and ways in which you could take money without having any way to revert it or claw it back, which I think is a big part of financial services: consumer protections. And by going into these decentralized currencies, it has advantages, but it almost by design is circumventing a lot of those consumer protections.
[01:41:31.141] Michael Middleton: And there's a lot of misinformation. I mean, within the last year alone, a lot of people that thought their transactions were secure and totally anonymous and untraceable found out that that wasn't the case. And you can look at various cases of busts of rings of people purchasing child pornography and other things like that, where people thought that they were secure because they were doing it through crypto. And, well, turns out they weren't. So this is an area where, just as we looked at the general concept of data privacy and social media and said we're sort of graduating past that in the XR question, in some ways this idea of how this is going to commingle with value and your attachment to digital currency is the next frontier again. Because if my money is tied to data, and if I give you a dollar and you now are able to determine that I am allergic to peanut butter based on what I bought, that's a huge ball of wax there, right? And again, it's a very simplified example. But we are trying to think through all those kinds of angles and really start thinking about how this will tie back to economics, to finance, to our relationship to money as a structural entity.
[01:42:46.129] Kent Bye: Yeah, well, I think we went through all the different sections. And to the conclusion, I'm wondering if you have any sort of final thoughts as you were putting all this together, because we covered quite a lot of ground.
[01:42:55.422] Michael Middleton: Yeah, no, thank you. You've been very patient. I appreciate it. No, I mean, I think, for better or worse, when you attack the question of inherent structure, it's impossible in whatever it was, 25 pages or so, whatever our paper was, to even scratch the surface. And I think that my real takeaway here is that as we start to interrogate, as consumers or as members of the various industries, what XR means for our industry, we can sort of glance over how it relates to other industries pretty quickly. And I think it's always worth remembering that these big ecosystems, the ecosystem of finance and banking and loans and credit scores, all these monsters out there, are already interlinked. And we really should be interrogating how this whole monolith moves together. And we should not necessarily assume that by simply adopting a digital currency, for example, we're somehow freeing ourselves of the architecture of this mass, because you can easily embrace the digital world and forget that there's still a physical one you have to move around in as well. So I don't know if I have any broad conclusions beyond that which we offer in the paper. I think that this is really a process that's just getting started, and it's going to be a race against time.
[01:44:06.937] Kent Bye: Yeah, and I know that there's certainly a lot of the broader discussions of the metaverse with the cryptocurrencies that are diluting what that term even means. And even a lot of the big investment banks have just recently released a number of different papers looking at the future investments and the trends in the metaverse. And so there's certainly a lot of the financial services industry that's been looking at the next paradigm shift and what that means, and the shift into more experiential-based economies that Pine and Gilmore started to flesh out a number of years ago in the late nineties. And yeah, I think this paper is at least a good start at chipping away at some of the broader discussions, and it even catalyzed a really in-depth conversation here, going into a lot more detail than I have other times. So from that alone, I think it's worth it to push this broader conversation forward to the industry.
[01:44:52.826] Michael Middleton: There's one more quick thing I'll offer, which is that one of the things you can never tell is what part of this is going to move first. You know, and I'd reference, of all things, your podcast. I think five or six years ago, you interviewed a gentleman named Eric Greenbaum, and he was trying with his partner to dive into the question of using trading data and visualizing it in different ways. And just recently, not him but other companies sort of relaunched this idea; there's a company right now trying to market this. So six years later, it sort of went to sleep and it morphed and changed and so on, and we're sort of coming back to it. So it's amazing how quickly ideas can form, fall, change, and sort of be contemplated again. And I think that as much as we made all these large ecosystem-based recommendations, real life is going to poke through in little weird ways we can't anticipate. And one part of an industry will be way ahead of another, and it's going to be a very jagged experience. So I don't envy this entire group the task of trying to keep up with it and sort of see what the ecosystem looks like in a year. But it could cause us to issue 10 new recommendations we haven't even thought of yet, just based on what tips of the iceberg poke through.
[01:46:06.219] Kent Bye: Yeah. Yeah. It's still very early. I guess I want to ask one last question, which is, what do you think the ultimate potential of virtual reality technologies is in the context of these larger economic dimensions, and what might it be able to enable?
[01:46:21.137] Michael Middleton: I think it's tremendous. I don't have the imagination to scope out the full ultimate potential. And I tend to think in metaphors a lot, as you'll start seeing. So when I think about what computers did. This is World Autism Month, right? Think about what coding jobs have offered to people with special types of skills, opportunities that were once completely absent. Who's to say that someone who is currently paralyzed from the neck down can't be participating in a full labor job in the future through the capabilities of XR, right? I mean that by itself, from a practical point of view. Who's to say that my co-workers in building a house somewhere couldn't all be on different continents? Those are simple ideas. I think, fundamentally, the ability is for us to change our actual relationships with people and how we behave as human beings. And I think it's going to be a train wreck in many ways, because we've seen firsthand that as amazing as the internet's been, it's also been an amazing train wreck and brought out some of the worst elements of us. I've had to cut back on how often I use my smartphone once they started letting me know how often I was using it. So I would say that the potential is that the heights, and probably more often the depths, of human imagination are within reach. So that's why we need ethics, for sure.
[01:47:53.618] Kent Bye: For sure. Well, Mike, is there anything left unsaid that you'd like to say to the broader immersive community?
[01:47:58.040] Michael Middleton: No, I mean, thank you for having me on. You've been incredibly patient. I'm sorry that I wrote such a granular slug of a paper, but it is an important series of topics, and we always look forward to pushing it forward. We're very interested in feedback from people within these communities who might have experiences, especially practical ones, that just slipped our minds. So, you know, we don't speak ex cathedra here. We're not infallible. This is very much a frontier effort, and we'd welcome thoughts.
[01:48:29.765] Kent Bye: Yeah, well, Mike, thanks so much for participating in this IEEE Global Initiative on the Ethics of Extended Reality and to put together this white paper with your other collaborators of Samira Kodayat. How do you say her name?
[01:48:43.937] Michael Middleton: So, Samira Hodai, and I'm doing my best, who, honest to God, gave me structure. She gave me the ability to move forward. And she asked a lot of great questions. And again, the whole business model element, I had not even thought of some of the things that she brought in. So I definitely want to call her out. And then, well into the process, Dr. Mark McGill came in and gave fresh edits. And both of them, you know, no one can be their own editor. So although I tried to do my best at this job, they just, you know, every new person brings insights that you just miss. But again, a big thank you to both of them for waking me out of the echo chamber and really giving me great things. And I'm sure they're going to listen to this. So, Samira and Mark, thanks.
[01:49:29.497] Kent Bye: Yeah. And thanks so much for putting it all together and helping to push forward the conversation. And yeah, hopefully other folks will be able to have a listen to these conversations and get different insights. Because, like I said at the top, the economic issues are all-pervasive and kind of shaping and forming the entire industry. And like you said, knowing what those rules are can help you understand how everything's going to start to play out. And the different soft-law ethical frameworks around that, and all the different organizations and coalitions that are going to be a part of helping to shape that above and beyond the bottom-line profits for each of these companies, I think are going to be key to living into the world that we all want to live into without sleepwalking into dystopia. So thanks for doing your part to help lay it all out.
[01:50:10.376] Michael Middleton: I'm trying. We'll see where it goes, but it's been a fun process.
[01:50:15.257] Kent Bye: So that was Mike Middleton. He's the Managing Director of Seabolt Studios, which was founded in collaboration with Yuri Kavechko, who happened to also be one of my collaborators on the Portland Indie Game Jam back in 2014. So shout out to Yuri, who helped me build one of the very first experiences in VR that I ever created back in 2014, just a few weeks into owning my DK1. Anyway, I have a number of different takeaways about this interview. First of all, I really appreciate Mike's perspective on understanding the rules of the game when it comes to finance and all these different regulations. This is essentially what is driving the entire industry, these underlying ways that we exchange value. In the footnotes, citing Novikova et al.: creating and capturing value is the essence of a business. Specifically, three questions are asked: Who is your customer? What does your customer value? And how do you make money in this business? As for the business model of the metaverse, as we continue to move into this new spatial computing future, a lot of the existing ways in which we've bootstrapped the economics of the open Internet have been around surveillance capitalism and advertising. But as we move forward, is that going to remain a viable solution? Is there something else that's better? How do you actually define who the customers are, what the value is that's being exchanged, and how do these existing huge tech companies operate at the scale they are? There's a lot of ways in which this paper tries to at least map everything out in terms of all the different vectors and spheres of influence, whether it's from business governance and operations, the customers, or consumer advocacy groups, and lots of different ways that pressure could be put onto these companies to shift things in a specific direction that is going to be more in alignment with everybody's ethics and morals.
He cited ESG, which is environmental, social, and governance within financial services, and how that's been a big movement within the financial services industry to shift corporate behaviors and have a real impact in terms of people demanding specific environmentally and ecologically sustainable practices and values embedded into these corporations. Whether or not privacy is going to be one of those things at the forefront is still yet to be seen. But if nothing else, that could be another vector for trying to shift some of these different things in the absence of clear direction coming from a government. Even when governments are passing different legislation, there's always the ability to consent, and there's a bit of that consent loophole, meaning that you can go down these really dark paths. Previously, when I talked to Mark McGill, he phrased it as this consensual erosion of privacy, meaning that we're agreeing to these terms because the exchange of value of the services we're getting is valuable enough, and the information that is being extracted is valuable enough to the customers, which are the advertisers, to be able to sustain all this. I don't have a clear answer in terms of anything that's any different than what's already happening, but I think this is a good survey and discussion around all the different dynamics of this as an issue. Then, moving into the economy, financial services, banking, and cryptocurrency, there's just a very brief part about cryptocurrency. I will say that in the last year, 2022 at least, I've seen a lot more critical discourse coming out about cryptocurrencies and the engineering flaws around them. I featured that a little bit in my discussion with Anne McKinnon, and I have some pending interviews where I'll be able to dive into a little bit more detail from some of those different discussions.
Some of those more skeptical takes are not only from an ecological perspective, but also about Sybil attacks, meaning that there's one entity that is able to control this decentralized network. If you have something that is controlling the decentralized network, then it's no longer decentralized. It's essentially centralized. That's the core dynamic. In order to overcome that, either it takes on different ecological damage, or it's going to have to come from more of a cultural values level rather than something that's engineered at a technological level, because it isn't actually shifting the existing power dynamics that are in our economic infrastructures. So yeah, lots of different recommendations. The one that I'll just highlight here in this takeaway is Recommendation 11, which is that a major XR technology and/or experience provider with first-mover advantage should publicly take a position of transparent and ethical XR standards to create public education and set a high bar for future entrants. There's a conversation that I had with Sally Applin and Catherine Flick back in episode 991 that was really critiquing Facebook's responsible innovation principles as a bit of a form of ethics washing, where it's not really a viable ethical program. It's essentially very simple and self-justifying, but not really pushing back on or delaying innovation. Responsible innovation is supposed to have this big stopgap to prevent things from being shipped if there tend to be too many ethical harms that may be happening there. It seems, even from looking at the early indications of Project Aria, that there were probably enough red flags to flag Project Aria, but their existing system didn't stop a lot of the things that they saw as transgressions of those different ethical boundaries. Their conclusion was that this is a system to justify what they're already going to do, without really considering it as a framework.
The issue here is that when you start to have other companies coming into this space that are based in China, let's say ByteDance, who bought Pico VR, if there ends up being a consumer VR headset that is based in China, then are you going to start to have different sets of rules for Pico VR than for Facebook? Or are you going to try to pass a law that gives protections to everybody in the United States? Right now, the latest draft of the federal privacy law that I saw actually calls out specific countries, saying, if this company is based in this country, then we're going to make sure that they're not going to get our data, but pretty much anyone else can have access to it. Anyway, I feel like it's a bit of a situation where, rather than having good GDPR-style protections within the United States, we're moving down this balkanized approach that is singling out individual countries and creating privacy laws that are specific to China versus what's happening in the United States. I would like to see something more like the GDPR. But I guess there's a diversity in these different approaches, and it's what gave birth to something like Meta in the first place, which may not even exist if it were based in the European Union. Anyway, I think it's a huge open question. Again, like I said, privacy cuts across all these different conversations, and this specific conversation has its own way of addressing it through these different recommendations. The big question is whether or not these companies are going to follow any of these recommendations, or at least maybe this is something to push back with and say, hey, maybe this is an opportunity for you to look at some of these different ways to do this kind of self-regulation. Or, if not, then there may be some of these other vectors for consumers to start to communicate what they want in terms of XR privacy.
So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.