#684: Insight Engineering: Data Portability, Identity, VR, & The IoT Edge

Samantha Matthews Chase calls GDPR the General Data Portability Regulations rather than the General Data Protection Regulations. GDPR has forced many companies to make over a decade’s worth of our data available to us, and Chase sees this as a goldmine that can be mined for insights into ourselves.

She’s been looking at how the combination of VR and edge devices on the Internet of Things is going to catalyze a lot of data collection about ourselves. Who is going to own that data? And what can we do with it?

These questions have driven Chase to investigate the protocols around Self-Sovereign Identity, and to experiment with how all of this data can be remixed into chatbot art projects that archive our identities and digital representations of ourselves. Perhaps all of this data can be used for more than generating psychographic profiles and advertising; maybe it can provide some genuine contemplative self-reflection to help us understand more about ourselves.

I talk with Chase about her belief that there may be many market opportunities for this type of “Insight Engineering,” which could come from the data mining, processing, and virtual spatialization of all of this data.


This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. So with GDPR being passed in the EU, that's the General Data Protection Regulations, there were a lot of things around privacy, but there were also things around being able to actually download your data from a specific website and be able to, you know, have an archive of it. I had a chance to talk to Samantha Matthews-Chase, who refers to GDPR as the general data portability rights, because it's now allowing you to have access to all this data, like 10 years of social media data that has been collected now. So what can you do with that? So she's really thinking about how do you mine this information, but also do things like self-sovereign identity and have just better data protections and privacy, and for you to have more ownership around your data, and to actually have access to it so you can mine it and get real deep insights about yourself. So Samantha has created the Venn Agency, which is really looking at this cross-section between virtual reality, web technologies, the Internet of Things, these edge devices, and peer-to-peer, and different decentralization technologies in general, and looking at the confluence of all these different things and trying to figure out how to solve some real pragmatic problems with the combination of all these things. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Samantha happened on Saturday, June 16th, 2018 at the VRTO conference in Toronto, Canada. So with that, let's go ahead and dive right in.

[00:01:36.923] Samantha Mathews Chase: I'm Samantha Matthews-Chase, and I founded Venn Agency, which is a creative lab, I'd call it. We work right at the intersection of VR and the web and edge IoT mesh peer-to-peer, trying to figure out how we can live in a future where everything around us is intelligent and we can interact with it just as we're intelligent. And that sort of happens at an infrastructure level. So that's sort of where our main bread and butter is: in disrupting the site visits of everything from enterprise to military planning of refugee sites and whatnot. It's taking a scan of the environment and making it available in a shared web environment that anybody can collaborate in. So that's our background with the agency. And then I recently just started a not-for-profit think tank and accelerator around unlocking the life cycle of our personal data, because I've been building these more and more immersive tools, working with the web more intimately, and having sort of been a part of the more underground or edge movements towards decentralizing and distributing the web. While I was at South by Southwest, I felt very concerned with how siloed all the conversations were: there's blockchain over here, and then there's a workshop on how to get acquired by Facebook or Google over there, and just nobody really looking at the major problem happening. This was right before Cambridge Analytica became more publicly known. That's what I was there to speak about with our panel. And I just was like, I don't know what else to do. I kind of woke up in a cold sweat. I saw Peter Diamandis announce an X Prize to, like, embody intelligent machines. And I was like, we don't know what it means to be human. I had an epic, like, sweaty sleep, and kind of on a tangent, called my backers and sort of presented some theories and decided to put it together. I got 10 million backed for a prize initially.
And then I thought a prize isn't a good enough model to solve a problem long term. So I've switched the focus to it being an accelerator now that can really create innovation funnels that can be tracked long term and have more impact than just, say, a prize.

[00:03:58.248] Kent Bye: I know last year in 2017, in May, there was the announcement of the Decentralized Identity Foundation, which included Microsoft and a number of blockchain companies. They basically open-sourced their IP and their technology using the blockchain, and they created some W3C standards, so they're able to create these open standards now for decentralized identity. They've created these protocols that I think are now going to start to be implemented. So now that we have these protocols, what is it that you as an accelerator are going to start to be doing when it comes to decentralized identity or self-sovereign identity?
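[Editor's note: the W3C standards mentioned here center on Decentralized Identifiers (DIDs), strings of the form did:method:identifier that resolve to a public "DID document" listing verification keys. As a rough illustration, not something discussed in the interview, here is a minimal Python sketch; the did:example method name comes from the spec's own illustrative examples, and the regex is a simplification of the full DID grammar.]

```python
import re

# Simplified shape of a DID per the W3C DID Core spec: did:<method>:<method-specific-id>.
# The real grammar is richer; this pattern only covers common simple cases.
DID_PATTERN = re.compile(r"^did:([a-z0-9]+):([A-Za-z0-9.\-_:%]+)$")

def parse_did(did: str) -> dict:
    """Split a DID string into its method and method-specific identifier."""
    match = DID_PATTERN.match(did)
    if not match:
        raise ValueError(f"not a valid DID: {did!r}")
    method, method_id = match.groups()
    return {"method": method, "id": method_id}

def minimal_did_document(did: str, public_key_multibase: str) -> dict:
    """Build a skeletal DID document: the public record a DID resolves to,
    carrying the keys used to authenticate as the identifier's controller."""
    return {
        "@context": "https://www.w3.org/ns/did/v1",
        "id": did,
        "verificationMethod": [{
            "id": f"{did}#key-1",
            "type": "Ed25519VerificationKey2020",
            "controller": did,
            "publicKeyMultibase": public_key_multibase,
        }],
        "authentication": [f"{did}#key-1"],
    }

# A resolver would return a document like this for the identifier:
doc = minimal_did_document("did:example:123456789abcdef", "z6MkTestKey")
```

The point of the design is that the document, not a platform account, is the root of identity: anyone can verify a signature against the listed keys without asking Facebook or Google who you are.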

[00:04:31.377] Samantha Mathews Chase: So after I announced it, I attracted the right people, who pointed me towards the groups at the W3C and the IEEE that are working on these problems at a protocol level. And I went to the Internet Identity Workshop about a month and a half ago and had another freak out. What was that one about? Well, I've been following the work of a lot of the people that speak at these things for a while, and I never jumped in, but it felt, again, very siloed. So people had gone off and were having very granular conversations about certain ways of doing things, which is important. But there's this sort of argument being made, which I think is noble on their behalf: OK, well, if we turn the big ships, and we show them that, oh, actually, identity and the misuse and mismanagement of identity costs you millions and billions of dollars every year, and if you implement this protocol, you'll save money, and it's better for humanity. Great. I love that argument. Unfortunately, what's happened, though, is that all of the smartest people who care about our human rights are now working at an enterprise level again. So we have all of these people that are like, guess how much money your business is going to save? Like, guess this? And people are like, I've been getting a lot of privacy emails lately. Hmm. And there's still this chasm between what people understand. Like, if you Google what's my personal data, it's like, do you want a better cell phone plan? It's just monumental how far we've let the education gap grow between users, which we call users, not people, by the way, and our platforms. And so I went to a workshop and I got a little bit overwhelmed. And then I went to my car and I wrote out this soapbox speech. And I went in and gave it at the end of the day there as well, where I said, I think taking terms like self-sovereign identity and turning them into jargon will be as damaging as blockchain and whatnot if we're not careful.

[00:06:33.618] Kent Bye: And when you say, I mean, self-sovereign identity within itself is a bit of jargon, but you mean sort of furthering the jargonese about it? Or what do you mean by that?

[00:06:40.923] Samantha Mathews Chase: Yeah, I don't think of it that way. To me, it's a very important concept. What I've noticed a lot of, and what I've always tried to have my approach be with this, is because my background is in music production, as an artist, and I sort of tumbled into this rabbit hole very specifically around the virtual web and what that promised, and then understanding how you change infrastructure. And when I look at some of the words, they just end up meaning something just in that context. And to me, if I explain to somebody what it means to be self-sovereign, or the idea of being the captain of your ship, it's important that it doesn't just become some Microsoft partnership.

[00:07:19.085] Kent Bye: Like a branded term that then sort of is dissociated from the concept of self-sovereignty.

[00:07:23.086] Samantha Mathews Chase: Exactly. Yeah. And it's great, because since then I've joined the credentials group, the W3C credentials group. And right now, sort of since that conversation started, we have been writing out use cases with the user at the center. So, finding really interesting use cases that we can present to the next working group, which will write those up into sort of possibilities of protocols to work on. So I think that's where there's a lot of opportunity for people. Maybe they get freaked out by these things, or they're trying to learn about blockchain because they think that's the future, but there's something really important about walking in there as a human and continuously advocating for yourself as such. It's really easy to go into those groups and those working groups and get caught up in whatever's most promising, like, oh, these people are going to turn towards us and adopt this protocol if we just do these things for them, and you just forget why we're doing it in the first place. Yeah.

[00:08:20.719] Kent Bye: Well, I see that part of the big dynamic is this tension between centralization and decentralization. So you have the centralized companies, like the Googles and Facebooks, and they're providing a service that is an amazing service for a lot of people, in that the fact that they've been able to centralize allows them to get money and income to be able to improve the technology that they're developing. I think the challenge is that they have an economic model that is based upon surveillance capitalism, taking our data and then being able to mine it and create these psychographic profiles on us, which is being used in an advertising context, but there's these deeper privacy implications that come with that. And when I hear about these concepts of self-sovereign identity, I'm like, that's great that there's a decentralized solution that's kind of like the antidote, not only for these big, large institutions, but potentially for you to log into any website without having to come up with a new username and password that could get hacked. You know, if there's some sort of equivalent of a digital ID card that you flash, that you have control over, then obviously there's always going to be problems with losing your ID card, or with your device getting stolen. I mean, I think there are still always going to be issues around having a virtualized digital identity and how to manage that, but I could imagine that where this is all going is to actually create a user experience with self-sovereign identity that is just a better experience for people. And the thing that I don't know is, what does that look like? What are those experiences that are really at the leading edge of implementing these W3C standards and self-sovereign identity that are going to be such a great use case that they maybe create a shift away from the powers that get accumulated in the centralized entities and companies that are out there?

[00:10:00.737] Samantha Mathews Chase: Yeah, I think it's interesting, because part of it that I find a little backwards is, DIDs are interesting because, yeah, you should be able to have IDs for multiple contexts of what you're doing. But I like the possibility of a persistent ID. And I think that's where it starts to unlock a lot more interesting concepts, especially when you start to think about things like a multiverse or a metaverse. I have these really awesome Bluetooth speakers at home, and whatever the Bluetooth chip is that's in there, it works so well, and I wish it was in everything, because every time I put my keys in the door, my Bluetooth headphones pick up in my living room. And I'm like, wow, for one moment in my, like, techno-crazy future, I'm like, the future could be kind of cool. If that just kept working that way, this seamless transition that has to do with where I am and where my movement is. What I think is frustrating about the current state of the model of these platforms like Facebook and Google is that they really undercut what's available to you. And I think that it's interesting, because, and this is a really unpopular opinion, but one that I'm trying to kind of push forward more: because you've asked for privacy, you'll never have a personalized experience in this ecosystem. Ever. And the people that are really important in this, that I think will carry and push these use cases, that maybe are as powerful as Facebook and Google, would be car companies, auto companies, joining together to standardize AI cars. And self-driving cars, I think, will become the first widespread-use immersive chambers. And when cars have to decide who you are, how you like your seat, if you live or die, any of those things, will your identifier be in there? When you step into the threshold of that car, how does it know it's you? Is it going to be your Facebook or Google address? Do you want that? Do you want them in charge of all of your preferences? And so I think the promise, as we start to make more immersive environments and more partnerships, sort of things like Salesforce that combine the power of customer experience across different brands, I think that putting the human at the center of that, at least in a way that controls their identity, will become so necessary. Because how else will you be able to be influenced or interacted with in your physical life, when you're not on your screen?

[00:12:24.610] Kent Bye: Yeah, there seems to be this fundamental tension between the benefits of the personalization of sharing your data, but yet the risks of losing that privacy. I know that in Microsoft's keynote, they were talking about homomorphic encryption, which was like, oh, wow, this is the ability to have data stored somewhere, and to do processing on that data, but without them ever actually having access to see that data. So there are different cutting-edge encryption technologies so that that data can be private. I mean, the model that I think of is that our data is our data, and we kind of enter into this little bit of a nebulous state where, if another company is involved in helping capture the data that is normally our data, then is it still our data? I mean, it is our data, but is it a shared relationship where we co-own the data? But is there a way that I still maintain complete control over it? Especially when it comes to biometric data, which I think is going to be this next frontier, the biometric data privacy wars: who has access to it, and what are the implications of what type of information they can glean from biometric data about our unconscious behaviors and patterns? And then there's the challenge that that's very powerful information with which to predict behavior, and if you can predict behavior, then the line between prediction and control starts to blur and starts to go away. So I think that's the issue: it's not just a privacy issue, it's an issue of them having undue power, influence, and control over our psyches, and I think especially if they're creating entire virtual worlds around that. So my conceptualization of this is that potentially there's perhaps some way that, if this data is collected, does it have to be going up to the server? Is there a way to do on-edge device processing of it that is able to send back maybe a high-level, reduced takeaway of whatever model or framework they have, but yet the raw data is still within my control and ownership?
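[Editor's note: the on-edge processing Kent describes here resembles federated-style aggregation: raw readings stay on the device, and only a reduced summary is transmitted. As an illustrative sketch, not anything from the interview, here is a minimal Python example; the device names and summary fields are invented.]

```python
from statistics import mean

def local_summary(raw_samples):
    """Reduce raw sensor readings to a coarse summary on the device itself.
    Only this summary is transmitted; the raw samples never leave the device."""
    return {
        "count": len(raw_samples),
        "mean": mean(raw_samples),
        "max": max(raw_samples),
    }

def server_aggregate(summaries):
    """The server sees only per-device summaries, never the underlying readings."""
    total = sum(s["count"] for s in summaries)
    # A count-weighted mean of per-device means reconstructs the global mean exactly.
    global_mean = sum(s["mean"] * s["count"] for s in summaries) / total
    return {"devices": len(summaries), "total_samples": total, "global_mean": global_mean}

# Hypothetical heart-rate-like readings stay on each device:
device_a = local_summary([62, 64, 66])
device_b = local_summary([70, 72])
report = server_aggregate([device_a, device_b])
```

The design choice is the one Kent is gesturing at: the server still gets a useful model input (the global mean), but the raw data remains under the owner's control.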

[00:14:12.511] Samantha Mathews Chase: You're touching on exactly why I started SOL. SOL, which stands for Sovereign Ownership Under Law, was just this idea that even right down to the devices that we have, we don't own them anymore. And Apple changed our universal IDs to an ad ID.

[00:14:26.867] Kent Bye: What's that mean?

[00:14:27.467] Samantha Mathews Chase: An ad ID is literally what it sounds like, an advertiser's ID. And this is my issue with privacy. It's an elitist concept. It's fairly new to humans. It's the right to disclude. I think it's a purposefully elusive concept to be leveraged for ultimately keeping people ignorant. It's interesting, this idea of privacy, because really, there's been 10 years of the most personal data collected on us. That's what GDPR represents to me. It's like, OK, you have 10 years of data that's been collected on people, never been used at a personal level. And now it's available to you. But it's going to come in this totally ridiculous format, like 40 pages of your Tinder information, or all of these ways. And privacy is sort of this tool that's leveraged, but also this right to information, which is a human right, which to me is a right to confuse or a right to overwhelm. What I'm hoping for, and what I'm asking, is that this movement, if nothing else, causes us to call this the age of insight. We're not in the information age anymore. All that will matter, and who evolves and who doesn't, is who is able to gain access to the information that's flowing all around us, who can gain insight into whatever it is, who can zoom in and zoom out. And I think GDPR is exciting because I call it the general data portability rights, not general data protection regulation, because they're actually rights. This is the first integration of human rights and digital rights. And the idea that you can now make your data portable means that you possibly could mine it, in a model where you put the users back at the center, say. Because right now, the customers for Facebook and Google are those third-party data brokers. And they're parasitic. I think Facebook and Google don't know how to get out of it. 98% of their revenue is aggregate advertising. All of their money comes from this third-party ecosystem.
The lifecycle of a Google product gets shorter and shorter each year. And I'm not sure if they'd know how to retool them to be able to make money in that way. However, they're already storing all this data on us. So we're able to, say, use a decentralized ID that can maybe unlock each of those data pools. Think of them like a locker: all the data that's on you on Facebook, all the data that's on you in Google. And you were able to sort of treat those more like a stream, not so much like property, but as more of a resource, like a common-pooled resource that you both have access to, the collector and the person being collected. So if that's a shared resource, as opposed to treating it like property, which is like, download it and keep it safe, keep it away from people, that's just keeping you away from insight. And that's the problem. That's like ensuring that you do not move on, ensuring that you stay the same. Privacy is a prison, also. It's a wall. And so asking to keep things private, like your name or something that you wrote down on a gym membership or that you're wearing on a necklace, is so pointless at this stage, because we have movement signatures, which are impossible to fake. We have so many profiles on us that, by asking for privacy, we've actually asked to be completely left out of the equation. We've asked to have nothing ever made for us. And we've asked to essentially be kept safe. But to me, that's like the beginning of the womb in the Matrix. And what I'm trying to get on the horn about is the portability aspect of the data, and what our first mission will be at the fund: to start, I imagine, a device, like a chip or some sort of standard, but it could also be a cloud, an encrypted cloud, but a place that allows you to turn on taps of your data and mine insight from it. And just so people are clear: all of this data is already taxonomized and tagged.
Like, we're spending real money on ICOs for companies that are only a white paper. And yet, data is packeted up and sent around the web all day, every day. Its value is tied to its use. It's tied to its uniqueness. And it's tied to its correlative power. And the thing to be concerned about is the correlative power. Because as we allow more and more types of data to be taxonomized across multiple platforms, that correlative power becomes more and more powerful. And that's when the predictive modeling becomes control. And I think what we're presented with is, if everybody were to, say, get all their data, the first goal would be to have a data interpretation. So maybe the same way we're sequencing the human genome, we could sequence all the ways that we're sensed. The same way that there are ethics committees for things like editing DNA, we should be having ethics committees for sensing data. We are 100% post-private. I will argue till I'm blue in the face about that, because a movement signature can be collected from your Wi-Fi router. A movement signature can be collected from your accelerometer. A movement signature can be collected from a camera. That's three totally different places that can identify your movement in an environment. So it's easier to copy your DNA or rip out your eyeball than it is to grow your bones to the exact length and move around like you. And because of that, what we need to be concerned about is how the data is connecting behind the scenes. And so if I think of a world where maybe Facebook and Google can modify themselves with minimal damage to their innovation, it would be to take the same people who are creating predictive models to figure out how sad to make me before I buy pants, and use those same predictive models to be like, hey, did you know that these things make you really sad? And then you buy pants. And you're like, oh my god. Thank you for telling me that. Here's $5. Or here's an Insight app.
So the same data scientists who find out something troubling, like, oh, we can predict just by somebody's Facebook likes if they're going to have a manic state or be depressed on this day, let's sell them these things. So the people who discover these things, I think data scientists, are in a really difficult position right now, because they know that all of their work ultimately will end up in this ecosystem. Any psychologist doing work on the five-factor model, or figuring out what social interactions tell about a person, anybody doing work in that field, must have some sort of weight on their conscience, I imagine. And I think what we can try, at least as a pilot program, would be giving people a little bit of access to their data in a way that's mineable. And I think this would be a way to transition users from users into actual customers. Data is only going to increase in value, and especially data that's more meaningful; a 0.01% click-through rate should not be any sort of measurement of success. So data that's more personalized, that you're contributing to, will be way more valuable. For example, on Spotify, if I've liked three songs from an artist, but I haven't followed them yet, that artist should know that they should be advertising to me, because they have a show coming up in my city and I like three of their songs. That's a good use of targeted advertising, and that's the type of information that is never made available, because it's all about, no, these are women in their 30s who like these types of artists, so you should pay for this package, and the more you pay, the more we'll tell you and let you get a little bit closer to somebody, but not necessarily.
And so, thinking about how we could have more personalized experiences with our data: if Facebook were to open up, say, a little data pool, and it was the same amount of data, with the same vocabulary, as the data from Google, say, and you got this nice little data pool, think of it like an oil field, and then they released apps, like, here's a therapist app, or a coach app, or here's a way to find out what friends you actually talk to the most, or that you're most relieved by, or that, you know, track behaviors over time. And once we start to see that, I think what it'll allow them to do, at least as a test, is to see how much people are willing to pay for insight. Because we know that in these third-party ecosystems, people want to pay for insight to control people. But how much are people themselves willing to pay for insight? And because GDPR is already happening, this isn't a question of, oh, should we do this? It's: if you don't do this, you're going to fall far behind, because people are going to figure out that they can start mining their own data, that they can have access to their own data, and they can actually move their data from you over to somewhere else, into an insight pool. And we're going to have some sort of intelligence race, almost like a moon race again, between Europe and the rest of the world, where essentially now you can have personalized learning for your kids. Or you can know when your kid's heart rate goes up if they're learning about fractions or something like that, and then you'd know that you should maybe download a cooking program that'll teach your kids fractions at home. It allows you to be a sort of parent when you need to, as opposed to having some report on your kid that he's falling behind or something like that. So what you'll see is, we'll have all of a sudden this really customized learning available for kids, available for people, and people will have a divergence again: a data divergence, essentially, between people who know how to use data, people who know how to mine it and gain insight, and people who don't.
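[Editor's note: Chase's "turn on taps of your data and mine insight from it" can be made concrete with a small sketch: given a message log of the kind that appears in GDPR archive downloads, count which friends you actually talk to the most, one of the example apps she mentions. This is purely illustrative; real exports use platform-specific schemas, and the "to" field name here is hypothetical.]

```python
from collections import Counter

def top_contacts(messages, n=3):
    """Rank contacts by outgoing message volume in a personal data export.
    `messages` is a list of dicts; the "to" key is an invented stand-in for
    whatever field a real platform export uses."""
    counts = Counter(m["to"] for m in messages)
    return counts.most_common(n)

# A toy export: who did we message, and how often?
export = [
    {"to": "alice", "text": "..."},
    {"to": "bob", "text": "..."},
    {"to": "alice", "text": "..."},
    {"to": "carol", "text": "..."},
    {"to": "alice", "text": "..."},
]
ranking = top_contacts(export, n=2)
```

Trivial as it is, this is the shape of the "insight app" idea: the same aggregation a data broker would run, pointed back at the person the data describes.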

[00:23:49.098] Kent Bye: Yeah, I have a lot of different thoughts that come up with this. First of all, it reminds me of the quantified self movement, the ability to record data on yourself to get different insights. And what I would say is that, to a certain extent, trying to take quantified data about yourself and draw qualitative insights from it is, I think, a huge open problem. As to the data that has already been collected: it's in the past, and what is the utility? I would question at least what has already been recorded. But as we move forward, there's a philosophical question there as to how much we want to keep on mining this, and how much actual genuine insight we are going to be able to get out of behaviors that may be unconscious in the first place, like clicking on something, or behaviors that may be something you're not even fully conscious of or fully aware of. So I think the potential here, the exalted potential of the quantified self, is to get genuine insight about how to become a better person. Now, the other thing is this future of the GDPR, which is the data portability, and there is going to be a much higher bandwidth of types of information that could be gathered with biometric information, which could start to draw in some of these qualitative aspects or dimensions, or maybe just higher-fidelity, higher-bandwidth information. And yes, there are going to be cameras around that can do gait detection and identify us, and I think there are going to be larger ethical considerations out there. I don't necessarily agree that we're automatically in a post-privacy world, because it's more of an ethical question as to whether or not we as a culture and a collective decide that that's what we want. If we collectively say, no, actually, this is not what we want, I don't think it's a done deal. But at the same time, we're in this Faustian bargain with all these companies, where they are able to provide a lot of genuine personalization the more data we give them. So it's this tension with the trajectory of the companies that are doing that. But I have hesitations around just saying that we're post-private and that privacy is a prison, because there's value in actually not recording anything and just being in the moment, and I think that's sort of the insight of apps like Snapchat, in the sense that things disappear. So do we actually need to capture and record things? And the things that we have recorded, how conscious are we while we're recording them, and how much actual utility or insight are we going to be able to gain from them? I think eventually we'll be able to get there, but to do that translation from the quantified signifiers to actually giving genuine insight, I think, is a huge problem.

[00:26:21.664] Samantha Mathews Chase: Yeah, I totally hear what you're saying. When I say we're post-private, it's not that this is immediate right now, but privacy comes after, and never before, the act of exposure. And things like Cambridge Analytica show how they're using information. Yeah, they got some people to take surveys, but you can just see somebody's public likes and get a lot of information about them. And I think we need to do more to explain what we mean by privacy, especially when it comes to things like data and our thoughts and our mind, our own personal experiences. I understand the argument of the mind outside of the body, like IP. You know, if somebody has Alzheimer's and I write down a memory and I access my memory, we're both accessing our memory; one's just in the external world and one's in my brain. But IP law doesn't take into account all of the things that went into that person to make that thought. And it doesn't say the timeline on how long it would logically take somebody else to come up with that thought. So it locks things in. And in the same way, if we say, well, we're not totally post-private yet, we should still be trying to have some privacy, we don't really know what that means. Keeping data safe isn't the same to people as being able to have a private conversation. And without any nuance to that conversation, we allow technologists to make it a cryptographic thing, or just a thing that's necessary that we must have, and make it either about surveillance or not, and not necessarily about evolution and access. I think that's just why I find the conversation so halting to what's possible, because the desire for privacy, it comes with a lot. It comes with a lot, and I think at certain points we have to be able to change the conversation to what we really need to be doing, as opposed to what we have been doing or what we think we should be doing.

[00:28:15.584] Kent Bye: Yeah, and I'm curious to hear that you're working with virtual reality, immersive technologies, and you talked about the lifecycle of private data. And I'm just curious, as you move forward, how you see the Internet of Things, these edge devices with personalized data lifecycle, self-sovereign identity, and virtual reality. What is it that you see in the future that we could experience with all these things coming together? What are some use cases?

[00:28:40.535] Samantha Mathews Chase: Yeah, one of them I'm working on right now is a cornerstone, like a digital cornerstone or a future-proof cornerstone. So it's a self-powered little server that allows various levels of permissions. It hosts the scan of the building, its blueprints; it can host whatever you want. It's write-to only. You can have a permission for the fire department, for the city, for building inspectors. And it's a way for people, if you own the home and you're going to have a plumber over, they could connect to your contractor network and see all the information relevant to them. If you're just a passerby on the street and you want to know more about your neighborhood, and you want to feel more connected and understand more about zoning, you'd be able to just access the basic low-power Wi-Fi network that's public and just see: oh, this is whatever I'd otherwise have to go dig up at the city. This is available to me right here.
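The tiered-permission idea in the cornerstone described above can be sketched as a tiny access-control table. Everything here, the record names, the tiers, and the roles, is a hypothetical illustration of the concept, not a real protocol or product:

```python
# A minimal sketch of the tiered-permission "digital cornerstone":
# one small server holding building records, where what you can read
# depends on which network or credential you connect with.
# All names and tiers below are invented for illustration.

# Records the cornerstone hosts, each tagged with the tier needed to read it.
RECORDS = {
    "zoning_summary":  {"tier": "public",     "data": "R2 residential, built 1924"},
    "floor_plan":      {"tier": "contractor", "data": "3 floors, scan v2"},
    "fire_access_map": {"tier": "emergency",  "data": "stairwells N/S, standpipe E wall"},
    "inspection_log":  {"tier": "city",       "data": "last inspection 2017-09"},
}

# Which tiers each role's credential unlocks (illustrative only).
ROLE_TIERS = {
    "passerby":    {"public"},
    "plumber":     {"public", "contractor"},
    "firefighter": {"public", "emergency"},
    "inspector":   {"public", "city", "contractor", "emergency"},
}

def readable_records(role):
    """Return the record names a given role may read from the cornerstone."""
    tiers = ROLE_TIERS.get(role, {"public"})  # unknown visitors get public only
    return sorted(name for name, rec in RECORDS.items() if rec["tier"] in tiers)
```

A passerby on the public Wi-Fi would see only `zoning_summary`, while a firefighter's credential would also unlock `fire_access_map`.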

[00:29:32.045] Kent Bye: And I can imagine that you would maybe have an AR window into that as well, to spatialize it. So it's like a little Internet of Things edge device that is broadcasting Wi-Fi. So it's presumably sending out information about something. It's like a node that you're able to connect to. And then you could potentially broadcast that and see a spatialized version of it.

[00:29:50.791] Samantha Mathews Chase: Yeah, exactly. So if you're a firefighter and you're coming into this big blaze and you have no idea where the rooms are or what the place looks like, you're putting yourself at a huge risk. There are lots of other things: builders and contractors are always making fake basements and things that it's really hard for the city to keep up with. And generally, other than when a building changes owners and it's like a big public sort of thing, like a church, there isn't much change to the building. We scanned Amoeba Records in LA, because I think they're tearing it down, and I've been trying to get in touch with the contractor building the condo, because I think a cornerstone would allow you, even if you change the building completely, if it's that site, to, like, connect to the lobby of the old Amoeba. And what I believe is possible for the future, and what my vision of the singularity is, is this: in big data, you can really zoom in and zoom out. And what I understand of the singularity is not necessarily us having wider focus or an ability to know all at once, but the ability to gaze upon something, or focus upon something, or connect to something, and know about it through time. I think if we were to start making everything have a virtual cornerstone, say, or a digital cornerstone, if everything had sort of a black box, right? You've got computers that IBM is making that are the size of a grain of salt. So you could imagine that even a rose that you buy could come with its own network. And when everything starts to just have its own little information inside of it, its own little tag, then when you walk out your door in a few years, you might be like, maybe I just want to be on plant watch today. I want to know that that species came over on Spanish ships in the 1800s. I want to know that that's not a native species, but that is.
Maybe I just want to see where all the planes are above me and where they're going. I just want to zoom in on stuff. And I think about people like beachcombers, who just track the shores and the erosion. If they were able to just consistently add their information to a network that was on the beach, then when you went to the beach, you'd be able to see patterns over time. I think our ability to understand the construct of time will become more powerful when we're able to see patterns of an area, or of a thing, over long periods of time. If you start to see the repetition of it, then you know what its future is. You can kind of see into its future, in a way, because you understand it from its origin. When I think about what a singularity would be, or what God is, it would be the ability to have relationship with whatever you're focusing in on, in a way that takes time away into a state of timelessness, because I think God is just relationship. So you can just connect to a rose, and you know that rose's history is in that, through time. Or you can just connect to this. Or you can even reference your own human history, or your data history through your family. I'm not too sure. But that, to me, is what the singularity is. It's being able to know anything at any moment by creating relationship with it, through thought or gaze or intention.

[00:32:57.940] Kent Bye: Whenever I hear the singularity, I automatically think of Ray Kurzweil and sort of the technological singularity, by which technology is evolving so quickly that you basically can't keep up with it. And I think of it as more of an AI and technological thing. But the things that you're talking about I associate more with contemplative spiritual practices of either ego dissolution or almost like an enlightened samadhi state of consciousness, where you're able to get these types of timeless qualities. I think anybody who meditates can already get access to that timelessness. But yeah, so why does it have to be associated with the singularity, or what is the singularity? How is the singularity going to bring this dimension of timelessness?

[00:33:37.149] Samantha Mathews Chase: Well, I think right now when people refer to the singularity, they're thinking of it that way because we're putting everything in a cloud. But what if we didn't put it all in a cloud? What if the cloud was in everything? Wouldn't that change what the singularity is and make it more of an appearance of God, or an experience of universe, in a way? We're looking at it as this thing that's collecting into this cloud that's going to overpower us. But if we are able to distribute it, then how will it be more powerful than having it all around us? I didn't expect people to just be making devices that all connect to a cloud. I thought that I'd be able to just connect to anything wherever I went. I thought I'd be able to have my oven... you know, I know there are smart ovens and things like that, but I didn't think it would all be run through my phone. And I didn't think it would all be going to a server somewhere. Because we have the ability to have intelligence right in front of us, in a grain of sand. Why are we sending it away? And maybe my version or understanding of the singularity is different, but I just have a completely different vision of the future. And it has absolutely nothing to do with the interconnectedness of, like, a crappy internet and botnets and sending something away to Iceland just to say hi to a friend. None of that is the future that I want. And so everything that I'm building is about, I guess, edge. And I didn't even know it was called edge until I looked up a market research thing and somebody kept telling me what I was doing was edge. Now they have fog, which is like the edge of the cloud, sort of networked devices. And I see people grouping, and it feels a lot like, if you were to take a really high-level view, there are always these sort of different competing kinds of intelligences trying to come online, you know.
We split from an octopus 60 million years or so ago, but they're super conscious creatures. Their whole body's a brain. And some people think that our stomach was at some point a different group of cells and organisms that came together. So I think about consciousness, and what it means to be aware of yourself. I think a lot of it has to do with a group consciousness, and this sort of mirror neuron network that came online like 80,000 years ago that gave us these sort of empathy neurons and this ability to mimic each other. It's like when, you know, the Neanderthals sort of fell away, and they didn't have the same access to sort of learn. Like, we can learn. And having that experience of just being able to see something and then know it, taken to the extreme, and with the communication network of the web, we see people hashtagging things. That used to be a way to, like, oh, this is a way for me to find something else or find community. And now we're hashtagging things to train an intelligence on how we see or how we do something, and we never had intended that. So, in a way, that idea of an intelligence overtaking us, maybe the scarier side of a singularity, is something we're training it into doing, like, unwittingly. And I think about the power of groups, especially from a psychological perspective: because we're training AIs on our hashtags, and our hashtags are more and more political, or speak more and more to what we believe in or what side we fall on, they can become weaponized so much more easily.
You know, it's like I never expected to be in a future where we name our genders and our race and everything like that as hashtags, in a way that perhaps originally was intended to find community but now seems like, no, this is my identity, right? And as we train those tags on outside appearances and societal constructs, we further build these psychographic profiles, as you say, on what motivates us. So I'm concerned that if we continue to allow group intelligence through group communication, and we don't give people individualized and personalized control to fine-tune their own reality, their own perception of self, then we'll end up in, I don't know, not a good place. Feels like it's happening in a lot of ways already. What are your... do you see that?

[00:37:49.222] Kent Bye: I see a number of different huge trends that are happening. First of all, there's sort of an underlying ethical crisis within technology that is kind of the equivalent of when the physicists created the nuclear bomb or the chemists created chemical weapons. There are things that the technology could do that impact the collective in ways that have moral implications, which means there are ethical discussions happening right now. So that's happening in all different dimensions of our culture: society, politics, technology. Ethics is a big trend. So I think that we have these technologies that are there, and there's an inherent polarity of the exalted potentials and also the terrible uses of anything that humans have ever created. Any tool that we create could be used for doing great benefit but also great harm. And so it then becomes a matter of the values and the ethics of the culture, and how to use it. But there are also deeper dimensions of the battle between centralization and decentralization, as well as these inherent power-law curve dynamics by which those who have access to power, money, and resources have an exponentially increasing potential to get more, so that the game of Monopoly, as Philip Rosedale says, kind of plays out very quickly, where the rich get richer and the poor get poorer. So in that context we have these power dynamics. And so we have the power dynamics of big centralized companies, we have the acceleration of technology, and we have people who have less and less agency over how they interface with this. And if there's any trend that I see, it's that these decentralized systems are trying to perhaps take power away from those centralized solutions to create alternatives. But at the end of the day, we have to ask, what is this all for? Why are we doing this?
And to me, it's more than anything else to be more connected to ourselves, to be more connected to other people, and to be more connected to the planet. Because we're also in the whole context of a deeper ecological crisis, where we've fallen out of relationship with the Earth. And so I see that these immersive technologies can cultivate this awareness of presence that can actually bring us back into connection and harmony with the Earth, trying to find ways to get all of this into harmony in a way that's ecologically sustainable as well as ethically sustainable, as well as this balance between the centralized powers and decentralized powers. That, to me, is the overarching trend. And the concepts of the singularity are more of a vague thing that I associate more with contemplative spiritual practices that are inner or noetic, that you can go and have, but I don't expect to have a transcendent experience mediated through technology. I think it's going to be from my own ability to connect to myself, to connect to other people, and to be in connection and harmony with the rest of the planet.

[00:40:31.187] Samantha Mathews Chase: Mm-hmm. And are you familiar with, like, commons or common pool resources, or have you heard of the tragedy of the commons?

[00:40:38.130] Kent Bye: Yeah, I've heard of the tragedy of the commons.

[00:40:39.490] Samantha Mathews Chase: Yeah, so I've been really fascinated. There are some really interesting things happening right now with large amounts of data and people trying to figure out how to use them, and commons have kind of reemerged. They've never really gone away. Elinor Ostrom won the Nobel Prize for her book Governing the Commons, and I think anybody who's interested in blockchain or cryptocurrency, especially people that are in the ICO madness, should take a moment and read that book. It's about all the different ways that people have successfully come up with governance models to manage long-term natural resources. Usually those are finite, but like in Alanya, there's sort of a fishing grid that has all the best spots, which people draw for each day, continuously changing, and it's been self-governed for quite a while. The tragedy of the commons essentially just says that, you know, if you're going to get yours, I should get mine. But it doesn't take into account anything like the transparency of the community who's living around this sort of pasture, or whatever context you want to put it into. And what blockchain and all these new protocols provide are transparent methods of governance, which is essentially how people govern common resources. How it's reemerging now is a lot in health data. There's, like, Cancer Commons, or the gut biome, citizen scientists that are sort of trying to sequence the gut biome. And there's usually sort of a personal reason to want to contribute data to one of these pools, these commons. Or you have two organizations, like two huge health organizations, that want to integrate their data for research. And how that all gets integrated and built is what would eventually become the basis of, like, a mining pool for insight.
And so I think what can happen and shift is those parasitic third parties can change. Take the gut biome, for example: you poop in a bag and you mail it to these people, and your contribution is that you're a part of it. Great. Hopefully you'll help people solve, you know, Crohn's or something like that. Maybe you're a researcher and you want to start contributing, so you're just doing that out of the goodness of your heart. But think about more of a model that you see a lot in the blockchain community, which is: I'm going to invest in this with my poop, or whatever it is. I'm investing with my data sample, and I get some sort of hash or wallet. And if I'm a researcher and I want to start contributing my research, I get some sort of token in there as well. If the Mayo Clinic wants to come along and use this data, they buy it, and then we all get paid. There's so much money and information and understanding that can come from certain kinds of data pools that I think there'll be a real emergence of commons coming out. I'm seeing it more and more: there's a blockchain commons, there are people who are like, okay, how do we regulate these things differently? IP is an archaic concept. How do we treat this more like electricity, something to be metered, something to be accessed? And I encourage people to dive into a lot of these old use cases of commons that have worked, because I think they're really relevant now.
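The contribute-and-get-paid model described above, deposit a data sample, receive shares, and split the proceeds when an institution buys access, can be reduced to a toy ledger. This is purely illustrative; no real blockchain, token standard, or health-data platform is being modeled, and all names are invented:

```python
# A toy sketch of the data-commons payout model: contributors deposit
# samples (or research work) and earn shares; when a buyer pays for
# access, proceeds are split pro rata by shares. Illustrative only.

class DataCommons:
    def __init__(self):
        self.shares = {}  # contributor name -> number of shares held

    def contribute(self, who, n_shares=1):
        """Record a data or research contribution as shares in the pool."""
        self.shares[who] = self.shares.get(who, 0) + n_shares

    def sell_access(self, price):
        """Split a buyer's payment among contributors, pro rata by shares."""
        total = sum(self.shares.values())
        return {who: price * n / total for who, n in self.shares.items()}

commons = DataCommons()
commons.contribute("donor_a")        # one data sample
commons.contribute("donor_b")        # one data sample
commons.contribute("researcher", 2)  # analysis work, weighted higher here
payout = commons.sell_access(100.0)  # e.g. a clinic buys dataset access
# donor_a and donor_b each receive 25.0; researcher receives 50.0
```

How shares are weighted between raw data and research labor is exactly the kind of governance question Ostrom-style commons rules would have to settle.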

[00:43:46.463] Kent Bye: Yeah, and as we're talking about all this stuff, you're at the cross-section of all these different technologies. And is there anything that you personally want to experience with the confluence of all these things when it comes to data lifecycle or the edge devices or immersive technologies? And what is it that you personally want to experience?

[00:44:04.831] Samantha Mathews Chase: I want to be more adaptive. I had a really interesting last couple of years. I was a touring musician, and I was in a documentary that turned my character into a comic. During that time I was moving out of being an artist and into starting my own company, and it was a really weird transition. But I felt like, how do I let go of this thing in a way that's loving, and how can I archive myself in a way that's not... I feel a lot of times we're like, oh, I broke up with that person, unpublish. Or like, oh, I was in my 20s, I was such a mess. And you don't have this acknowledgment that that brought you to this moment. So, through the comic, which is about these entertainment robots in the future that become sentient and play music based off people's emotions... I was working with Janus early on, and there wasn't any music in the rooms, and they had a jukebox script. The skin didn't load, and it was just this node floating there in this room with music playing. And I was like, oh, you could put that in anything. Can you put it in my bot? And then we made this likeness bot from my comic that plays all my music. And it's like this centralized point where I can put all my plays and all of my views and everything like that. So from an artist's perspective, especially as things disappear, like Vine and whatnot, it's nice to be able to have a place that holds and tracks your online weight. But for me, what happened was I was able to look at myself separately. I was able to step out of my story world that I'd been so much a part of and be like, that's in a book now, or that's there now, that's her. So what am I? And it allowed me to kind of be like, I need a dope origin story if I have this comic book hero. So I got a cool dog and a cool car and a hacker garage, and truly, in my heart, I see things now as like, well, what's this chapter? Who's this character? And I think we all have that.
I mean, aspects of our humanness that can't be broken down into anything else. And those are our archetypes. And I think we have so many different versions of ourselves inside of us. And for me, being able to move on from my 20s in a way that was... I actually took everything I ever wrote in my 20s, every article I ever wrote, everything I ever said privately, and I made a chatbot. And I took all my photos and melded them into one, so you can go talk to her and she just says random sorts of things I used to say. It's awkward and very first-gen, but you get this experience... I feel like there's something about it I can't put my finger on, but it holds this energy of me then. So what I want is to be able to not only archive myself in that way, to go back and interact with myself in that time, not just be like, look at those memories, but then also to be like, okay, I'm running on the treadmill right now, this is right when I usually quit, so train my AI to be like, okay, tell me this is when you're a little bitch. And instead of having somebody be like, great job, I'm like, hey, this is when you're a little bitch, remember? Don't quit. And you're like, yeah, okay, right, right, right. Instead of, keep up the great work. That's the kind of person I am. I want to have versions of myself that know how to deal with me, or that I can put into situations. I don't want Google, I don't know if you saw the I/O, making it super easy for me to invite people over for a play date. I'd rather be like, oh, this is boss Samantha, and she needs to go be a boss in this situation.
It starts off with, here's all the times that you were a boss, and this is how you acted, and this is what you look like when you're a boss. And you have this intense moment of, like, maybe just this reminder, and maybe it just starts as this kind of coach, but through that coach you're training that version of yourself, and you're honing in on that aspect of your personality. To me, it's like when you're taking adaptogens to try to switch focus and switch tasks: I want to have AIs that are very much in the presence of mind that I need for that situation, and I want to be able to version myself so I can send out better versions of myself. I don't want to always just be like, oh, I'm flaily Samantha, awkward about the world, thinking about my ex, I kind of have to pee. Like, hi, why are you talking to me? That's not the best all the time. There's so many other things I have to keep track of. So if I can start training better versions of myself, knowing when to archive myself, that could kind of become like a Burning Man type experience, where you're like, hey, I was Camp Bye doing VR stuff for a while, and I'm doing something else now. But instead of just being like, I'm moving on from it, farewell, it's like, here's Camp Bye from the first four years of VR, right? And, like, here's Camp Bye now. But you could go back and have this experience of yourself that isn't like some throwaway file.
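The "archive yourself as a chatbot" idea Chase describes can be reduced to its simplest possible form: a retrieval bot that, given a small corpus of your old posts, replies with whichever line shares the most words with the prompt. The corpus below is invented for illustration; a real version would ingest a GDPR data export, and her actual bot is far richer than this word-overlap sketch:

```python
# A minimal retrieval "archive bot": reply with the archived line that
# has the largest word overlap with the prompt. Illustration only.

def best_reply(prompt, corpus):
    """Return the archived line with the largest word overlap with the prompt."""
    prompt_words = set(prompt.lower().split())

    def overlap(line):
        return len(prompt_words & set(line.lower().split()))

    return max(corpus, key=overlap)

# Invented stand-in for an export of old posts.
ARCHIVE = [
    "touring again this summer, the van smells like amps",
    "new song up on soundcloud tonight",
    "quitting the treadmill early again, ugh",
]

print(best_reply("what song are you working on", ARCHIVE))
# -> "new song up on soundcloud tonight"
```

Even this crude matching gives the "she just says random sorts of things I used to say" effect; swapping in embeddings or a language model fine-tuned on the archive is the obvious next step.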

[00:48:34.215] Kent Bye: Yeah, I get this theme that you keep coming back to, which is time, and seeing things back in time, and, from the Kairos time perspective, these loops and cycles of time. So we, like, look into the past to be able to look into the future. So yeah, that's beautiful. That's great. And finally, what do you think is kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:48:59.106] Samantha Mathews Chase: I think virtual reality allows us the perfect test ground for everything that we want to build in the real world. To me, virtual reality is just a screen. It's just like a TV or something. What I'm excited about is the plane that it expands upon. All of a sudden, adding a new axis changes everything about how we do everything. And I think the potential is only just starting to get unlocked, and it'll get unlocked the more people try and get right down to the very bottom protocol layer and do work like you're doing, where you're getting right down even to the consciousness of the mind. Because just like we have to start from a protocol layer if we want to build an entirely new vision for the future, we also have to do the exact same work in our mind.

[00:49:44.266] Kent Bye: Great. And is there anything else that's left unsaid that you'd like to say to the immersive community?

[00:49:48.668] Samantha Mathews Chase: Yeah, if anyone's concerned about the way data works, and I mean you talk about this a lot on your podcast, I really encourage people to join a working group. There are people that just meet every week and discuss ideas, and they all have their own lives and their own jobs, but you can pick a place to be really particular. And I think it's beyond just writing a Medium article or a tweet or things like that. If you really want to have any effective change, if you really want to see anything work differently, you've got to get down way below the waves. But they're right there. They're free. The people are great. They're very approachable, and you get to, like, literally change the fabric of reality a little bit. It's kind of cool. Yeah.

[00:50:31.435] Kent Bye: Yeah, when I was at Sundance, I had a chance to see a documentary about Ruth Bader Ginsburg and the work that she's been doing in terms of the laws of the land. And I think that the laws of the internet, to some extent, are these protocols. And what you're saying is very true in the sense that if you really do actually want to bring about change at a gross, massive scale, then those internet protocol layer dimensions of reality are really impacting base reality of this whole thing that we're creating.

[00:50:58.024] Samantha Mathews Chase: Exactly. Yeah, it's kind of everything. It's not even that we need, this internet isn't going to be the only internet, but if we want to even make new internets and new communication systems, you want to be working with people that are going to fight you and stay in the room until you guys both agree on something. Those are the people you want to surround yourself with in this time.

[00:51:18.007] Kent Bye: Awesome. Well, I just want to thank you for joining me today on the podcast, so thank you.

[00:51:23.949] Samantha Mathews Chase: It's a real honor. I've been a fan and just love your work, so thank you.

[00:51:28.338] Kent Bye: So that was Samantha Mathews Chase of the Venn Agency, talking about VR, the web, the Internet of Things, edge devices, peer-to-peer, and decentralization. So I have a number of different takeaways from this interview. First of all, the thing that pops into my mind is Samantha's declaration that privacy is dead. And I think what she means is that there are going to be certain biometric markers that you have that are going to be pretty difficult to hide. You have certain bone lengths, the way that you move is very distinct, you have gait detection. And so if anybody wanted to train algorithms on the way that you move through a space, either in real reality or in virtual reality, then there's a certain amount of our anonymity where it's going to be very difficult to put that genie back in the bottle. And so it then becomes a larger issue around the ethics of what we do with our privacy, and what are the collective decisions we're going to make as a society around that. But also, she was really thinking about this data portability, you know, with the GDPR, and thinking about it as not just data protection, but data portability, to be able to actually download the data. And now that you have all this data, what kind of insights can you mine from it? And so this is an interesting concept, because, you know, the concept of the blockchain is that you have this kind of immutable data, it's permissionless, it's out there, and, you know, you have a free market of different companies that are able to come in and do different things and remix that data and provide other different types of value and services on top of it. It's like the backbone of this decentralized future. Now, when it comes to your private data, though, you don't want to have it all out there and accessible for anybody to look at.
And so you're going to potentially have these different companies that are able to take all of this data that you have and extrapolate some sort of qualitative insights about you, helping you become more aware of yourself. And I think this is a bit of an inevitability. A lot of these technologies do provide this capacity for self-reflection and for learning about ourselves. And I think the danger is that all this biometric data is also like these signifiers of our unconscious behaviors and our patterns. And so if there are bad actors that want to gather up all this unconscious data that is being radiated through these biometric indicators, then they could start to create predictive models that allow them to predict and potentially control our behaviors in different ways. And so the larger movement of self-sovereign identity is this kind of taking back ownership of all of our data and being able to have much more agency around it. And so another inspiring thing to hear from Samantha is that she actually heard this interview that I did with Alberto Elias and started to really dive into the self-sovereign identity specs. And she actually got a lot more involved in the process of helping to figure out some of these specifications around identity. So she's really taking a proactive approach of looking at the protocol layer, and how these protocols are going to be helping to shape and form our experience of reality in the future.
So you're really kind of modulating reality at the protocol level, and it was really encouraging to hear that she's been diving into that. It was also encouraging to hear that she's an artist and musician who, you know, kind of went through this phase in her 20s, and she created this archive of herself, which she describes as this chatbot with all of her social media and SoundCloud, just trying to fuse together all these different experiences that she had from her 20s and create these different virtual worlds that are trying to, I guess, archive aspects of her past. She said that she had been kind of kicked off various different social media platforms, and had that experience of putting out these different videos of yourself and having them out in the world and controlled by a centralized entity. But if for whatever reason that entity decides to take you off, whether it's from copyright violations or using a sample or whatever it may be, she just felt like she wanted to, you know, take more ownership around controlling her own identity and be able to archive herself in different ways. And so she's using these technologies as almost like an art project, to be able to kind of remix different aspects of the expressions of your identity and to have these face-to-face conversations with an AI bot based upon, you know, your tweets and your previous conversations. I think we're going to see more and more people starting to do these various things in the future. And finally, there's this concept of the edge device: these devices that are going to be at the edge of the network, and they may not be storing information at all, but they may be able to process different information. So what's it mean for these Internet of Things
devices to be out there and, for example, put something out that may be broadcasting the blueprints of my home or any building downtown, so that if there is some sort of emergency, a firefighter could have these augmented reality glasses. And if these different devices in some sort of mesh network are broadcasting this information, and you have the right augmented reality headsets, then if there's a fire, it may actually help to save lives or prevent a lot more damage from happening. Going to Microsoft's Build conference, I heard a lot more about the specific edge devices that they're starting to build, and I think we're just going to see a lot more of this process of having edge devices all around us that are gathering information and data about our environment. And the thing that Samantha wants is to be able to actually have access to the data that those edge devices are creating, so that she can mine it for these deeper insights to learn more about herself, kind of inspired by this deeper quantified-self movement. So, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do tell your friends, spread the word, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon your donations in order to continue to bring you this coverage. So, you can donate today at patreon.com slash voicesofvr. Thanks for listening.
