#585: Consciousness Hacking & VR: Will immersive tech drive addictions or help us transcend compulsions?

There is a growing backlash against technology, catalyzed by some of the very architects of its persuasive, habit-forming techniques. The Guardian surveyed user experience designers and engineers who are taking drastic steps to curtail their own technology addictions, and who are asking deeper questions about the ethical responsibility of major Silicon Valley companies to be socially responsible guardians of the attention economy. Tristan Harris is one of these former persuasive designers; he founded a non-profit called Time Well Spent, which gathers quantified data on how happy people are with different mobile apps. On Sam Harris' podcast, Harris shared data suggesting that people are happy with about a third of the time they spend on the most popular apps, but unhappy with or regretful about the other two thirds.

Most companies are optimizing for duration on their websites, but it's difficult for them to measure the first-person, phenomenological experience of that time spent on their site. More and more people feel that they are being manipulated and hooked into forming habits by apps designed to reward compulsive behaviors. There's a growing counter-movement of consciousness hackers who are taking a more mindful and purposeful approach to how they use technology. They're using biosensors to get feedback and insight into their behaviors, and are trying to cultivate more flourishing, well-being, connection, compassion, mental health, and sanity in their lives.

At the Institute of Noetic Sciences Conference in July, I had a chance to sit down with Mikey Siegel, the founder of Consciousness Hacking, as well as with Julia Mossbridge, the director of the Innovation Lab at the Institute of Noetic Sciences. We talk about transcendence technology, quantified-self applications designed for transformation, designing human-aware artificial intelligence optimized for emotional intelligence and cultivating compassion, the matching problem, and the insights of neurophenomenology for combining first-person and third-person data.


This is a listener-supported podcast; consider making a donation to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Support Voices of VR


Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. So there is a huge backlash that is brewing against technology. And I'm doing the Voices of VR podcast, so on one hand I am advocating for technology, but on the other hand I am feeling and experiencing a lot of that same backlash in terms of how I treat my cell phone, how I interact with social media, and how I feel like there's this abusive relationship that I have cultivated with my cell phone and with technology, in that I can get very caught up in searching for connection, or likes, or distraction, or looking at the latest news. And there's a larger movement of people within Silicon Valley that's probably best demonstrated by Tristan Harris, who was at Stanford, where he learned all these persuasive design techniques that get embedded within technology. That technology is basically being designed to hook you in and keep you compulsively interacting with the products, because the more you are compulsively hooked in and have screen-on time, the more likely you are to click on the ads, and the more money these companies make. And so Time Well Spent is one of the organizations that Tristan Harris has created. And if you haven't listened to the interview that Sam Harris did with Tristan Harris back on April 14, 2017, about what technology is doing to us, I highly recommend it. It's a podcast that I've been referring to over and over again, and it's something that I wanted to cover here on the Voices of VR, which I'll be diving into today. But just to set the scene a little bit more: Tristan Harris was at Stanford, he learned about all these persuasive design techniques, and then he was at Google. And he was basically saying, hey, what are we doing? How are we hooking people in? And maybe we should think about this in a little bit deeper way.
And then he was put in charge of design ethics as an overall philosopher at Google to think about this, but he wasn't getting a lot of support, and he eventually left because he felt he could probably do more good outside of Google, based upon what he was able to see from being trained in these behavioral techniques that hook you into technology. So Tristan has all sorts of techniques for breaking this addiction to technology, whether it's turning off notifications or turning your screen black and white so that you don't get those little red bubbles on your cell phone that trigger you to push the button to make them go away; those triggers tap into fixed action patterns in order to create these addictive loops. It's like technology companies are turning us into addicted gamblers or drug users, and they're using our connections to our friends and family to hook us in. Now, that said, there's also a huge amount of free will that we have: are we able to maintain control over technology? Are we able to have good boundaries? Are we able to create habits that are healthy? The problem is that the entire tech industry is not supporting us in that effort to have a healthy relationship with our cell phones. In fact, what they're doing is optimizing for profit, tuning algorithms so that we spend more screen time on these applications. And as a result, we've created this attention economy where about one third of our time we're happy with, and about two thirds of our time we're filled with regret and unhappiness over how we're spending time on our phones. And so on today's episode, I actually have a chance to sit down with a couple of people who are in this movement to rethink our relationship to technology and to recontextualize what are the deeper things that we're going for.
Are we searching for connection or love? And are there other ways that you can actually sustain yourself, and think about these larger ethical questions within the tech industry? So I talked to Mikey Siegel, the founder of Consciousness Hacking, who has been looking at transformational technologies that try to improve our lives in different ways. And also Julia Mossbridge, the Director of the Innovation Lab at the Institute of Noetic Sciences, which does a lot of consciousness research. They're looking at technology through the lens of cultivating presence through different contemplative practices: what is the connection between those practices and what is happening within technology, and what insights could they perhaps give us so that we could take control of our lives a little bit more? When I talk about virtual reality, a lot of people are just disgusted; they say, I don't need more addiction to technology in my life. And I think that virtual reality could actually provide an antidote to some of the screen-based addictions that we have with our cell phones, because it has the potential to cultivate within us the ability to get more present, and, with more ambient computing technologies, to move us away from screens toward more natural, intuitive, conversational interfaces. But at the same time, if the companies that are driving the innovation of virtual reality technologies are not taking into consideration some of these deeper questions, then we're going to be recreating these dystopian futures of addiction that I think a lot of people rightly fear. I don't think it's a closed case whether virtual reality and immersive technologies are going to end up more utopian or dystopian.
It's kind of up to us, but it's also up to us to challenge the major companies that are designing the algorithms that will be driving our behavior. So we're going to take a step back and look at all of this on today's episode of the Voices of VR podcast. This interview with Mikey and Julia happened at the Institute of Noetic Sciences in Oakland, California on Saturday, July 22, 2017. So with that, let's go ahead and dive right in.

[00:06:14.861] Julia Mossbridge: Hi, my name is Julia Mossbridge. I'm the Director of the Innovation Lab at the Institute of Noetic Sciences, I have a joint appointment at Northwestern University, and I'm also the Science Director at Focus@Will Labs. What's exciting to me about transcendence technology, or what I think transcendence technology is, is technology that allows us to go up and beyond ourselves. So what we're transcending is ourselves, and joining with what some people might call the sacred, some people might call your higher power, some people might call your community, some people might call God. So there's something that is above and beyond our individual selves and also something that works through our individual selves; it's part and parcel of ourselves and yet larger than ourselves. Technology has always been used to help amplify particular kinds of human experience, and this is another human experience that I'm hoping to contribute to amplifying. But technology has also been used to examine, to sort of shed light on, our understanding of particular human experiences. So I feel that transcendence technology is about all of those things: amplifying, characterizing, shedding light on. And that's where I'm coming from.

[00:07:36.157] Mikey Siegel: Thanks, Julia. My name is Mikey Siegel, and I'm the founder of Consciousness Hacking, which is a global movement focusing on the intersection of technology, consciousness, and well-being. I also co-founded the Transformative Technology Conference, and until recently I was teaching at Stanford University; I'm taking a little break right now. And I love hearing Julia present what she's doing, because I feel so resonant and so aligned, and yet I don't describe what I'm doing in the same way, so it's great. I think we have a similar deeper intention and then use different words to talk about it. I come from an engineering and technology background, so that's often my lens, and I really think of the thousands of years of spiritual, religious, and yogic traditions, culminating now in positive psychology and other forms of personal growth and mental, emotional, psychological, and spiritual well-being, as all being tools and technologies. These are systems, structures, techniques, and protocols that we've created to help us reach our greatest possible potential. And what I'm interested in is the cutting edge of those tools for transformation: how do they evolve and change as culture and humanity evolve and change? And we are in a rocket speeding with its blasters at full force into a technological culture where we are surrounded by gadgets and information and technology in everything that we do. The growing feeling I have is that, from one perspective, there's an incredible potential: if we harness that scientific and technological capability in support of human flourishing, of human well-being, of human connection, of compassion, of mental health and sanity, then we have the power to uplift humanity in a profound way. But there's also a kind of doom and gloom perspective on it as well. There's a quote I love from the United Nations.
It says: since wars begin in the minds of men, it is in the minds of men and women that the defenses of peace must be constructed. And what I take from that quote is first the recognition that the crises we face as humanity are not magical, spontaneous things that just arise. The majority of the conflict and crises that we face are human-made problems. And they're human-made problems that come from human minds, from internal conflict, from human pain and human suffering. So if we can create better tools for dealing with human pain, human suffering, and human internal conflict, then I believe we can change the world from the inside out. And if we don't create better tools to do that, then I think we're kind of screwed.

[00:10:41.811] Kent Bye: I think the deeper context of a lot of these technologies is that they exist within companies that are trying to make a profit. And so they have an incentive to hook people into these addictive loops of escaping into these worlds, where they kind of hack our fixed action patterns to keep us engaged and optimize for those engagement numbers, without perhaps a larger concern for the deeper ethical implications of what we're actually creating. And I think that's a lot of the fear that people have around technology: is it going to be yet another manifestation of things that keep us in addictive loops that disconnect us from ourselves, from each other, and from the world? So I'm curious to hear each of your perspectives. Is it a matter of ethics? Is it a matter of deeper intention, of having a framework around spiritual transformation, or what it means to live the good life? And how are you each taking an approach of trying to embed that into different technological applications?

[00:11:40.308] Julia Mossbridge: So yeah, it is going to be like that. There are going to be people who, as always, produce technologies that exploit the human brain's reward system, right? Because that's a pretty good way to make money. So you have a Starbucks that says, you know, that's a great way to make money: let's sell an addictive product. Same with gamifying anything. You get your reward system going, you get intermittent reinforcement, it becomes like gambling, and then it's addictive. So there will always be people doing that. And then we're hoping to expand, and I guess I feel safe speaking for both Mikey and myself, we are trying to expand the number of people, the number of groups around the world, who are saying: okay, we could do that, and it doesn't feel good to do that. It doesn't feel good to put people in that position, even if it does make money. And so we're going to try to make money and do good in the world in a different way. So, something I'm really passionate about, my background is in neuroscience, and something I'm really passionate about is actually testing empirically whether the thing that we're creating does any good: having metrics, and deciding what the metrics actually are. But even before that, that's getting down to the nitty-gritty, even before that, you need to have a culture in which this tech is created in which people are, first of all, acknowledging that your inner experience of what you're creating matters. It really matters to technologists. I mean, the ones I've spoken with anyway, and I've spoken with plenty, not all of them, but plenty. It really matters that people feel good about what they're creating. So that's one piece: to acknowledge that that is not just a byproduct, it's actually something that matters, and it matters in the creation of the technology itself.
So the technology that gets created has a good feeling to it to the extent that the people who were creating it had a good feeling about themselves when they were making it. People catch on to those things, and negative feelings come through too, right? And so one of the ways that I'm trying to help technologists understand that their own internal experience really matters, now that Silicon Valley is kind of like the new Hollywood, is to work with technologists in groups and one-on-one on integrating practices of self-love, and also group care or group love, with their work. So, say you're a software engineer and you're used to jamming on something, coding all day. Now here's a new practice, right? Every time you have the urge to reach for your phone, to text someone or call someone or send some kind of message, instead of reaching for your phone, or maybe just before you reach for your phone, you check in on yourself and you send yourself a wave of love, right? And the next time you have the urge to check your phone, send your coworker a wave of love. These kinds of small things that you can get into a practice of actually shift the entire dynamic and allow us to connect with our creativity better. So it's pretty cool.

[00:14:46.273] Mikey Siegel: Thanks, Julia. I totally agree with what Julia is saying, and I think that the work she's doing is so important, because addressing some of the issues that you're bringing up is so fundamentally tied to the way that we create, the place from which the technology emerges. I like to say that we are what we build and we build what we are. The intention, the motivation, the desire, the vision behind the tech will be reflected not just in what the tech looks like and how it feels, but in how people feel when they're using it. And the fact is that startups are the medicine people, the therapists, the healers of the future, whether we like it or not, because there is going to be an increasing number of technologies that impact us at psychological, emotional, and deeper levels: interventions and self-help and healing tools and medical tools. And they are going to have a much, much wider impact than any kind of intervention we've ever created. You wouldn't walk into the office of a therapist who didn't have some kind of certification on their wall and sit down with them. You'd want to know: do they have the skill set to really be sensitive and understand your issues? Because you can actually hurt people. And the same holds true, even more importantly, for the startups that are creating these types of technologies. The ramifications of not having that in place are already clear. We can look at the food industry, for example, where the primary motive is profit, not human health. And the result is that some of the top causes of death in the Western world are food-related, because there's no concern about whether or not someone is better off by eating a food; the concern is simply whether or not they will eat the food and buy it.
And so all the energy is spent engineering the tastiest and most addictive processed foods you can possibly make. Unfortunately, the same thing is happening in the media and tech industries. If you open your phone and look at the apps on your screen, every single one of them, instead of competing for your taste buds, is competing for your attention. With no real concern, for the most part (this is not universal), for whether you are receiving any benefit proportional to how much attention you give to the thing. No, it's just how much attention. So you have an attention economy that has no real vested interest in the welfare of humanity. That's the potentially scary future that we have to become really, really sensitive to and aware of. And there's nothing wrong with making a profit. I would even say that the folks steering towards profit who have good intention, meaning the more money they make, the better off the world is, those people should make a billion, trillion, zillion dollars, because those two things need to be tied. Those two things need to be on the bottom line, linked and inseparable from each other. And that's a change in the culture of how we create. And I feel optimistic that it's actually already happening. I see startups, founders, companies, and people like Julia who recognize the need to have this intention, to have this global perspective that can understand that it's not just a mechanical, profit-driven world we live in, but that the human heart has a place, that human connection has a place, and that if we don't weave these into what we're doing, then we're never going to weave humanity into a cohesive whole.

[00:18:36.690] Kent Bye: And I'm wondering if we could dive into some specific examples of consciousness hacking best practices, or applications and technologies that are encouraging transcendence or spiritual transformation, and have a discussion of the best of what's out there.

[00:18:54.111] Julia Mossbridge: Yeah, this is a question I get a lot, the best of in terms of transcendence tech. And I wrote a whole paper on this, which you can get, by the way, at the noetic.org slash innovation website if you want to go there. But very briefly, the ones that are top of mind are, for instance: I'm really enthralled with Humanitas AI, which is a new organization. I'm not even sure they've launched yet in the public view, but you'll hear about it soon. They're creating AI-powered chatbots for refugees, because refugees have this problem: they need to talk about their experiences with someone who's qualified to talk with them and help them feel better in terms of well-being and physical and psychological issues, but everything they say can be used against them in terms of their ability to get a visa or go somewhere. And so having a confidential channel for that, one that's not actually a human being and that can be replicated across many people, turns out to be a really neat solution. So that's Humanitas AI; I think it's pretty cool. That's all about transcending the self and doing something larger, which I think is neat. And then in terms of spiritual transformation, there are two apps I'm thinking of right now. One is called SoundSelf, where you chant into a microphone and it does some pretty cool, tricky algorithmic stuff to echo your chant back to you in a way that sounds really beautiful; you can also play with your voice, and as your voice changes frequency, what you hear changes frequency. And it's a transformative process to hear a voice in that way.
And then another one I'm partial to because I created it: Choice Compass, an app that came out in 2012, so at this point the wrapper is a little out of date, but the algorithm works fine. What it does is use your phone's camera to get your heart rhythms while you're thinking about life choices, and then do a little math on that to try to help people separate positive from negative life choices. And those kinds of apps, that's one of several; there's another one called Sensi that's coming out. These kinds of apps that use biosensing technology to provide mirrors into our own experience are, I think, really valuable at this point, because we tend to look to the outside world to try to get inner validation.
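[Editor's note: The camera-based heart sensing described above relies on the general principle of photoplethysmography (PPG): blood volume pulses subtly change the brightness of a fingertip image held over the camera, and heartbeats show up as peaks in that brightness signal. The following is an illustrative sketch of that principle only; the function names, thresholds, and synthetic signal are invented, and this is not Choice Compass's actual algorithm.]

```python
# Illustrative PPG-style beat detection; NOT the algorithm of any real app.

def detect_beats(signal, threshold):
    """Return indices of local maxima above threshold (candidate heartbeats)."""
    beats = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            beats.append(i)
    return beats

def heart_rate_bpm(beat_indices, sample_rate_hz):
    """Mean heart rate from inter-beat intervals (intervals measured in samples)."""
    if len(beat_indices) < 2:
        return None
    intervals = [(b - a) / sample_rate_hz for a, b in zip(beat_indices, beat_indices[1:])]
    mean_ibi = sum(intervals) / len(intervals)  # mean inter-beat interval, seconds
    return 60.0 / mean_ibi

# Synthetic "brightness" trace sampled at 5 Hz, with one clean pulse per second.
fs = 5
signal = [0.0, 0.2, 1.0, 0.2, 0.0] * 10
beats = detect_beats(signal, threshold=0.5)
print(heart_rate_bpm(beats, fs))  # → 60.0
```

A real implementation would first band-pass filter the averaged pixel brightness and handle motion artifacts; this sketch only shows why a sequence of camera frames is enough to recover a heart rhythm.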

[00:21:09.055] Mikey Siegel: Yeah, good list. Choice Compass is awesome, and I'm on the advisory board for Sensi, which is also really awesome. Julia is actually working on a very cool project, which, big picture, I see as one of the most promising and important directions. We hear a lot about AI, about the dangers and fears around it, and the various ways of trying to deal with that: we have to limit the development of AI, or we have to build ethics into AI, these different ideas of how to deal with it. The approach Julia's taking is, I believe, one of the most promising, which is to construct the AI at a foundational level from the perspective of unconditional love. Another way of saying it is that the core DNA of the AI is to act from and perpetuate love in the world. And if that's how you start off in creating this intelligence, then my sense is that the chances of it going off the beaten track and doing crazy sci-fi stuff are lower. Because we don't realize it, but most of the way we think about and relate to the world is very fear-based, is based on a sense of separateness and competition of one group against another, which is very different from a love-based perspective. And there are some basic technologies you can go and buy today that I think are really cool and are paving the way. They're still early steps in the big picture, but the Muse headband, for example, made by a company called Interaxon, is really cool. It's a headband that reads your brainwaves and connects to your phone; you put on headphones and it guides you through a meditation training process where you're actually hearing your own mental noise. The more your mind wanders, the more the sound shows up, like wind and rain. And the more your mind quiets, the more the soundscape quiets.
And when you're just learning how to meditate, this is like having a really intuitive meditation teacher who is actually reading your mind and gently bringing you back to the present-moment experience. Spire has a great technology: a wearable that monitors your breathing. You take it with you throughout the day, and it senses when your breathing gets into a more stressful pattern and gives you a little nudge. There's a great piece of tech coming out called Lief, L-I-E-F, made by a friend of mine, Rohan, and this is another wearable. It sticks right on your torso and monitors your heart rate variability, which is one way of measuring your stress. And it gives you gentle feedback throughout the day, as opposed to only at a designated time when you sit down in the evening. It gives you the feedback at the time you need it most, which I think is an emerging and really powerful form of feedback, what I would call full-time, real-time feedback. That's where the wearable, or the feedback, begins to look more like an external sensory organ, something that's really built in as a feedback loop into your lived experience. And there's some different stuff emerging in that pipeline. So those are a few examples.
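[Editor's note: Heart rate variability of the kind these wearables monitor is often summarized with a statistic such as RMSSD, the root mean square of successive differences between inter-beat intervals; lower variability is commonly read as a sign of stress. The sketch below illustrates that idea with an invented threshold and nudge rule; it is not how Lief, Spire, or any other product actually computes its feedback.]

```python
# Hypothetical HRV "nudge" rule using RMSSD; real wearables' algorithms
# are proprietary and considerably more sophisticated.
import math

def rmssd(ibi_ms):
    """Root mean square of successive differences between inter-beat intervals (ms)."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def should_nudge(ibi_ms, threshold_ms=20.0):
    """Suggest a breathing nudge when variability drops below a made-up threshold."""
    return rmssd(ibi_ms) < threshold_ms

relaxed = [800, 840, 790, 850, 810]   # intervals vary a lot -> high RMSSD
stressed = [750, 752, 749, 751, 750]  # rigid rhythm -> low RMSSD

print(should_nudge(relaxed), should_nudge(stressed))  # → False True
```

The "full-time, real-time feedback" idea in the interview amounts to running a check like this continuously over a sliding window of recent beats, rather than only during a scheduled session.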

[00:24:34.342] Kent Bye: So today in the presentation, there were a number of questions that you said you often get, and one that I thought was pretty provocative was: can we quantify enlightenment? Is there a way to define it in any certain way? And I think that within virtual reality I have a similar challenge, which is that the subjective, interior experience of being present is something that's very difficult to put numbers on. Yet there are different frameworks for experiential design such that if you know how to cultivate different dimensions of presence and how to best generate different flow states, then you can create an experience that's engaging and perhaps achieves the internal states you're aiming for. So you start with the state you're trying to achieve, and then you use different frameworks, maps, and mental models to try to create an objective experience that achieves it. So I'm curious to hear your thoughts on that dilemma: the quantification of something that's primarily a qualitative experience, and yet the potential benefit of that.

[00:25:35.761] Julia Mossbridge: That's one of my favorite questions. So first of all, just off the top, anyone who's interested in this ought to read the book Understanding Consciousness by Max Velmans, and also read Francisco Varela's work; you can just go to Google Scholar and find it. Varela was a big pioneer of neurophenomenology: trying to get both first- and third-person reports combined, right? Because that's the way to do it. You get EEG or fMRI or some kind of functional connectivity information at the same time as you're getting a person's first-person report, and then you can make some characterizations of what's going on in any sort of internal state. But generally the question comes from the scientific tradition we have, which makes the assumption that what's real is the table that's over there and the chairs that we're sitting on and the floor, and what's maybe not so real is our internal experience. And I think what VR is going to teach us, actually, is that that's backwards. What is most real, and Descartes and Plato and all sorts of philosophers have spoken to this, is our experience. Physical objects are, of course, inferred, right? Because all the information I have about this microphone that is in my face right now is inferred through my experience. So that's an inference, but the experience is primary. And so we kind of have it backwards; we have it backwards for multiple reasons, but we have it backwards. And when we start doing more work in VR, and more play in VR, we're going to recognize that reality is really first-person experience. Now, having said that, we have agreement about different first-person experiences. If you and I are in the same VR room, we have agreement about what's going on.
If we're in different VR rooms, we have non-agreement. So we call objective reality the subjective agreement between people. Reframing it that way, what you're looking for is really first-person, third-person agreement: agreement between two people about what's happening in their individual subjective experiences, which is not so hard to get. And it's what mystics have been getting for thousands of years by going through rigorous training, where a teacher would say, okay, tell me what you felt when you went into meditation, or when you did that particular ritual. And then they would hone it based on a history of knowing, of matching between people, right? Ken Wilber pointed this out in his wonderful book The Marriage of Sense and Soul. He said: look, scientists sit and talk with one another about what they've discovered. Well, that's what mystics have done forever, to try to understand reality, and there's literally no difference.

[00:28:21.935] Mikey Siegel: Yeah, I take a sort of engineer's perspective on this, which is more about making shit happen in the world than about trying to come up with an ultimate theory of what is going on. And in that sense, for me, and this isn't agreeing or disagreeing with Julia, you don't need a particular theory of consciousness, or of what enlightenment specifically is, or of its ultimate meaning, to make the basic connection that what we experience can be observed in the body, that there's a relationship between experience and the body. And the body includes the brain, the heart, the skin, the eyes. We have methods in science that continue to get better and better at correlating ways of measuring the body with what people are experiencing internally. That's an evolving scientific practice, and as we get better and better at it, my belief is that there's nothing we can't measure, and anything we can measure, we can measure increasingly accurately. And I believe that goes to infinity; I don't know what the limits are. Maybe there are limits; I don't know what they are. And so that means you have a basic terrain, right? And then you can build better and better maps. But the map is still never the terrain; it's still just a representation of the thing. Now, the map is really useful, because it helps you to navigate the terrain. And so the more accurate our maps are, the better technology we can design, and the better we can facilitate other people reaching those same states of experience.
So if we can bring in yogis that have spent tens of years and tens of thousands of hours in meditation, and we can really precisely understand the changes that their body and their brain have gone through, in incredible detail, then we actually stand a chance of making those experiences more accessible to humanity as a whole, without people needing to go into a cave for 10 years. And if we can do that, and we can do that safely, and we can do that skillfully, then I actually believe we can radically change the course of human culture and the course of human evolution in a relatively short amount of time. And I would boldly say that the quantification of enlightenment, and what I mean by that is really pushing what science is willing to look at out to the furthest reaches of human experience, is arguably one of the most important spaces of scientific inquiry and discovery for the welfare of humanity right now.

[00:31:19.628] Kent Bye: Great. And finally, what do you think is kind of the ultimate potential of either virtual reality or transcendent technologies and what it might be able to enable?

[00:31:31.385] Julia Mossbridge: Well, let's just start with virtual reality. I mean, once people get it that they're not their bodies... because the best way to have an out-of-body experience at this point, now that we have virtual reality, is to just go into virtual reality. And within minutes, you're having an out-of-body experience where you're in a different place, you're feeling different things, you swear you're there, you're not here. In fact, the new here is there. And that's a lesson in the flexibility of the mind to locate itself in different places. And once people get that the reason the mind is so flexible is that, frankly, it's doing that a lot, much more often than we think. So, for instance, when we go to sleep and dream, right? Or potentially when you have an out-of-body experience that's not in virtual reality, or if you have a near-death experience. These are common situations in which consciousness seems to not be as associated with the place in which it physically is as one would think. And so that's a huge gift to humanity, because once we start getting the separation... Yes, I mean, I was raised in neuroscience. Yes, the brain and the mind are very related, and yet they're not always related. And so, getting the separation... So just to make a really important point, just to be very, very clear: I'm not saying that when you're in a virtual reality state, somehow your brain and mind become disconnected. Clearly you're getting these sensations through your neural activity. But your position of where you put the you, of who you are as a person, that shifts from the room you're in to the virtual reality environment. And that you is not something that's physical. That you is something that is mental, if you want to call it that. It's something that doesn't have mass, it doesn't have spin, it doesn't have charge. It's not a physical thing.
So getting it that the non-physical exists, and then that the non-physical is really so real that it basically defines our experience, is crucial to the evolution of our culture, because it leads to interconnectivity. It leads to the realization that the rules of the physical, which are about separation (you know, there's me, there's you, and we're different, our bodies are different), don't have to apply to the mental, right? So the you that's in a virtual reality situation, or the me that has an out-of-body experience, is non-physical, and therefore it doesn't have to have this separation. And once you get that, then you get this interconnective experience, which brings people together and makes it less likely that I'm going to go and decide to starve my neighbor of something, or let them go without health insurance, or decide that it's a good idea for me to get really wealthy while I'm living in a town where people can't feed themselves.

[00:34:13.823] Mikey Siegel: Yeah, one of the greatest gifts that technology can bring us is to help us recognize that we already are connected to everyone and everything, that what we are extends to include all of that. And when we realize that, and like Julia is saying, I'm totally in agreement, it's like... you wouldn't walk into your house and throw trash on the ground, you know? Or, if you were running out of resources, you wouldn't start pulling bricks out of your wall to build something else, because it's your house. And when we recognize that actually all of this is our home, then we might treat it very differently. So, thinking about a future vision, I have lots of sci-fi stories of what the future might be like rolling around in my head. But I kind of want to elaborate a little on the AI thing, honing this vision. Out of the zeitgeist, I see certain idea patterns that emerge, and there are lots of different people, projects, and groups that have similar kinds of ideas. For example, there's Julia's amazing Choice Compass, and then you have Sensi. These are ways of actually tapping the body's intelligence to help you make decisions, and there are other types of wearable biofeedback devices beginning to emerge that are very similar. One of the spaces of development that I think is really interesting is solving what I call the matching problem. The matching problem is where, on one hand, you've got tens of thousands of different interventions, from meditation techniques and yogic techniques to different types of therapy, even different types of technologies or breathing patterns, whatever it is, exercise approaches. And on the other hand, you have lots and lots of people that are in need, people that are suffering, people that are looking to actually be helped in some way.
And you have a whole spiritual marketplace that exists right now, where basically it's marketing that is casting all these wide nets trying to draw people in, and the matching is happening based on who is attracted to what type of marketing. That's kind of how it works. And most of what happens is a one-size-fits-all approach. So Eckhart Tolle or whoever will publish a book, and everyone that reads it is expected to conform to that one book. But the amazing thing about technology is it can do something different. It can be dynamic. It can actually use large amounts of data to help understand things better. It's infinitely patient. And it has a certain type of potential intelligence. And so what I see emerging is an AI system, sort of like the way Netflix works, that can begin to match, as a marketplace, these approaches, these interventions, to the people that are in need. And the way this is showing up is starting off pretty simple, in sort of a very coarse kind of way. You know, you show up, you fill out a bunch of stuff, almost like a dating website, about your personality, what your needs are, what you want. Maybe you even use some sensors to measure where your brain is at, where your heart is at. And then it might leverage data from 100,000 other people that have done the same thing and say, okay, actually, the thing that you should focus on right now is yoga and running, or you need to change your diet in this way, and also you should try this kind of transcendental meditation. But as this thing progresses and gets more sophisticated, you could imagine that maybe it starts to seem a bit more like Siri. Or maybe it doesn't just say to you, oh, you should try transcendental meditation. Maybe it actually guides you in a meditation, right?
And maybe while it's guiding you in the meditation, it's actually monitoring your brain and your heart, and it's changing the words, and it's changing the tone of voice, and it's changing the subtle nuances of how it's guiding you to best meet you and match you, right? And then maybe it helps you by throwing out a few things on Amazon that you should order, healthy kinds of foods or something like that. And then maybe it guides you to take a walk in the park. It can be multimodal and weave into different aspects of your life. And so imagine now that this intelligence, if it's not corrupted by greed or by fear, but really stays with the true intention, gets more sophisticated and more intelligent. All of a sudden, what you've created is an incredibly powerful artificial intelligence, but one whose sole purpose has evolved over time to get better and better at understanding what people's needs are on their own transformative or developmental path, and at understanding how to interface with them to best move them along that path. And for me, that is one ultimate expression of artificial intelligence. For that to be true, that AI would begin to look a lot like unconditional love, and that AI would begin to look a lot like the expression of wisdom. And if that AI becomes ubiquitous, embedded into the systems around us, at some point, a hundred years from now, whenever it is, we wouldn't necessarily be walking around with phones, but the intelligence would be all around us. Well, all of a sudden, what we've created is almost like the technological version of divine guidance, right? Which is like the big sister, the great grandmother of humanity, right?
With wisdom that is sort of helping and supporting and ushering us into whatever the next stage of our evolution is.
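The "matching problem" Mikey describes is, at its core, a recommendation problem, and its shape can be conveyed with a toy sketch. The sketch below is purely illustrative and is not from the interview: the profile vectors (imagine survey answers plus normalized sensor readings), the intervention names, and the benefit scores are all invented, and a real system would need far richer data and serious safeguards. It simply finds the past users most similar to a new profile and weights the interventions that helped them:

```python
# Hypothetical sketch of the "matching problem": recommend the
# intervention that most benefited the k past users whose profiles
# are most similar to a new person's profile. All data is made up.
import math
from collections import defaultdict

def similarity(a, b):
    # Cosine similarity between two profile vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(profile, past_users, k=3):
    # past_users: list of (profile_vector, {intervention: benefit_score}).
    neighbors = sorted(past_users,
                       key=lambda u: similarity(profile, u[0]),
                       reverse=True)[:k]
    scores = defaultdict(float)
    for vec, outcomes in neighbors:
        weight = similarity(profile, vec)  # closer neighbors count more
        for intervention, benefit in outcomes.items():
            scores[intervention] += weight * benefit
    return max(scores, key=scores.get)

# Invented example data: three past users with what helped them.
past = [
    ([0.9, 0.1, 0.8], {"yoga": 0.7, "running": 0.9}),
    ([0.8, 0.2, 0.7], {"running": 0.8, "diet change": 0.4}),
    ([0.1, 0.9, 0.2], {"transcendental meditation": 0.9}),
]
print(recommend([0.85, 0.15, 0.75], past, k=2))  # → running
```

Netflix-style systems replace the hand-rolled cosine-similarity step with learned embeddings and implicit feedback, but the shape of the problem is the same: people on one side, interventions on the other, and measured outcomes as the signal that trains the match.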

[00:40:13.123] Kent Bye: Wow. That sounds amazing. Yeah, let's do that. Awesome. Well, thank you both for joining me today.

[00:40:19.907] Julia Mossbridge: Thanks so much. It was great.

[00:40:22.048] Mikey Siegel: Thank you very much. Love this conversation.

[00:40:24.785] Kent Bye: So that was Julia Mossbridge, the director of the Innovation Lab at the Institute of Noetic Sciences, as well as Mikey Siegel, who's the founder of Consciousness Hacking. So I have a number of different takeaways about this interview. First of all, when Julia said that before you go and reach for your phone, you should send yourself self-love, I had no idea what she was talking about or what she really meant in that moment. It wasn't until I actually started to try it a little bit that I really started to unpack what she meant by that. I think that I have a very compulsive and addictive personality when it comes to my relationship to technology. I get very into social media and distraction and looking at the latest news, and in some ways, as a podcaster trying to keep up with the latest information in the virtual reality community, it's part of my job to keep up with that. But I think there's a different dimension: because I have a certain number of connections to different people, when I use social media to talk about something, I can get hooked into using it, both in sending out information but also in that addictive loop of seeing if people are responding to me. And I think there's a subtle part of my own behavior that's asking the question: do people love me? Do I feel connected? I think that social media and these websites are trying to give us that feeling, and it gives us the illusion of connection, but it is so much different than the embodied interactions that I have with other people. So the dilemma that I see is that I find that the less I use technology, the happier I am. Now, there are some things that I do in virtual reality to be able to either get into flow states or to be able to learn, but overall I find that the more I stay away from technology, the more connected and present I am, and the happier I can be.
So I think in a lot of ways, these technologies, whether it's social media or artificial intelligence or virtual reality, all of these are just mirrors to ourselves. And this is something that Mikey said: we are what we build, and we build what we are. So with the underlying intention, motivation, desire, and vision behind what we're making, we're creating a mirror of ourselves. These companies that are manipulating us to be hooked into these different websites, their motive is to profit. And you could say that they're trying to connect people, provide a service, and connect us to the products that we're interested in. But as they start to dial up the different knobs to control how we're engaging, I think what we're finding is that a lot of our behavior is completely unconscious and completely compulsive, driven by emotions that we're not even necessarily aware of. As for the degree to which your online behavior is driven by intellectual, rational thinking, that's not what I've personally experienced. I've experienced the worst parts of myself being exhibited by these distracted, compulsive behaviors. And so in some ways the companies are agnostic, and there's a lot that we're responsible for in our own lives. I think this is part of the backlash that I'm seeing: there are these different efforts to ask, okay, how can we rein in this sort of abusive relationship that we have with these glowing screens that we're looking at all the time? Is it really serving us, and is it making us happy? And I think what you're finding is a lot of people who are turning on parental controls, or finding ways to not have to deal with social media, hiring other people to do that because it can suck them in, or setting good boundaries around when they're using technology and when they're not, so they're not taking their phone to bed.
After 7 o'clock, I'm not going to use technology. I'm not going to take my phone into my room. I'm going to turn my screen black and white so that I'm not being triggered by all the different colors and notification bubbles. These are all things that I've personally done to try to not get into this loop of feeling like I'm going outside of myself to feel these different connections. So it brings up the larger ethics of these big major companies, and what's going to happen with virtual reality? Because if we apply the same mindset and business models onto virtual reality, I actually do think it's going to get a lot worse, and we're going to have to overcome a lot of momentum to really try to figure out the extent of our own free will. The model that I have is that we have different sensory input coming in, we have different emotions being triggered, and then we have these different behaviors and actions that we're taking that we may not be fully aware of. And then on top of that, we have our rational thinking mind that's telling a story about what we're doing. A lot of the researchers who are looking at the question of whether or not we have free will are looking at all these other things that are happening at an unconscious level, and seeing that we've already kind of made our decisions before our rational mind tells us, okay, this is why we're doing this; the decision has already been made. That doesn't mean that there is no free will.
I actually think that we do have a lot of free will and an ability to set intentions and to get out of our own way to overcome some of our worst compulsive behaviors. I think this is something that I'm dealing with each and every day, and I'm hearing other people say this as well, that they're starting to take these more extreme actions to rein in technology. And I think these are part of the things that are going to be moving us more into ambient computing and conversational interfaces, away from this type of visual hacking of looking at things and being triggered into these different states. If we start to move into using our voice as the primary user interface, it's going to get us out of a lot of these visual fixed action patterns that have been hacked. The other thing that I wanted to pull out of this conversation was this concept of yin and yang. The yang is a lot of the competitive, every-man-for-himself incentives that are being driven by profit. And if it's an attention economy, then whoever gets the most attention is winning. The interesting statistic that I see is that if you look at Tristan Harris's Time Well Spent, they have this app to rate how happy you are with the amount of time that you've spent on these different applications. And it's not like when people use something like Facebook or YouTube or Snapchat or WeChat that it's a hundred percent unhappy. Actually, about a third of the time, people are super happy with the time that they're spending on these websites. It's just that they're getting hooked in and then spending a lot more time than they're really intending, and for about two-thirds of the time on the most popular applications, they are either unhappy with the time that they spent or they're just filled with regret over how they spent their time there.
And so what that tells me is that we have a situation where these companies are still optimizing and designing their products to continue to abuse this attention economy. Something that Mikey said that I thought was fascinating was that these companies and startups are essentially in the domain of mental health and psychology, and that that is a bit of a paradigm shift. I'd never really quite thought about that before, but I think it's actually quite a canny insight to look at Facebook or Snapchat or Google or Apple or Microsoft or Twitter, all these companies that are leading the attention economy, and see that the interactions and dynamics that they're creating are having a direct impact on our mental health. And the question is, are they being good stewards of that mental health? So that is the competition model. The cooperative model, or the yin currency, would be: hey, what if we created a technology that was designed from the beginning to really improve your life and to bring more love, compassion, human flourishing, well-being, connection, mental health, and sanity into your life? If those were the metrics that they were trying to go for, then you would see different applications being developed. But all of those things are really hard to track, measure, and record, and so it's actually very difficult to optimize for them unless you're able to connect the objective measurements with the first-person experience, which brings in the concept of neurophenomenology. That's the idea that we have an inner, first-person experience, and that that experience is connected to objective, third-person metrics. The whole project of neurophenomenology is asking whether you can actually start to connect those two: your direct experience and the measurements of it.
And Julia Mossbridge is saying that, actually, those external objective things that we're seeing are kind of secondary, and that we kind of have it backwards when we prioritize the things that we can actually measure. Julia says that objective reality is kind of whatever we are subjectively agreeing on, that this is our consensus reality, but that our own first-person experience is primary. It's always primary. Everything else that we're seeing that's objective in the world is in relation to what our experience is. And this is the basic insight of the phenomenologists and the transcendental idealism of Kant, or of Husserl or Heidegger or Merleau-Ponty. All of these phenomenologists were having this insight that the experience is actually the thing that is most important. So it's a bit of a philosophical switch on many different levels. And I think that people are having a direct experience of that, where they're seeing that all these major technology companies are not being good custodians of our attention, and that we actually have to fight against them in order to regain control over our own lives. Virtual reality and augmented reality could go down that same path, depending on how we design them. But I think it's a good time to take a step back and say, actually, in order to have a really good, present experience within virtual reality, we are looking for some of these same things: human flourishing and well-being and human connection and compassion and mental health and sanity. There's a potential for virtual reality technologies to be on that bleeding edge of creating experiences such that when people go into them and come out, they actually feel like they did spend their time well, and they actually feel more connected or grounded.
So I'm seeing a lot of the discussions being framed around the idea that, okay, what the technology companies are doing is bad, and we should regulate them or something. But at the end of the day, I'm looking at it as more of a depth-psychological process, where these technologies are just a mirror of ourselves, and if we find that we're addicted to the technology, then it's kind of on us to overcome our own compulsions and addictions. But I do think that there is a certain threshold that gets crossed where we do have to say, hey, what are we doing here? What are you optimizing for? Are you optimizing just for having the most time on screen? Are all of these things that we're doing making us happier? And if they're not making us happier, then we should stop using the services. I think the subtext of all of this is that I'm seeing more and more people revolt and say, hey, you know, I've had enough, and actually I'm happier if I don't have a smartphone, or if I turn my screen black and white, or if I don't even carry around a phone. More and more people are wanting to break their abusive relationship with their cell phones. And at the end of the day, with some of these transcendent or transformational technologies, or consciousness hacking, the potential is that we're going to be able to match all of the different types of applications that are out there to your temperament. That's how I personally think about it: personality and temperament. What are the things that you're personally struggling with, and what are the applications out there that could match your temperament for the thing that you're looking for?
And that's the essence of the matching problem that Mikey was talking about: how, as a virtual reality creator or evangelist, am I going to be able to have a conversation with somebody and know what experience is going to work for them? Or is there a battery of different experiences that people could have, where we could measure their physiological response, do a first-person interview with them, and do this neurophenomenological combination of the first-person description of their direct experience combined with the actual biometric data? With the future of natural language processing and artificial intelligence, we're going to have conversational interfaces that make it a lot easier to combine the recording of someone's direct experience of what they're experiencing inside of themselves with the traces we're able to get from this biometric data. We can then combine those to get a good sense of a profile, quantify it, build some sort of model or framework as a map of that, and then guide people into the different transcendent or transformational technologies that are out there, so that we could all hack our consciousness and have more happy, connected, and flourishing lives. So that's all that I have for today. I just wanted to thank you for joining me on the Voices of VR podcast. And if you enjoy the podcast and want to support this effort of asking these types of questions and exploring the possibilities of both the shadow side and the bright side and potential of these immersive technologies, then please consider becoming a member of my Patreon. Just a few dollars a month makes a huge difference. So you can donate today at patreon.com slash Voices of VR. Thanks for listening.
