#959: Networked Tribalism: Filter Bubbles + AI Algorithms + Political Polarization

John Robb has a concept of Networked Tribalism, which is the result of the combination of filter bubbles, AI algorithms, and political polarization. Robb is a military analyst who looks at the impact of technology on culture, and he started his Global Guerrillas blog back in 2003, where he was tracking what he calls the “Open Source Insurgency” in Iraq, where there were over 70 different factions all collaborating against the U.S. occupation. Open Source Insurgency morphed into Open Source Protest with Occupy Wall Street and the protest movements around the world, then into Open Source Politics with the 2016 Trump campaign, and eventually into what he calls “Networked Tribalism.” Robb now runs his Global Guerrillas Report through his Patreon.

Robb uses David Ronfeldt’s framework laid out in a 1996 paper titled “Tribes, Institutions, Markets, Networks: A Framework About Societal Evolution” that splits society into the following groups:

  • “the kinship-based tribe, as denoted by the structure of extended families, clans, and other lineage systems;”
  • “the hierarchical institution, as exemplified by the army, the (Catholic) church, and ultimately the bureaucratic state;”
  • “the competitive-exchange market, as symbolized by merchants and traders responding to forces of supply and demand;”
  • “and the collaborative network, as found today in the web-like ties among some NGOs devoted to social advocacy.”

Media theorist Marshall McLuhan talked about tribalism with Mike McManus in his last televised interview on September 19, 1977, where he predicted digitally-mediated tribalism.

In the years since McLuhan and Ronfeldt’s work using the terms of tribes and tribalism, historians like Chris Lowe have pointed out some of the cultural baggage that comes with these terms, which Lowe explores in his essay The Trouble with Tribe. But Robb is referencing McLuhan’s & Ronfeldt’s work when he talks about the “Networked Tribalism” dynamics with regard to the types of mob mentality and phase alignment he’s seeing in online behaviors.

The type of networked tribalism that Robb is looking at is happening within a deeper cultural context of political polarization. The United States election was called for Joe Biden by the Associated Press at 11:25am EST on Saturday, November 7th, after 3.5 days of counting ballots in a few key battleground states. The electoral college race was a lot closer than the overall popular vote, where Biden got over 74 million total votes while Donald Trump received over 70 million votes. For a lot of people inside and outside of the United States, it may be a bit confusing as to why this race was even as close as it was. But the level of political polarization in the United States can’t be overstated, as there seem to be two completely different filter bubbles, each with a different set of facts that form mutually exclusive narratives about truth and reality.

There’s a deeper cultural context for this political polarization that Pew Research reported on October 5, 2017. Going back through nearly 20 years of surveys, their research found “widening differences between Republicans and Democrats on a range of measures the Center has been asking about since 1994.”

Michigan State Associate Professor Zachary Neal did a network analysis of legislation over the past 40 years in order to document an increasing amount of political polarization. His 2020 paper in Social Networks, titled A sign of the times? Weak and strong polarization in the U.S. Congress, 1973–2016, documents decreasing amounts of bipartisan collaboration in favor of no-compromise, partisan alignment.
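Neal builds his networks from bill co-sponsorship data. As a rough illustration of the kind of measure involved, here’s a hypothetical Python sketch (with invented toy data; this is not Neal’s actual method or dataset, which uses far more sophisticated signed-network techniques) that computes the share of co-sponsorship ties crossing party lines:

```python
def cross_party_share(cosponsorships, party):
    """Fraction of co-sponsorship ties that cross party lines.

    cosponsorships: list of (legislator_a, legislator_b) tie pairs
    party: dict mapping each legislator to a party label
    """
    if not cosponsorships:
        return 0.0
    cross = sum(1 for a, b in cosponsorships if party[a] != party[b])
    return cross / len(cosponsorships)

# Invented toy data: a four-member chamber in two sessions.
party = {"A": "D", "B": "D", "C": "R", "D": "R"}
session_early = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
session_late = [("A", "B"), ("A", "B"), ("C", "D"), ("C", "D")]

print(cross_party_share(session_early, party))  # 0.5
print(cross_party_share(session_late, party))   # 0.0
```

A falling share of cross-party ties over successive sessions would be one crude signal of the partisan sorting that Neal’s paper documents.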

There are also elements within the media ecosystem that have become more and more explicitly partisan in their coverage, as documented by the Media Bias Chart 6.0 by Ad Fontes Media. They map different news organizations on a spectrum of left vs right political bias as well as on a spectrum of reliability vs unreliability.

It’s within this larger cultural context that user behavior, combined with the algorithms at Facebook, Google, YouTube, and Twitter, has made the boundaries of these filter bubbles of reality more explicit. Eli Pariser’s 2011 TED talk popularized the “filter bubble” concept, and the technology firms may be merely reflecting and amplifying our patterns of behavior, which are driven by a confirmation bias to consume information that reinforces rather than challenges our assumptions about the nature of reality. It’s a lot harder to train algorithms to provide users with aspirational content that’s relevant, important, uncomfortable, and challenging, and that offers a diversity of alternative points of view and perspectives.
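To make that tension concrete, here’s a hypothetical toy sketch in Python (this is not any platform’s actual algorithm; all function names, items, and the one-dimensional “lean” scale are invented for illustration). It contrasts a ranker that maximizes predicted engagement by matching a user’s existing lean against an MMR-style re-ranker that trades engagement off against redundancy with items already in the feed:

```python
def rank_engagement(items, user_lean):
    """Rank items purely by predicted engagement, modeled crudely as
    closeness to the user's existing lean (a stand-in for confirmation bias)."""
    return sorted(items, key=lambda item: abs(item["lean"] - user_lean))

def mmr_rank(items, user_lean, lam=0.3):
    """Greedy MMR-style re-ranking: each pick trades predicted engagement
    (lam) against similarity to items already selected (1 - lam)."""
    remaining = list(items)
    selected = []
    while remaining:
        def score(item):
            relevance = 1 - abs(item["lean"] - user_lean) / 2
            redundancy = max(
                (1 - abs(item["lean"] - s["lean"]) / 2 for s in selected),
                default=0.0,
            )
            return lam * relevance - (1 - lam) * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Invented items on a -1 (left) to +1 (right) spectrum, for a left-leaning user.
feed = [{"id": 1, "lean": -0.9}, {"id": 2, "lean": 0.0}, {"id": 3, "lean": 0.8}]

print([i["id"] for i in rank_engagement(feed, user_lean=-0.8)])  # [1, 2, 3]
print([i["id"] for i in mmr_rank(feed, user_lean=-0.8)])         # [1, 3, 2]
```

With a low enough engagement weight, the re-ranker surfaces the opposing-lean item ahead of the centrist one; tuning that trade-off well at scale, without losing the user, is the hard part the paragraph above alludes to.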

The issue of filter bubbles has reached the level of technology policy, with Senator John Thune introducing the Filter Bubble Transparency Act, which would require technology companies to disclose how algorithms filter information on their services, and to give users the option to turn off the algorithm-driven timelines and search results in order to escape these data-driven filter bubbles.

The combination of the political polarization, filter bubbles, and AI algorithms is cultivating a deeper context for networked tribalism to thrive within our culture. Robb was the Sensemaker in Residence for a four-part series of Zoom talks at Peter Limberg’s Stoa throughout August 2020 that were posted on their YouTube Channel on October 3, 2020: PART 1: August 10, PART 2: August 17, PART 3: August 24, & PART 4: August 31.

I wanted to invite Robb onto the Voices of VR podcast because I found his “Networked Tribalism” sensemaking framework to be helpful in making sense of some of the cultural and political dynamics in the United States, and how they’re interfacing with technology policy issues around filter bubbles & the impacts of algorithmic filtering, as well as the dynamics of censorship online weighed against the role of a code of conduct and community standards in order to create online spaces that are free from abuse and harassment.

The First Amendment protects the freedom of speech in relationship to the U.S. government, but it doesn’t extend to private property or to big technology platforms, where speech is regulated by their terms of service, codes of conduct, and community guidelines. But even the First Amendment has a number of different free speech exceptions, such as the “Fighting Words” category of speech, incitement, false statements of fact, obscenity, child pornography, threatening the President of the United States, or speech owned by others. Each technology company has to decide how it weighs the benefits of free speech against the potential harms that come from all of the unprotected classes of speech, and how it will enforce those decisions on its platform.

Whether or not the enforcement of these codes of conduct and community guidelines is seen as political censorship or the regulation of unprotected speech depends a lot on the larger cultural and political context that’s driven by the narratives that leaders and influencers within these political factions are creating. Then what happens when the boundaries of acceptable and unacceptable behavior and speech are determined by machine learning data sets that operate at scale and are imperfect in their implementation? And then what happens when your access to virtual and augmented reality will be determined by your actions and behaviors in a media ecosystem that’s being monitored by these same black box AI algorithms?

Robb expects that the intersection between Filter Bubbles + AI Algorithms + Political Polarization will continue to accelerate and drive collective behaviors through networked tribalism, and it’s an open question to what degree technology policy and legislation will be able to rein in this larger cultural and political dynamic.

You can look at this issue through a couple of lenses, like Ronfeldt’s framework of “Tribes, Institutions, Markets, Networks” or Lawrence Lessig’s Pathetic Dot Theory of law, social norms/culture, the market, and technological architecture/code. Either way, there’s certainly a large cultural and political aspect where these affinity groups cultivate in-group dynamics through networked communication architectures that build alignment through psychologically-driven confirmation bias, algorithmically-enforced filter bubbles, or an increasingly-biased media ecosystem that’s not accurately representing all sides of a story or countering misinformation and propaganda coming from political leaders. This is obviously a very complicated, but also deeply relevant, topic for how the intersections of culture, politics, and technology policy will continue to unfold in the 21st century.


This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to The Voices of VR Podcast. So on today's episode, I'm going to be talking to John Robb about his concept of networked tribalism. It's what you get when you combine political polarization with filter bubbles, with AI algorithms that are helping drive these different filter bubbles. With the election just getting finalized here, there were over 74 million people that voted for Joe Biden, but also over 70 million people that voted for Donald Trump. So for me, that's just like two completely different bubbles of reality that have different narratives and stories as to what the nature of reality is and what's actually happening and what's unfolding. And that is starting to come into lots of different intersections with technology policy, specifically around the violation of terms of service and community standards and codes of conduct, where one side of the political spectrum is acting in a way that is either spreading disinformation or inciting violence. And they're getting censored by either Twitter or Facebook, who are slapping these different warnings on them, and they're claiming political censorship. So as we move forward, what happens if we have this increased amount of political polarization? And for these companies that are seeing what the leadership of the country is doing, when it comes down to individual citizens that are doing the exact same thing, they may get banned. And when it comes to something like Facebook, the access to virtual reality technology is directly connected to your behavior on these social network platforms. So, when things start to happen at these large scales, then you have out-of-control machine learning algorithms that are going out and seeing different behaviors that people are doing, and doing these automatic suspensions or automatic bans, which has already started to happen on things like Facebook or Twitter.
So it's in that context that I'm going to be talking to John Robb. John Robb started in the military. He moved into being a tech analyst where he was looking at the evolution of the internet and then went back into fusing those two things of doing military strategy and tech analysis. And the ecosystem of online right now is kind of like this battleground of information warfare. So he's kind of taking these lenses of his background and different strategies of warfare and applying it to this current information ecosystem that we have right now, and seeing how there's different strategies of trying to sow chaos and discord through maneuver-based warfare, or trying to get consensus and alignment with something that's a moral warfare of trying to get agreement based upon certain values. So as we move forward from the election and we start to look at how does this continue to play out, there's going to be an intersection of the larger cultural and political context and how that interfaces with technology policy and the different dynamics as we move forward and just trying to make sense of what's happening, because there's a lot of chaos that's happening in the world. And how do you actually make sense in a way that sees these different dynamics that are unfolding? So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with John happened on Tuesday, October 6th, 2020. So with that, let's go ahead and dive right in.

[00:03:07.812] John Robb: I'm John Robb. I have a background in the Air Force. I went to the Air Force Academy, Astronautical Engineering, so I was aimed at the space program for a little while. But the shuttle blew up when I was in pilot training, so it kind of put a damper on things. Ended up flying special ops. I did tier one counterterrorism for five years. Brought me to all the garden spots in the world. Different kind of mindset than the traditional military. I was outside the box. I was given a plane and money and a mission and sent off into the world for a couple of weeks, and I came back and the mission was done. And that's not like the standard military. Got out of the military after seeing snapshots of my kids every three months and ended up going to Yale for an MBA, which was pretty much kind of filling in the gaps that I had, but it was just at the perfect time to see the internet back in '93, '94. And I was on it at the university. I mean, where else did you get access like that? And I saw an article in Wired Magazine, when it was a bi-monthly. It had an interview with George Colony at Forrester Research that asked him what he was going to do, or what did he do every day? It had a big picture of his face and a bunch of Q&A replies. And he said, well, I sit with my feet up on the windowsill, looking out over Harvard Square, thinking about the future of technology. And I went, wow, I want to do that. It seems like you wanted the same kind of thing. It's like thinking about the future of technology. It was great, but I didn't really have a background. I applied to Forrester, which was a strategic IT firm. It was still backroom at the time because it wasn't mainstream press at the time. Technology, particularly information technology, wasn't as prominent as it is today. I ended up writing in their classic style. They have a paragraph, bullet, bullet, bullet style, which was kind of my natural style.
And I ended up writing a bunch of reports, and George was my editor and sat right next to me and trained me up. And when the internet came in '94, '95, he said, well, hey, John, you want to cover this? Because I was the only person that didn't have a background in a specific technology area. And I was bringing in internet tech all the time. And I did. So it was like the perfect time. It was like '95 and '96 and '97 in the internet space. I was probably the first professional analyst focused on that. Got quotes in the Wall Street Journal and New York Times every week, it seemed like. And I ended up coining a lot of things and wrote Which Browser, Which Web Server, wrote the first portal report before there were portals, before they went public. So every single internet company came through my doors for all that time period. And it was a really wild space. And what I developed was this methodology that I use today, which is that I built frameworks for people. So if the environment is changing very, very rapidly and your old models don't work and they're not keeping pace, what will happen to decision makers is they get frozen in place. They're unable to make a decision because they don't know which choice will end up being the correct one. And they're worried about the downside of making the wrong decision in that kind of environment. And so, I provided them a framework for how everything worked, you know, that they could use as a structure to make decisions. It didn't have to be perfect. It had to be close enough to get some thinking going, you know, to unstick them. After that, in '97, I started a company with another Forrester guy. He raised some money, and as Gomez Advisors, we did internet finance right as the internet was flipping over into finance. So you had the internet brokers go from 0% of their transactions to over 50% in a year, which was a transformative thing for the brokerage industry. And we did all the banks and brokers and we built the performance monitor.
I was the CTO and COO. And it was a system that checked the transaction systems at banks and brokerages from 23 different internet backbones, 54 cities around the world, six continents, every five minutes. So if you're a Fidelity or a JP Morgan, you're worried about the high net worth individual who's like in Switzerland skiing, and they go to log on and their transaction doesn't go through, and that $200 million that they have there gets transferred. So that ended up being big. Almost made it through the IPO, but ended up not making it. And the company eventually sold for about $300 million about a decade ago. And I left in 2001 after getting that company up and going. And the stress was intense. And I wanted to start something new. I was looking for somebody to follow up on a report I wrote back in '96 called Personal Broadcast Networks. And it was a big think report. They allowed me to do that every once in a while. It was basically saying that this model of website building is going to break down and we're going to start broadcasting our content. Everybody is going to write and put pictures up and things like that. And we'll all subscribe to each other. And folks over at Netscape and Microsoft and Marc Andreessen and stuff were coming to me and saying, how do you build this? I go, I don't have all the answers, but this is the basic structure I was going to roll out. And I looked and looked and looked for companies to do that. And I didn't see anyone until 2001, when I saw UserLand Software, run by Dave Winer, and he was building a subscription system using RSS. He published RSS as an open source protocol and basically said, here's the system for the future: we're going to be writing from our desktops and we're all going to subscribe. And I decided I'm going to join him, and I ended up running the company for about three years, and we built the first kind of personal weblogging tool that had subscriptions built in. We did the first podcast.
We had Adam Curry from MTV come over, and he wanted to do a podcast where he sent audio files through RSS, and we cut a deal with him to put that into it. We got NPR and New York Times on board with blog networking, or social networking, in the early stages. Once the New York Times started putting all their stories in, in an exclusive deal, then everybody started joining it. And it went zoom, and we had, what, 75 million people at the peak of RSS. But it was a highly decentralized system. We had Webbox.com, where when you read an RSS summary, it looks exactly like a tweet stream. So when I subscribed to people's blogs using Radio UserLand, the summaries looked exactly like getting a bunch of tweets coming in. Because it would come out every minute and it would be replenishing and it would just go and go and go. But it was really heavy lifting in those early days. It was like trying to explain to people why they would want to socially network. It was really tough, really tough. Just like talking about the internet to a lot of people back in, you know, '95. What good is it? Why should I do it? And I did a lot of proselytizing, did a lot of work on getting people thinking about how to use it. And a few years later, we had Facebook and Twitter emerge, and they took the model and they centralized it. And then they were able to raise a ton of money based on that centralization and zoomed with it. And things are different. About the same time, I started a blog called Global Guerrillas back in 2003. I had the tool and I saw stuff in Iraq that wasn't coming out of the analysis, didn't see it in the newspapers. And it looked a lot like the open source work and a lot of the technological trend work that I had been doing, applied to warfare. So I combined the two and started writing about what I called open source warfare, the dynamic and how it works in practice and how it was able to keep the US military at a standstill for many years. And that proved to be really popular.
I ended up writing a book in 2007, Brave New War, a play off Brave New World, obviously. And then that went everywhere. It got a New York Times op-ed and a bunch of other things. And all the top generals were reading it and people in the field were reading it. The practitioners, the guys who are out there actually banging down doors and trying to find al-Qaeda in Iraq, those guys were saying, that's it. I mean, because Iraq didn't have just one group or two groups, it had 70 different groups and they were all working in a very decentralized, loose system. And open source warfare described it to a T. So over the years, I started writing more and more about how it was morphing, how open source warfare turned into open source protest, which we saw in Egypt and Tunisia and the toppled governments. We even saw it in Puerto Rico just recently with the governor. The first time I actually saw it in practice was back with Colombia, with No Más FARC, against the guerrilla group that was doing a lot of kidnappings. I was kind of familiar with that place because I'd been down there back in the day. Then most recently, and I did a bunch of different companies in between, back in 2015, 2016, I saw the emergence of the kind of open source politics that had moved from warfare to protest to politics. And it was playing out where we had two groups. First, the group to emerge was an open source insurgency in the political realm. Trump was the weapon, what I call the plausible promise, that was sent into the White House. First, it took over the Republican Party, swamped it, overran it. Millions of people contributing online, coordinating, feeding up memetics, feeding up ideas, keeping things chaotic. It's a maneuver-based warfare. I wrote a report on this, my first report in my new service, called Weaponized Social Networks, describing how that worked. And then they got him into the White House.
I mean, so an open source insurgency inserted a person into the White House. And then in response to that, we saw the emergence of the resistance, which fought in the moral realm. And moral warfare is a very different kind of warfare. And it's been doing all the heavy lifting, whereas the establishment focused on things like Russiagate and all the things associated with that. And it just failed. That didn't work. The resistance held firm and it was constantly keeping the pressure on. And now with COVID, things have changed again. And now we're moving into something I call networked tribalism. It's based on the open source thing, but it's kind of a bad outcome for open source politics. I thought open source politics was something that could actually be beneficial. And now we're moving into some negative phase. Like, we can talk parallels in history and stuff like that, but it's using this technology in a way that could end up being very bad.

[00:13:00.162] Kent Bye: Yeah, that's a great overview. And it covers everything from the beginning of social networks and RSS feeds. I wanted to orient myself to where I came into contact with your work the first time. Back from 2002, 2003 on until like 2007, I was working on a project called the Echo Chamber Project, where I was looking at the media's treatment of the buildup to the war in Iraq, and how the media became an echo chamber. Along that time I was tracking a lot of different stuff. And I did 45 different interviews with lots of different people talking about what happened. And I realized that in order to create this documentary, I would have to recreate a lot of the problems I was critiquing. I would have to try to boil all the complexity down to 90 minutes, erasing all the nuance that the media themselves were erasing. And so I ended up turning to trying to open source this as a documentary. And it's in that interface of looking at the war and looking at open source that I came across Global Guerrillas. And I was like, wow, this guy's actually applying these same types of open source principles, but looking at the insurgency in Iraq as a principle for this resistance. And so I was already tracking this topic, and then was reading your blog a lot and seeing the analysis that was coming out of your blog then. And so I've been following you for a long, long time, and seeing how over the time we have the open source insurgency, which, you know, you mentioned some of the stuff that's happening internationally, but also there is the Occupy Wall Street.

[00:14:23.737] John Robb: Occupy was an open source protest, yeah.

[00:14:25.639] Kent Bye: Yeah, the open source protest with Occupy Wall Street, as well as the Tea Party. And then we have right now, with open source politics and networked tribalism, a political context to what's happening online in terms of censorship and what you called in your Stoa talks the phase alignment. And I think this is what I got really hooked into. I wanted to maybe start here, start with the tough stuff first. Yeah, the phase alignment. You mentioned what happened in the Nazi time, where you would get everybody to basically agree, and what seems to be happening is that we have such a polarization between these two parties that there's not a lot of middle ground, and you have these two parties that are basically phase aligned, and then you can't necessarily have a common agreement, and the AI is sort of maybe accelerating us into these filter bubbles. But as we start to move forward into the future, I just see the role of these technology companies being a facilitator of maybe even amplifying those levels, making this partisanship that may already exist within a cultural context actually even worse. And so, you know, maybe you could talk about that concept of phase alignment, because that was such a key part that seemed to come from a historical sense, what you're looking at from the history and what you're seeing now.

[00:15:45.895] John Robb: Okay. So I have a little bit different view: the conflict of the 20th century was between three major systems trying to solve how to organize bureaucracy. Bureaucracy was really the instrumental social decision-making system and implementation system that we had in the 20th century, and markets were growing in parallel, but weren't quite as strong as that bureaucracy. I was a big reader of Max Weber, and bureaucracy was this kind of cockroach organizational system that transformed everything. And each of these three systems had a different way to use bureaucracy to organize the socioeconomy. You had communism, which approached it as a single bureaucracy over everything. You had capitalism, or democratic capitalism, which used a government bureaucracy and let corporations run their own little bureaucracies, and private organizations of any type, in kind of a big corral that was limited at the very edges by laws. And then you had fascism, which ate itself, blew up very early, but it did something that most of the people who purport to really know about fascism miss. They say, you know, when you see jackboots or you see militarism, that's fascism. When you see propaganda, that's fascism. Well, you saw that in any dictatorship, actually. All those things aren't really new or unique to fascism. Hatred of people who got in the way of the revolutions, we saw that with the Kulaks and others in communist countries. But what really made fascism tick was that they solved the problem of how to get all the capitalist corporate bureaucracies to align, to act like a command economy. That's the problem with democratic capitalism: it's like herding cats. How do you get everyone to go in the same direction when you really want to fight a war or you want to grow your national power? The fascists did something called Gleichschaltung. The Germans did it. They called it Gleichschaltung, which is in phase.
The first three major laws that they passed back in '33 through '34 were focused on getting everybody to point in the same direction. There wasn't some major overarching ideology per se; it was just that they had to think in the same way, and they were getting rid of the deviants in every single organization, people who didn't agree that the country was going in the right direction. And it was, according to Hitler and others, far easier to get that done than they had anticipated. I mean, it went into every private organization, every hobbyist organization. I mean, it aligned everything. And the negotiations between labor, which was then centralized into one big labor union, which then offered incredible benefits to its people, like even three cruise ships that took people in the labor unions around the world. So it's a little bit different than the way it's characterized. But in order to get people to align like that, it had to drive the propaganda machine into intense overdrive. I mean, it had to really amplify the goal, which then created a whole craziness and nuttiness that ended up eating itself. But every single person in that organization was lying. So taking that lesson out of the 20th century and then applying it to what I see in networked tribalism, my worry is that networking makes alignment easier, especially as it goes into kind of a tribal framework, where there's an in-group and an out-group, and you believe what I believe, and those people don't believe what we believe, and therefore they're evil and they're enemies. The network makes it possible to get people to align more easily, and to enforce it. And it doesn't have to go through government or have this government-run propaganda machine in order to make it work.
And if anyone's a deviant, you can find them and pinpoint them and shame them and get them cast out of their organization, deprive them of a job, deprive them of a means of making a living, deprive them of any internet access, access to social networks. And as these networks integrate and become more interconnected, eventually, if you pick somebody that's a deviant, they will become a non-person, because everything will have network components. So my worry was that networks that move in that direction become too strong. They can institute a totalitarianism, a tribal totalitarianism, that can come from outside of government and be worse than anything we've seen in the past, because it will be in everything that you do. And everybody you interact with is in fear of violating a principle and being out of alignment themselves, and they'll turn on you in a second. But they'll do it from themselves. They won't need a secret police to do it. Everything will just have to be surfaced. And once it's out in the public, everyone will come down on you. So that's in phase, that's what I call alignment. And so we're starting to see these tribes, there's two tribes, a group of tribes on the left and a group of tribes on the right. And the right is actually behind the left in this instance. The left tribes have grown faster because they see Trump in the White House as an existential threat. That fear of being wiped out is driving them to organize faster. And the tribes on the right have been growing slower as a result. As long as he's there, he's like the canary in the coal mine to them. It's not really a threat until he's gone. So we're seeing alignment being pushed on companies. We're seeing it pushed on organizations from the tribes on the left. We see China, which has adopted a similar version of tribal alignment using network means, and they're actually forcing every company to align with them. And you saw that in Hong Kong.
You saw that recently in the US, where they were trying to get companies to stop criticizing China. Everyone in Hollywood already knew this years ago: you couldn't put anything negative about China into a movie. And now you have athletes having to reverse and say, OK, I won't criticize China because I don't know anything. Those human rights abuses, not my concern. And companies in the US, in order to do business in China, are all aligning. And that kind of alignment is a scary thing. China has a little bit of a twist on this. They're doing it through the government, so it's a government-amplified thing. And they've added a social credit system, in various stages of development, which will instantiate it into a gamified system. So if you're connected to somebody who is accumulating negative points, it can tarnish you. If you've hired somebody who's being punished by the social credit system, it will harm your company, making it something that can be centrally controlled rather than letting the crowd do it. Yeah, there are lots of nuances. I kind of dove into this straight on, but there are nuances as to how the tribes work that make them more real to people.

[00:22:15.428] Kent Bye: Yeah, I think that's a good start. One of the things that you mentioned in your Stoa talks was looking at what played out in China contrasted with what happened in Russia. You were talking about how in China you have that real centralization, where everything is aligned rigorously. And Russia is almost like the opposite: it's this complete chaos, aside from a centralized corrupt core, where everything around it is just sowing discord and chaos. And when you talked about this, you had this dialectic between consensus and dissent, between order and chaos. I'm wondering if you could flesh that out a little bit, if we could look at two potential futures, one extreme being China, the other extreme being Russia, and whether you see the United States going in either one of those directions.

[00:23:03.619] John Robb: Right. Well, I don't see the political spectrum as we used to characterize it as valid anymore. Left and right in the old 20th-century sense don't make much sense in the current framework for politics. You could see this in how chaotic even the debates were. It's just not about policies or anything that's even intelligible. Nothing in this campaign has anything to do with the old politics. The left right now is moving more towards something I call the consensus. A consensus is that everybody agrees on a specific thing, and that's what we'll do. And the right is gravitating more towards disruption: everything's chaotic, and the strongest will be able to do what they do. In Russia, that's Putin. He's the richest man in the world through cross-holdings; he owns a bit of all these different companies, and those act as a cohesive core unit in a sea of chaos that's stoked using the internet and other things in order to keep everything chaotic. So you have this chaos on one side and an extreme consensus or alignment on the other. Now, the problem is that I think the consensus side is the stronger side, given what networks are really great at doing. The danger is that once things start sliding in that direction, over time that consensus within a tribal framework becomes a totalitarian system that crushes everything, that locks everybody into a single orthodox view of the world and reality and doesn't allow any deviation. And there's no way to really address it, because there's nobody in charge of it in the U.S. case. And in China's case, they've locked it into a historical framework that they're not going to allow anyone to deviate from, like the Confucian model of how you organize society and families and everything else. So that is kind of a civilization killer, because there's no progress, no advancement. This is what it is.
And anyone who deviates, you don't go to prison, you don't go to a concentration camp or re-education camp, you just get disconnected from the network. And this network can go international; it can reach across borders. Facebook is already intervening in elections globally. And so the big win for these tribal networks is that they take over the big tech companies. On this path, we'll probably see that. On the other side, on the right, I think the Trump insurgency was gravitating more towards the Russian model, keeping things chaotic, keeping things away from consensus. But it's still in the political phase. It hasn't moved up to a tribal phase. And what that tribal phase will look like will be every bit as aggressive as what we're seeing develop on the left. And that could be potentially as ugly, because chaos can lead to all sorts of horrible stuff. We got a little bit of a taste of that with the response to COVID. And so, yeah, does that make sense? So the left and the right: extreme consensus and disordered chaos.

[00:25:54.904] Kent Bye: Yeah, I think there's, I mean, I'm a big fan of Grittle's incompleteness. So I think that any sort of framework is going to reveal some level. And so I think there's other dialectics there of the collective and the individual, I think is another big one in terms of the left and the right. But one of the other things that I saw some political analysis looking at philosopher Roland Barthes and using his metaphor of boxing versus wrestling, where boxing has some very specific rules that you follow, and that a lot of the political discourse up to pre-2016 was a lot of these, like, these are the rules for how political discourse happens. I think as Trump came along, it was almost like he was a world wrestling, going into a boxing rink and kind of like smashing chairs over people's heads, not even following the normal rules. And people who were boxing were like thinking to themselves, this is like, how can he do this? Obviously this is not going to work, but yet. it fit into this larger network tribalism and what you refer to as this open source political campaign where there was this larger grievances that a number of different factions of people that were seeing him more as a symbol of fighting against something specific. And so I wonder if you could expand on that, like how do you make sense of this shift from these normative standards of these rules that we used to have, which you can metaphorically think of as boxing, moving into this more World Wrestling Federation mindset where it's more about creating these spectacles that are trying to bring about deeper memetic messages that go above and beyond any of the specific content of what's being said.

[00:27:26.889] John Robb: Okay. Well, Trump was a weapon put in place by the open source insurgency that formed around him. I saw it back in 2015 when I was looking at the different candidates, like, ooh, that guy, that guy's going to do something. He has the potential to go all the way. And people were like, look at him, you're nuts. But he fit the kind of model that I was seeing develop. And it became more clear over 2016 that he was the kind of perfect instrument of disruption for the insurgency. See, open source politics doesn't work the way traditional politics works. You're not electing a person. You're not electing a platform. In Trump's case, it really didn't matter what he did. He wasn't being judged as a person by his supporters. So when the Access Hollywood tape hit, it didn't impact him at all. And I'm sitting with people and they're like, oh, that's going to crush him, he's dead as a candidate. In traditional terms, that would probably have been the case. But I said, they're not going to budge. It's still going to be 41, 42 percent. And that's been the case with Trump's support throughout this entire presidency. It's been solid. Forty-two percent hasn't really budged the whole time. And it's because the insurgency is very much impervious to the traditional attacks by the established system. Trump is supported as long as he is the source of disruption, disrupting the establishment. And he's really good at taking cues from the insurgency, which is the kind of open source development process, coming up with ideas on how to disrupt the system. He's built a group of people around him, inner and outer circles, mostly in the outer circles, who are good at picking up those ideas, instrumenting them, making them real, and putting them into practice. And it's constant disruption, disruption, disruption.
He does certain things, of course, that fit within the traditional political framework, but for the most part, as long as he's disruptive, as long as he doesn't back down, as long as he doesn't give in, he's going to get that support. So it folds into more than just this high-level analysis. It's actually a whole method of warfare that he was using. It's maneuver-based. And maneuver-based warfare is built on disruption. It's built on trying to make it impossible for your opponent to think clearly. So you have an overload of information and you change topic constantly, just like you do in Blitzkrieg warfare: you punch through the enemy lines, and the enemy is like, oh, they're running through this town right now in our rear areas. Oh, they just hit this command post over here, and these tanks, there might be only one tank. They're so strung out, they're barely a threat to the tens of thousands, hundreds of thousands of men that are up on the lines. But the fact that they're back there cutting those supply lines and cutting those lines of communication makes it impossible for them to think clearly, and they retreat. They give up. They have to go back to where they can get their heads together. And you can see how Trump's just changing topic to topic to topic. There's always a new thing every week, and he's always getting that coverage by the traditional media, because that's how they work. But it's impossible for the opponents to actually mount anything that's substantial or has any lasting impact. There's no scandal that lasts more than a couple of days. And moral warfare is a different thing. It's driving towards this consensus, and it uses shame and the uncertainty that you're falling outside of this consensus and that you're going to be cast out. And the moral warfare we see on the left is very effective. It's what works in guerrilla warfare.
It tries to grow the center of gravity of the consensus at the expense of the opposition. You kind of shrink them, you want to shatter them, you want to catch them in a dilemma where they're caught between two actions, a moral one and an amoral one, with the moral one being called for by the leadership, so that they splinter off. And that's been really successful. I mean, it's been a slow trickle, not really a torrent; it hasn't been overwhelmingly successful in terms of beating the insurgency. I built a service called the Global Guerrillas Report, which I've been writing since 2017. And the first report in 2017 was on weaponized social networks, which goes through the whole layout of how warfare works in those two instances. What else about the insurgency? Well, it's reaching its end. We're transitioning, because COVID has intensified the imminence of the threat and the degree of the threat, and we're moving very quickly into tribalism. And tribalism is a different beast. I always thought it was going to be tough to move beyond an open source insurgency, which is usually just formed of many different groups, people with conflicting motivations, all agreed on one little thing: remove this person from power, or fight this enemy until they leave. And they would do it until they won or lost, and then they'd disperse. Then open source politics came, and we found that if a person is that promise, that weapon, as in the case of Trump, you could hold them aloft indefinitely. But it's still relatively ineffective, and it would dissipate as soon as that person is out of office. Now we have tribalism, and tribes solve the cohesion problem of an open source insurgency in a different way: most tribes have an internal narrative.
And the internal narrative is: this is why we're better together, this is what we've done together, this is how we've overcome threats in the past, and this is how we'll overcome threats in the future. It's a positive, uplifting thing: this is what we support, here are our rituals, and the like. Well, these tribes work in a different way. They identify the things that they dislike, patterns of behavior that they hate, since they can't seem to agree on a positive tribal narrative, on what they're for. We live in a world where values are so subjectively derived, where we don't have any kind of religious backplane that dictates how these morals and structures of ethics are arrived at, that no one can quite agree on what they're for. So you ask two people in the anti-racist tribe what justice really means, and you couldn't get anyone to agree on exactly what that is. Give an example of justice, and people go, oh, wait, wait, wait, that person did this. No, no, no. It's not something people can agree on. What they can agree on is that that person is racist, or that person is a fascist, or this action is fascist. And it works great within the network context, because what we see, in McLuhan's sense, is that the social network is rewiring us. It's changing the way we think, the way we process information at an individual level. And because the volume of information is so great and beyond what we can handle using traditional means, we've been forced to adopt new patterns, new ways of dealing with it. In the past, we used to read information in long form: read a book, read a report, think about it, come up with an opinion, and then discuss the opinion within a framework, within a box of what you can say and what you can't say. In the current environment, it's impossible to make sense of everything that's coming at you. What you can do is pick out pieces of information that fit patterns
that you are curating, patterns that make sense of a lot of the things that you're seeing. And the patterns that work within the tribal context, like the anti-racist pattern, run on examples: here are examples of racism, examples being picked up constantly, with curators that work 24/7 on this. These patterns help you make sense of the world around you, of this chaos. They make sense of the politics that we're going through, of how the system is not delivering results. And the network also makes it easier. What we find in social networking, and most people don't admit this, or I've never seen much written about it, is that social networking is great as a means of empathic communication. Empathy isn't really what we think it is. It's not a thing where I look at somebody who's suffering and I kind of feel what they're feeling a little bit, and then through that sense of identification I can do things for them, or am more willing to do things for them and help them. It's not a voluntary thing. True empathy is an involuntary means of communication. It's pre-verbal. When somebody is going through something, and you can see this in animals as well as human beings, you involuntarily model their mental state inside you. And it's a vast volume of information that's coming in. And the things that you're not getting directly, you're filling in the blanks; you're creating this complete sense of terror or panic when you see somebody being attacked. When a rat sees another rat being electrocuted, it grits its teeth, it tenses its muscles. We do the same thing. And so when we see videos and pictures and hear stories that trigger this empathy, we're immediately put into this mental state. And the way social networking works, it wipes out all the contextual cues that would help protect us from that empathic transfer. We get it raw.
And if it's fitted into a pattern, this raw empathic trigger hits you right in the face. So when the George Floyd video hits, you are George Floyd. You're watching the whole thing play out. I had a whole report on how empathic triggers work and how they build tribalism. It's something that allows you to identify with others at a very deep level. You become kin, you become part of the same tribe, and the enemy is clearly there, and the enemies are all the people that you don't extend empathy to. So it's a selective empathy: I have extreme, 100 percent, without-any-caveats empathy for this group, and I don't have anything for these people. So you have empathy, you have pattern matching, and that yields a tribal framework that can become massive. It can scale. It can act as a group. And people think that they know exactly what they need to do at any given moment, because they're part of that tribe and they know who their enemy is. Does that make sense to you? Do you follow me?

[00:37:04.500] Kent Bye: Yeah, I have a few thoughts here. One is that we used to have newspapers that would go through this kind of dialectical process of trying to get both sides, or trying to provide some level of truth that this is what we believe as a society. But as we moved into the social networks, it's moved more into this influencer model, where you don't need to check with the other side, and you can have this confirmation bias that reinforces your preconceived notions. And what I understand you're saying with the empathic trigger is that a concept or idea can be encapsulated in some visual representation that goes viral because it invokes this level of outrage. And more often than not, it's against things that we don't want rather than for what we do want. And as I understand it, whenever you have a moral transgression, you feel it in your gut and you have this moral outrage. I guess the challenge is that things spread so quickly, and there's none of that internal vetting that used to happen with journalists who would check with the other side or provide that deeper context for a more dialectical process. A philosopher that I look to is Agnes Callard, who talks about the Socratic method. What she says the Socratic method actually addresses is that it's impossible to simultaneously believe all truths and avoid all falsehoods, that it's impossible for one person to run both of those algorithms at the same time: to both consider a belief with the will to believe, and also have the skepticism to go through all the ways you might try to tear that concept apart. The process of science and the peer review process tries to do that. You put forth some research and you say, hey, we think this may be a fact that we've surmised from these sets of observations, and then the whole community comes together and tries to disprove it.
And it's not until it's replicated that you say, okay, now we can say this is the truth. The same thing with justice: justice is trying to acquit the innocent and prosecute the guilty, but you can't have one person do both of those tasks, because it's a conflict of interest. One person can't neutrally argue both why he's innocent and why he's guilty. And that's why you have that process within a court that tries to work out justice. So when you think about these normal processes of both truth and justice, when you apply political polarization to these social networks, with influencers who end up being the arbiters of truth, whether they're actually considering the other side or not, you have this amplification of the polarization that's happening in our culture right now.

[00:39:37.705] John Robb: Yeah, but networked tribalism is a way of thinking as a group, okay? It's kind of a social manifestation of the way we've been rewired internally. McLuhan said, okay, first we start with the rewiring from the technology, and then we wire society to match that. We did that with the printing press: we created bureaucracies, created markets, created the kinds of things that we see today and take for granted. So things like the press, which is an old-style institution, have now become kind of tribal media, right? It's picked a side; you can divide outlets up depending on which tribe they support. And science is a discovery process, but it really isn't meant for how we run our lives. Only recently has there been a fad of sorts where we thought, okay, we could be led by scientists. We can be informed by scientists, but social decision-making is much more complex than that. It takes in a lot of individual factors rather than some kind of scientific optimum. So what we see is that the concept of truth is starting to change. There's no objective truth; it's a tribal truth. And it doesn't have to be completely right. It has to be mostly right. For instance, say something that uses an empathic trigger goes off and blows up, and it turns out to be exactly the opposite. It turns out to be a hoax. It turns out to be wrong, mischaracterized. It doesn't matter if the retraction goes out, because no one really cares. And no one really wants to examine how that process broke, whether the initial claims were right or wrong, because to do so would cause you to question the entire pattern, question the entire sense of what makes you a part of that tribe. And that's an identity issue. If you're a part of that tribe, then who you are is tied to that. How you think, how you process reality is tied to that. So people aren't willing to do that.
I've had plenty of instances where things have reversed, and I said, hey, look, even on a small scale, not the big things, it's like, hey, this isn't what you thought it was. People are like, it doesn't matter. This is happening everywhere, right? So even if this one was wrong, it really doesn't matter. How could it not be what it appeared to be at first? It has to be some sort of confusion, somebody spinning it in the wrong direction, or some conspiracy to hide the true facts. On a small scale, when you take tribal thinking, it's like conspiracy thinking. It's patterns. So you get conspiracy thinking everywhere. We just saw it with Trump. Trump got sick. Contradictory information came out. Conspiracies were everywhere, for every conceivable reason. One side was talking about how he hid the truth and was spreading it to others, and the right was saying there was an attempt on his life, an assassination attempt, that somebody stuck it into the gatherings. And so depending on the tribal network you're affiliated with, you saw different things. I try to spread everything out so I see everything coming in from across the different networks, probably based on my experience analyzing guerrilla groups. I tried to read everything, and I didn't take one side or another. If you do that, then you become unable to do the right kind of analysis. So people would say, how could you say this about this guerrilla group, that this is where it's going to go? And I'd go, well, this is how it's developing based on my analysis. I'm not just looking at solutions; I'm looking at how everything develops. There's a kind of larger technological thing here: we're in a process of transition to a networked reality, a networked way of thinking and processing information. And the way we used to do it, as individuals tied together using these mechanistic organizations, is not functioning as well as it should. It's hard to get anything done.
It seems to be falling apart, it's easily gamed, and it won't scale to a global level. So clearly we need something else added to the bureaucratic, market, and traditional tribal and nationalist methods of thinking through social problems. And networks would be it. Learning how to do that is going to be tough. It's not going to be easy. I look at biological metaphors, biological examples of how to think as a group, and the highly decentralized version would be an aspen forest or aspen stand, one of the largest organisms in the world. It's connected, and the individual trees create environments where they work together, but they don't have a robust means of communication. And you find that all the higher-order organisms have higher levels of communication, more data flow between the nodes. If we think of our current situation, we're kind of like a global thought process that's booting up for the first time. It's emerging. We're acting as neurons, with our individual discovery processes about how the world works, and we're amplifying or dampening different signals as they come through. And those signals are in conflict. There are a lot of different ways in which that can reach a level of coherence where it can think in a rational way. My worry is that it will become too narrow and too focused and too disciplined, that there will be just one signal, or just a few that are allowed, one thought process, and it won't be able to move forward from there. And that points to where AI goes. Most people think of AI as trying to create a human-equivalent, or beyond-human, thinking entity. I look at the world and I think, okay, we already have something that's greater than human. It's a thought process.
It's the global social system, preserving information across the lifespans of any individual, constantly evolving, constantly adding new information and solving problems that none of us as individuals could ever possibly hope to solve. We already have it. The superorganism is there; a greater-than-human intelligence is already in place and moving forward far faster than our biology does as individuals. And AIs, as they develop, aren't going to be replicating the individual neurons like us. They're going to live in the spaces between us. They're going to facilitate that interaction and make it better. The biggest AIs, at least that I've seen, are already the ones inside the big social networks. And so far, they've been geared to solve marketing problems and corporate problems, and their after-effects have been pretty aggressive. It's the whole paperclip-problem thing: I want to have people stay on my site longer, but the way the system solves that is to cause people to get more outraged. You could maybe even make the case that all the fighting we're involved in right now is over how those AIs get built: how those AIs that mediate our interaction, that smooth it, that help us focus it, or help us break it apart if it goes down bad pathways, are going to be set up. And that kind of fight is for all the marbles, right? Get it wrong, and everything locks down into something horrible; get it right, and we move forward faster than ever. I don't think we're going to get it completely right. We're going to get it partially right. But as long as we're more on the partially right side, with some room for improvement or ability to change in the future, then I think we're going to do okay.

[00:46:51.483] Kent Bye: Yeah. Part of the reason why I wanted to have you on is to help get a sense of this landscape, because with the cultural and political polarization that's happening, the risk is that there's only a handful of major monopoly networks controlling all the different discourse. And if you get banned or censored from one of those platforms, then you are no longer part of that collective conversation, because there's no equivalent of what you would call a public space when it comes to network communications; it's all owned by these private corporations. And then I talked to Vint Cerf, one of the co-inventors of the internet, who works at Google. I was at the Decentralized Web Summit, where we were looking at the future of decentralized web technologies, and I was really hoping that he would say, okay, these are really promising and we're going to be moving toward this. But he was really quite skeptical, because he's like, look, the economies of scale of centralized systems are just so much better: if you want to provide all the information to everyone in the world for free, then the type of advertising business model that Google has is able to do that. So you have that.

[00:47:56.373] John Robb: Economically, there's so much wealth to be gained by centralizing. That's why Facebook and Twitter and others are worth as much as they are, and why everyone's working so hard to make them successful. You know, I was caught in that whole decentralization-versus-centralization trap back in the early days. We did RSS and a blogging tool that sat on your desktop, ready to use. And I was a big believer in decentralization, and I was wrong. It just didn't offer the kind of economies of scale. It didn't offer the speed at which you could actually move the network forward, add features, add connections. It didn't offer the wealth accumulation that would attract the investment necessary to go global fast. Having done that 16 years ago and gone through that decentralization-versus-centralization process, I think that framing is built on PC thinking, right? The real solution is to think more in a social context. The model is kind of like private ownership of land. If you go back to the 16th century, the idea that a private individual, not part of the nobility, not part of the moneyed or landed classes, could own land was pretty outrageous. But once private ownership of land and property became possible, the dynamism of the system opened up, and the ability to extend rights and protect individuals became much more important. And in this instance, we live in a sea of information, and the AIs that matter, the AIs that based on my projections are going to be built in that space between us in the social landscape, require our data to get great. The more data they're provided, the better they are. There's a very strong correlation, at least in the deep learning space, between the amount of data and the quality of the thinking process. So the solution is to give people ownership over their data, all of it.
It's not about building new technology to protect it. It's not about debating at a philosophical level what privacy is, which changes everywhere. Privacy is like a leaky ship: it's great for discussion over beers, but it's not really something you can build a society on. Ownership is ownership. There are contracts and structures in the old system that will allow you to protect it. And if somebody takes that data, they're stealing from you. It's theft. It puts the individual in the driver's seat. It makes it possible for them to actually put limits on the system that's going to be built from their data. And granted, a lot of wealth will go to the people who are building these systems, but you're not counted out. If you own your data across all the different platforms that could potentially be gathering it, there will be places that will store it for you and protect it and guard it, and there will be groups of people that come together to help you sell it or license it or make it available to different companies and organizations that want to make use of it or train on it. Some may even give you part of a royalty on the value of an AI that's developed off of it, so it becomes an annuity stream that pays you for the rest of your life. But it all starts with data ownership.

[00:50:52.365] Kent Bye: Yeah, I just wanted to jump in there, because I've been doing a lot of deep dives into privacy, and there's a philosopher of privacy named Dr. Adam Moore who talks about privacy as like a copyright, where you would license out different aspects of your data. And I talked to John C. Havens of the IEEE, and he talks about data sovereignty, because, you know, if there's an open market for data and you don't have complete sovereignty over that data, then that means there are going to be all these data brokers that are actually going to be outbidding you against your ownership of your own data. So that ownership slash sovereignty aspect is...

[00:51:26.320] John Robb: Oh yeah. And I mean, ownership is ownership, though. So you're talking first principles: when I create it, I own it. And only then, when I release it, do I release it. Right. So yeah, you have to have that initial ownership given to you. You shouldn't have to bid on your own data. Ownership means you own it when you create it.

[00:51:45.272] Kent Bye: Right. And the GDPR was formed looking more along the lines of a philosopher of privacy like Dr. Anita Allen, saying that privacy is a human right, saying these are the rights that you have, but it's not treating it as something that you would license out, like Moore's model.

[00:52:00.290] John Robb: Well, I mean, that's the problem with the European approach: it took a privacy approach, and basically it was destroying it. It was preventing data from being accumulated, too. I mean, I'm of the mindset that almost every product will have AI in it, some kind of artificial intelligence, not science fiction AI, but AI built based on data that's accumulated from people who use the products and use the system. And if you prevent that data from being accumulated, you're basically destroying your economy. I mean, you become backward, behind. Right now there are two big data pools that have been accumulated. There's the US one, with Facebook and others that are accumulating across the world and bringing the data in. And then you have China, which is pretty much an island so far; it hasn't really found a way out except for TikTok. That's why people are going after TikTok, actually: to stop them from accumulating the data necessary to build the products that can sell internationally. And then we find that these AIs, as they're built into every product, require a lot of local information, because something built for a German is not the same thing as something built for somebody from China, or from Japan, or from Florida. It requires local information in order to really be good and nuanced and work correctly, and to tailor the experiences of the product more and more, both at the service level as well as in the actual physical products, to individuals. And unless you have that data, you can't build great products, can't build great services. You know, Facebook and Google and others should be like our crown jewels; we should be finding ways to make it more possible for different companies and other things to grow out of them and build an economy based on that. But things have not worked out. They tend to be more global, and they're moving away from us.
Though they started here, they're not really that loyal anymore. And that's where these network tribes may end up changing that.

[00:53:47.973] Kent Bye: You mentioned earlier that you were wrong to start with decentralization. And I don't think you were wrong full stop; I just think that the centralized systems have proven to go first, and then there are decentralized alternatives. Neil Trevett from the Khronos Group, which works on graphics standards, said that for every successful open standard, there's a proprietary competitor. So there's the vertical integration that happens, and then there's more of the standardization that happens that pushes it forward over time. And so I do think that we're in this phase where the technology platforms used to be everything open first; the original internet protocols and the World Wide Web came out in that open context. But once we made the platform shift over into the mobile market, then you started to see Apple and Android do these app store ecosystem models, which were their closed walled gardens. And then with virtual and augmented reality, it seems to be adopting that same model of making everything into a closed ecosystem, because it takes a lot of capital, a lot of money, to innovate on those platforms. And it is actually better when it's all vertically integrated and it all works, because it's more efficient. And open source hardware hasn't achieved the same level of parity on mobile or on the future platforms of augmented and virtual reality.

[00:55:01.945] John Robb: All the innovation that went into Facebook and Twitter was done by 2004, for the most part. I mean, all the basics, before they even started as companies. So they were all about just scaling it and making it possible to do this on a national and then global scale, and then smoothing it out. They added little bits and pieces, but the core innovation had already been achieved.

[00:55:22.277] Kent Bye: One of the things I just wanted to throw out, to finish what I think about the European Union, because I do think with decentralization, what's happening is that you have a mass centralization of these networks that have gone global, and then you have things like the European Union saying, no, actually we want to enforce this General Data Protection Regulation; you have to abide by this, or you're not going to be able to do business here. You have the Schrems II decision that happened, which has this connection between the data being shared with the national security state through the third-party doctrine, saying any data that you hand over has no reasonable expectation of remaining private, which means that the US government can get access to this information. And so you have the European Union saying, you know what, we don't want all of this information in the hands of the government, so we're going to have this Privacy Shield, which is now dissolved. So you're starting to see the balkanization of the internet, following these different laws and regulations. We already have China with its own internet, and we already have Russia, which can pull a switch and not be connected to the internet anymore. So I think that centralization has happened, and now we're moving more towards these decentralized approaches.

[00:56:25.314] John Robb: Well, I mean, you still have Facebook growing, right? A hundred thousand people a day, or two hundred thousand people a day. So it's still at three and a half billion.

[00:56:33.349] Kent Bye: The engagement with the youth is not the same as it is with, say, TikTok or Snapchat.

[00:56:38.473] John Robb: Internationally, it's actually pretty good. It's just in the US. We tend to look at it from a US lens and say, okay, younger people in the US that are cooler and hipper don't use it, right? But most of them are still on Facebook, because they're still connected. And so you go to TikTok and other things. Yeah, so I mean, there is a little bit of fragmentation, but those companies will end up getting bought and folded up. I mean, the real big jump will be the next platform. I think it's going to be AR. I'm a little less on VR. I mean, I loved the whole AlphaWorld thing back in '96 and things like that. I mean, that was great. And I really loved the whole Stephenson thing and the idea of the metaverse. But I think the big jump will be in AR and being able to add things, not just a data layer, but actually change the environment that you're in, on an ongoing basis, all the time. We talked a little bit about this earlier. I mod Skyrim, right? It's a kind of persistent world. But you see, the thing is with VR, virtual reality, virtual worlds, they have limited offerings. Maybe a hundred million people may like one, but they never seem to grow beyond that, because they're one person's view of how the world should be. Whereas in the AR world, it's adding to the existing environment, and everybody gets to make it the way they want. They can change the color palette, add furniture, add decorations, add music, add tonal quality, change the appearance of every single person that they interact with. And all of those features, all of that capability, can be delivered as services and as tools that you can use on your own to modify. I mean, even Skyrim, even though it's five or six years old, is still getting 20 new mods a day,
just built by people who want to change everything: change the voices, change the conversational patterns, change the behavioral patterns of different objects and different people, adding NPCs to the environment, adding conversational NPCs that didn't exist and that could be added to the existing environment that we're in, that make it richer. And that is the next big social shift, and how that interacts at a social level is something I'm still thinking through. I still don't have it. But I mean, I love games, and I'd love to be able to go into a VR environment, but I just don't see a metaverse pulling out of that. And even going back to the Gibson stuff, where it's all kind of symbolic, it's still... you can connect so many things inside an AR world. Like, if you want to have tea, we can go to Paris and sit there in the real Paris and have tea and talk to each other at a cafe and see it as if it's real, as if it's here with us, but it's all layered onto the existing reality. That's the great fragmentation moment. That's the great experimentation moment. How we connect up, and how those tribal networks enforce certain patterns, certain things we're allowed to do within those environments, will be interesting.

[00:59:27.652] Kent Bye: Yeah, I should circle back to that. Well, first I would say, from my perspective, AR is a lot more theoretical, because a lot of the hardware is not consumer-grade at this point. And so VR is actually way further along in terms of proving out what's compelling, what's working. And so I imagine that AR will come along, but especially in the midst of a pandemic, there are so many of the design patterns, there's a paradigm shift from 2D to 3D, that's happening with VR. And that same paradigm shift is also going to be happening with AR. I think one of the things that we are going to start to see is the increased influence of these layers of virtual augmentation that we already have through the virtual world and through asynchronous communications. And there's a sense that moving into the virtual world becomes much more synchronous, more real-time. And another dialectic that I'd point to, from the centralization to the decentralization, is that so much of the activity that's happening on the internet is happening in direct messages, private messages, like the dark forest theory of the internet: private Discord groups, journalists stopping writing articles and starting their own email listservs. So you have this thing that used to be the public sphere now going into this more private context, but that private context is more real-time rather than asynchronous. There are still asynchronous private contexts, but in VR in particular, a lot of the allure is that you're going into these shared social spaces with people in real time, and you're having authentic interactions with people in the moment. Rather than something that's reduced down to a memetic transmission that's amplified out with the network effects, it's harder to go viral in that sense, because it's more an embodied experience that takes time for you to actually experience.

[01:01:12.014] John Robb: Right, so the deep dives that you're getting are social dives. So rather than deep diving into a book or a long article or reading the paper cover to cover, you are deep diving into a social space to discuss stuff at depth. That reflects a change in the way we process information. We do it as a group now, right? So when something comes through and you don't know whether it's true or not, and it's compelling, you put it to your group. And if you have a good group, and you've been maintaining a good quality network around you or connected to you, there's always somebody that will say, okay, wait a second, that doesn't make any sense. Or, wow, that's really true; I've seen this in other places, and this is a good source, and this actually makes sense. I mean, no one really processes information as an individual. And if you do, you're going to be caught out more often than not. That was one of the first things I came up with back in the early blogging days. I started a group on Yahoo called K-Logs, a mailing list, back in 2001. And it had a couple thousand people for that topic, which was kind of wild, way back when. And one of the ideas behind blogging, and why it would be advantageous to network blogs together, is that it would allow individuals who know something to process information. Because people post when there's something new. They read posts because they want new information and new analysis, and they want it as real-time as possible. If somebody is maintaining a blog and they see something and they have unique knowledge or unique insight, they can then post on it, and that can be consumed by the network. It helps them make sense of what's going on now, or explore a topic that they're interested in at greater and greater depth as time goes on, as things develop. It's like becoming smarter without having to add that specific knowledge to your own capabilities. You network in, and that's being done in real time for you.
And you add your own, and you add it to others. Not everyone's capable of actually adding value to this, but they're good at amplifying it and repeating it often. I don't know. I mean, before the internet, knowledge was so hard to acquire, insight too; for me, the internet was like turning on the lights. And, you know, I work a lot on an intuitive feel for things; I'm really good at creating a synthesis, taking all these different pieces and putting them together. The internet was just built for me. It was like, wow, it's natural. I'm sure there are a ton of people like that, and if they weren't, they're becoming like that. It's a different way of thinking, and now the social aspects are very cool. I mean, the idea of deep diving socially is really awesome. You're right about the dark forest thing. I mean, my blog, Global Guerrillas, used to be public, and it got millions of views, and that got me initially known to people. But it made more sense now to have it locked down in a Patreon. And then I got a bunch of very cool supporters, and I keep out people who troll in our Discord discussions, and I get really sharp, sharp people just sparring over topics, sparring over modern politics, without calling each other names, even if they're on different ends of the spectrum. I mean, it's the only place on the internet, I think, where you could actually find people discussing it without calling each other an NPC or a Nazi, you know what I mean? It's really cool, but I had to lock it down in order to make that possible.

[01:04:28.882] Kent Bye: Yeah, that's the dilemma that I wanted to dive into, which is that most social networks at the scale of, like, Facebook have to implement these codes of conduct and these community guidelines in order to create these safe online spaces. Right. And in order to do that at that scale, you end up having to start to draw a line for what is acceptable and what's not. And then there's this whole issue of content moderation; there's a whole documentary called The Cleaners, which goes into these people from other cultural contexts going in and moderating content that is unacceptable. But once you get AI to that certain point, there was a point that you made in the Stoa talks, which is that Marshall McLuhan concept of technology evolving to the point that it eventually becomes a technological artifact.

[01:05:13.313] John Robb: Society becomes an artifact.

[01:05:15.073] Kent Bye: Yeah, the society becomes an artifact of the technology, meaning that the algorithms that you put within things, like what is okay and what's not, the AI that is curating this, are going to be a reflection of either one side of that networked tribe or the other, or somehow maybe we'll figure out that dialectic process to figure out how to do both. We have to.

[01:05:33.338] John Robb: No, it's not an if. Either we figure it out, or we die, or we collapse, or we fall in on ourselves, or lock ourselves into stagnation forever. If we figure it out and make it work, or when we figure it out and make it work, then we're going to be able to move forward faster than ever before. Society as a technological artifact is extremely cool, but how do you build protections for the individual and that kind of thing, the thing that can go global? That data ownership model is something at an individual level that can protect us. Anything that's centrally implemented is going to be attackable. And I don't think you can break apart the technology in a way that gives people power at the periphery over the hardware they own. It's got to be a rights-based approach. It's an interesting time. What was it you were talking about at the very beginning, before?

[01:06:20.448] Kent Bye: Oh, that there's a bit of a dilemma in the scale of these networks. They're so large. I mean, Plato would say that the polis should be around 5,000 people or so. And so when you get to that scale, it's almost like, metaphorically, if you're like the United States, you have the federal government, but you have states and states' rights. And so you have the dialectic between the decentralized and the centralized. But yet a company like Facebook is going into all these regions where they have no cultural context or expertise, and they become the de facto decider of what the bounds of the conversation are, without really even proper interaction. And so it's almost like this technological colonialism into these cultures. And so the challenge that I see is that, as we go on, I just don't see it being viable that the centralized model is going to hold, because we can't just have one company that is becoming the arbiter. We actually need to have it decentralized in other ways, or what Cory Doctorow calls adversarial interoperability.

[01:07:17.086] John Robb: So the ways in which these major... The only way they'd make that possible is if you own the data. So you can transfer it: I'm not going to give it to Facebook anymore, I want it over here. Or they get limited access, and if they want to continue to offer me services, this is all they get, right? So, I don't know. Here's an interesting frame for you. When you think about rights, almost all the protections were against the state, right? The state is an all-powerful, violent monopoly, a thing to be protected against. It's bureaucracy run amok, because bureaucracies at the core don't have any moral frameworks. They can do horrible things. They're implementing; they're actually getting things done, but they can go off in horrible directions. So how do you get protection against this kind of monolithic state? You build rights and you contain it. You put it in a kind of box of laws. The thing is, the way networks and markets work right now, and the way networks are growing, is that networks and AIs are going to be in everything. They're in the dating apps that you're using. They're in the transaction processing systems that you use. They're in the publishing platforms. They're in everything, absolutely everything: Monster and all the hiring algorithms and everything else, they're all being interconnected. And there are very few laws. There's not really any way to legislate this. They're operating outside of the government. And they can have more power over your life than the government, which is boxed in its place, can. And in that instance, the terms of service for all these different network access points, all these different things that you rely on, become way more powerful, become more important than the Constitution. And they become what we should be focusing on, right? We need a Bill of Rights for this environment, for this networked age, one that will protect us as individuals from the predations of the network.
We want it to be strong enough to protect us and offer services and do great things for us, but we don't want it to prey on us. And the way it's going right now, it's going to end up preying on us. We're headed towards this kind of, what I call, a long night: that network lockdown, that eternal dictatorship. And it could come sooner rather than later. Most people also think that the only way you can confront the state or overturn it is through force of arms and everything else, you know, a coup, a revolution, and the like. But you don't have to do that in this current environment. I mean, you can control every single American just by changing the terms of service on all these different services. And it can be done all outside of government: you can exclude people and make them non-entities, you can promote people, you can amplify messages, you can do manipulation, you can change everything. Even the election that we're seeing right now is actually just a sideshow relative to the real power that's being developed on the network side. The people who own the network, the people who actually control it, whether it's a tribe, or a bunch of rich individuals in an oligarchy, or various companies through cross-ownership, kind of like the Russian model, can throw the election any which way they want in the future, just by pushing things a little bit this way or a little bit that way. They can drive consensus on issues without all the big propaganda we saw going into the Iraq war. You can get a consensus to do this law or that law, and push government around like it doesn't even have any kind of say in anything, and punish people for deviating, and squelch topics and ideas before they even get passed up. And it can be done in a nuanced way.
You can do it down to the conversational level, down to all these little private networks that are operating, you know, Messenger, phone, Facebook, all the way down to the ones that are trying to interoperate. AIs can do that in a way that individuals and secret police could never do at a bureaucratic scale. They can change every conversation: tone it down, mitigate it, sway it, tell people not to do stuff, or make it disappear. So anyway, that's the thing I think we should be cognizant of. But we're focused on this political contest. We're focused on calling each other names, and we're looking at the dysfunction of the old system in the current complex environment and saying, oh, it's due to this side or that side. And it has nothing to do with that. The decision-making system that we used to have is not functioning in this current environment. It's too complex for it. We're running into failure after failure. What we need is a network component, and a network component can make it easier to reach consensus and get things done. But we're not availing ourselves of that tool. We're not trying to mold it and constrain it and turn it into something useful. And until that happens, it may take us 100 years and a lot of bloody wars or horrible experiments to find out how to do that. But once we do, things will move really, really quickly, in a better direction. And for all the faults that we see in the modern world, we're much better off than we were 200 years ago. I mean, much; it's not even a debate.

[01:12:03.401] Kent Bye: So Lawrence Lessig has this model, the pathetic dot theory, where he says there's the law, there's the culture, there's the market, and then there's the technological architecture. So if you want to shift collective culture, then you have to have some combination of all those things. And I think that in the current context, at least in my analysis of it, we have an economy that's run by all these centralized monopolies that are becoming more and more powerful. We have a political system that's being completely controlled by those monopolies, almost like a zombie state. We have a culture that is in this increasingly networked tribal, partisan polarization, where it's hard to have anybody talk to each other. And we have the technological architecture and the code, which may be able to provide different architectures, say, decentralized systems, as an antidote. And so when I look at the solutions, I see that that's basically the palette of what we have. Either it's regulation; or trying to educate people and have them operate differently as they engage with these networks, in a more self-contained way, or other ways to shift the culture; and then there are different market dynamics, like you're introducing, you know, if we owned our data, maybe that would create new market dynamics to have this interoperability. There are also the decentralized alternatives that are out there as well. So as we look into solutions to all this, how do you make sense of how all those things could potentially come together to create a world that we want to live into?

[01:13:29.772] John Robb: I use a slightly different framework, from David Ronfeldt. It's the TIMN framework: tribes, institutions, markets, and networks. And tribes, institutions, and markets have been around for hundreds of years. They were transformed and accelerated by the printing press. The printing press shattered the universal church, the Catholic church. It shattered the feudal system, created the nation state, created bureaucracies, made possible global markets. Scientific progress was made possible then, because you could write papers and have them travel. You could have universities and centers of learning and things like that. And it transformed things. We're in that early stage of actually learning how to use networks, right? This new social decision-making system. One of the things that taught me, back in the open source insurgency days, the early days in Iraq, watching 80 groups work, is that it's extremely innovative; it can be kind of like a tinkering network. So we can grab that out of it: tinkering, innovation. You don't have to have direct communication. You can try things out in an open source model, and if it works, the press will report it. That's that press component. I likened it to stigmergy in the insect realm: leaving a chemical trail based on the success of a given innovation that others can copy and then put into practice. And then we saw with the open source protests that they can mobilize to change things fast and then dissipate, okay? So if there's an egregious figure in power, you can mobilize very quickly. So tinkering and mobilization; nothing mobilized faster than these open source protests. They just overwhelmed. I mean, millions of people turning out; even in Colombia, it's like 5 million people, I think, came out. And then with open source politics, they can influence politics and take political office over a sustained period.
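
Robb's stigmergy analogy, independent groups copying whatever tactic leaves the strongest "success trail" with no direct coordination, can be illustrated with a toy simulation. The tactic names, success probabilities, and evaporation rate below are invented for illustration; this is a sketch of the dynamic, not a model of any real movement.

```python
import random

random.seed(7)

# Toy stigmergy: independent groups pick tactics; each success deposits
# "pheromone" (think: press coverage), which biases later groups toward
# what worked. No group ever communicates directly with another.
tactics = ["tactic_a", "tactic_b", "tactic_c"]
success_prob = {"tactic_a": 0.1, "tactic_b": 0.6, "tactic_c": 0.2}  # hypothetical
pheromone = {t: 1.0 for t in tactics}   # start unbiased

for _ in range(1000):                   # 1000 independent attempts
    # Choose proportionally to the trail left by earlier successes.
    weights = [pheromone[t] for t in tactics]
    tactic = random.choices(tactics, weights=weights)[0]
    if random.random() < success_prob[tactic]:
        pheromone[tactic] += 1.0        # success reinforces the trail
    for t in tactics:
        pheromone[t] *= 0.995           # unreinforced trails evaporate

best = max(pheromone, key=pheromone.get)
print(best)   # the network converges on the most effective tactic
```

The reinforcement-plus-evaporation loop is the whole point: the swarm settles on the most effective tactic without any group ever talking to another, which is the dynamic Robb says he saw among the Iraq factions.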
And in the open source political system, we saw a little bit of this kind of dynamic early. I call it consensus and dissent, rather than the kind of extreme versions that we're seeing, like extreme order and chaos. Consensus and dissent is a really useful thing. Like, with the COVID crisis, we came to an immediate consensus as a nation: we wanted to solve this. And then we gave it to the political system, and they fumbled. They didn't produce a test at the CDC, and then we didn't have any tests for a month and a half, and we got way behind the power curve, and we didn't know how to handle it. Whatever, they messed it up. But we also didn't take advantage of this network to say, okay, we don't have enough masks? Make masks; get the cottage industry going. Don't have enough PPE? Let's see if we can source it and get it to the hospitals, right? We could have worked as individuals to get these things done, but they didn't do any of that. They just treated it like top-down communication: I'm going to tell you what we're doing. They didn't take advantage of any of that open source consensus that could have gotten it done. All of those companies that started to do remote work did it before the government ever did anything, before the CDC even thought about saying, stay home. In early March, the tail-off in flying happened. That was consensus in action. That's so powerful, and we squandered it. I go, man, it was like a God-given gift. It shows us what we could do with this networking in a positive social fashion. We messed it up. I was hoping maybe even the opposition to Trump would have picked it up and run with it, and had people come through from all different branches and talk about mobilizing people to do stuff, and do a YouTube thing every afternoon from two to four that would have gotten 100 million people to watch every day, instead of a White House briefing. But no one did it. No one on the Republican side did it either.
And then you have the dissent. I think the dissent is actually useful, because you don't want a consensus to arise when it shouldn't, and you don't want it to stick around longer than it should. And dissent is constantly bringing up attacks, attacking the consensus from all angles; unless that consensus is really strong, it's going to fall apart. In this case, the consensus wasn't embraced by the people in power; it fell apart, didn't do anything, languished. The open source consensus fell apart, and the dissent overwhelmed it. In a healthy political system, I guess that's the way it works; but in a healthy political system, people would get in front of the consensus and actually make it work. So that's a kind of useful framework. You have the mobilization, you have the tinkering, the extreme innovation, and you have this useful dialectic, this back and forth, for political decision-making. And then now we get to the devolution stage; we get into tribalism. The kind of empathic frameworks we see in warfare have been developed in peacetime. The reason people go to war, the reason people go kill people, is that you have extreme empathy for your side and none for the other guy, none for the other nation. They're Nazis, they're imperialists, they're communists, they're whoever we're going to kill. And we did it in peacetime, which is wild, which is crazy. That kind of selective empathy, you don't usually see it develop like that, but it developed really quickly, just online. And since COVID, I mean, that's the downside of this: in addition to all the potential chaos and pain associated with this thing developing and emerging so quickly, and all these weird thought processes, and the inability to actually act in a coherent way, we now see it can be dangerous, and that it creates this kind of warfare and conflict.
So those are the things we're learning from the network side, and how those fit into bureaucracies and markets and the old tribalism. The old tribalism, nationalism, is being replaced pretty quickly by networks, by tribal networks. This kind of network decision-making could make it possible to reach a consensus on getting things done at a bureaucratic level and get us all focused on the same goal, even if it's for a short time, until the dissent takes it down. But the worrisome thing is watching the big networks trying to take down dissent too early. And the canary in the coal mine for me, the indication that we may be in dangerous territory, is when the goofy stuff is taken off. So when you start going out and taking out QAnon: why? I mean, they're not dangerous in any real sense. Or all of the little things, flat earthers and things like this; you start erasing that stuff, and it's like, okay, who the hell is making that decision? Who's to say that that stuff can't be on a public system, in the public discussion? So we're hobbling the dissent. Who knows where the insight that collapses or stops some horrible consensus will come from? So if you're gearing up for the Iraq war, well, let's crush all the dissent: any theory that the WMDs don't actually exist, that's obviously a conspiracy that we have to wipe out, right? Oh, the financial crisis happened? So wait, there was all this fraud that led to that; let's crush that. They did that in the old media, but now we can do it online too. Any conspiracy theory that showed the banks actually knew they were selling crap to everybody else, to all their customers and their pension funds, and that there was massive fraud involved in a $22 trillion financial crisis? Oh, let's not allow that to be talked about online. I don't know. I mean, I'd like to see our pain tolerance, our willingness to let crazy stuff occur or be talked about, even by thousands of people, even by hundreds of thousands of people.
It has to be higher. We have to allow the dissent to occur. We can't crush it, can't block it out. Because if we do, that's hobbling the usefulness of networking as a decision-making tool. We obviously need it: it moves faster, it's more innovative, it allows us to mobilize quicker, it allows us to solve problems as a society faster than anything else in the past. We had the discovery process and the consensus over COVID that was much better as a group than it was at the government offices, with the intelligence agencies and the CDC and everyone else gathering information in a bureaucracy. So we need that for a complex world. We need that for solving those problems, and withstanding the effects of the problems that complexity forces upon us. Networking can provide us cohesion, offer us solutions. I'm just spinning on this, but imagine an app that did contact tracing and did all sorts of great stuff, allowed us to really find who had the virus and who was spreading it. We couldn't do it, because we don't have the trust, and no amount of privacy protections would protect that. But if you owned the data, and maybe if you had a Bill of Rights, if you had a sense that the network was actually working in your favor, if the government came in on your side and said, okay, we're going to pay you $1,000 a month as a kind of emergency UBI, attached to you using this app, putting data in, acting upon it, and not trying to spoof it, then everybody's in this together. And we could have solved this problem like other countries in the world did. It could have been gone. It could have been just a thing that we didn't even think about, other than two or three minutes a day looking at this app.
That kind of collective action, with a confidence and a surety of who we are, while protecting us as individuals and giving us trust and a sense of belonging at the same time, is possible, but it's just a little out of reach. The system of before is standing in our way, and a lot of this infighting, this stupid, trivial stuff that they're doing on the left and the right right now, relative to the problems that are inbound, is getting in our way and preventing us from solving them. And that's probably a good place to end right there. Great.

[01:22:43.391] Kent Bye: And is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[01:22:50.014] John Robb: Oh, no, I've talked about a lot of stuff. It's nice to be able to talk more about the technology features rather than the earlier stuff that I was doing on the tribalism, which is more of my normal coverage area. But just blue skying is really fun. I'm going to try to write up what would go into a Bill of Rights, network Bill of Rights, maybe either this month or next month. as a report and see if I can like nail it down. At least a framework, you know, a framework to get the ball rolling, get people thinking about it.

[01:23:16.148] Kent Bye: Yeah. I think there's certainly a lot of groups that are out there, the decentralized web network privacy groups. But yeah, John, I just wanted to thank you for joining me on the podcast. I've been a fan of your work for a long time and just your ability to be able to make sense of what's happening through this lens of warfare, I think is actually very insightful to see some of these different dynamics. And yeah, thanks for the work that you're doing here. And if folks are interested, I'd recommend them go check out what you're doing there on Patreon. So thanks for this discussion. So thank you.

[01:23:45.657] John Robb: Well, thank you. And thanks for having me on. This is a lot of fun.

[01:23:49.179] Kent Bye: So that was John Robb. He's the author of the Global Guerrillas Report. So I have a number of different takeaways about this interview. First of all, I think that John's framework of networked tribalism has helped me to at least understand some of what's happening in the political and cultural context here in the United States, and where it might be going in the future. So going back and tracing the evolution of this idea: from the open source insurgency that started in Iraq, to the ways in which that was applied to protest movements like Occupy Wall Street, the Tea Party movement, and protests from around the world, and then into open source politics, where he sees that Donald Trump was benefiting from that. In much the same way that the insurgency within Iraq was something like 70 different groups that all wanted the occupying American forces kicked out of Iraq, there were a lot of people who wanted to disrupt the overall political and economic system, and they saw that Donald Trump was this disruptive force. And there was something about that movement that made Trump immune from any of the personal things that he was doing as an individual, because he became more of a symbol of this larger movement of resistance, trying to create chaos and disruption within the overall system. And then, moving into the next phase of networked tribalism, more groups are coming together because information can spread so quickly, because of the filter bubble dynamics, and because of the increased amount of media bias within different media organizations. There's a report put out by Ad Fontes Media that tries to map out the bias of media organizations across two spectrums: one is whether they lean left or right, and the other is whether they're reliable or unreliable.
Most of the mainstream media they rate as slightly to the left but also reliable, but there's this whole other media ecosystem of entities out there putting forth a whole range of politically partisan, polarized information that isn't necessarily reliable, or taking in both sides to account for alternative perspectives. And so it helps to reinforce this confirmation bias dynamic that we have within our political ecosystem. One of the things that John Robb is particularly concerned about is the degree to which some of these decisions are made and then implemented into the algorithms. And then, from a Marshall McLuhan perspective, it's the technological implementation of those algorithms that makes the society into a technological artifact. So the technology is reinforcing the culture that can emerge. And if one side is completely eliminated, then what does that mean for having the freedom of expression of ideas that enables these debates? If you have complete phase alignment, then you move more towards the model of China, where the government has instituted the Great Firewall to restrict the amount of dissent happening within the country, and you get a stagnation within the culture. The other alternative is something like Russia, where they have a corrupt core in the center but utter chaos on the outside, and so they try to make it as disruptive as possible. That was one of the areas that was actually interesting to hear from John Robb, from his military analyst background: that there's this dialectic between order versus chaos, or dissent versus consensus. So on one side, Donald Trump was trying to just be a disruptive force, using what he calls maneuver-based warfare or blitzkrieg warfare, which is to sow confusion and chaos and to do everything you can to be a disruptive force.
But that's not necessarily a model that's going to build up much consensus, other than to have people fight against that type of strategy. So the fear, I guess, is that the consensus is going to get to the point where some of these morals or values are implemented in a way that is more and more like China, where any sort of dissent is not going to be tolerated, and you start to have the algorithms automatically eliminating people from the discourse and the discussion. Now, when you start to lay this out on that spectrum, one of the things that I have to bring in here is that the First Amendment is not unconditional. It's not limitless. There are certain conditions on it. First of all, these companies are private companies, so these are not public spaces, they're private spaces. And so, at the end of the day, these companies can do whatever they want to moderate the speech that happens on their platforms. The issue is that they have so much monopoly power, and they've had these anti-competitive behaviors where they've been eliminating the competition, so that they end up being the only communication platforms out there. Because there are no other alternatives, when you're censored from these platforms it feels like you're being completely cut off from the larger public discourse, because they've been treated as almost a de facto public square, when in actuality it's these private companies that are controlling speech. So there's a larger issue there within the dynamics of antitrust and enforcing antitrust. Since the Microsoft court case, there really hasn't been a big antitrust enforcement. There's a lot of legislation happening now, but I think that's a part of the dynamic there.
But even if there were a perfect implementation of the First Amendment and free speech across all these different platforms, there's still the issue of exemptions from freedom of speech, whether it's fighting words, hate speech, harmful speech, or preventing other people from speaking. There are restrictions on inciting violence, obscenity, child pornography, threatening the president of the United States, or speech owned by others. So there's a whole list of different types of speech that are exempted from your free speech rights, whether it's the government or not. And so each of these companies has its terms of service, its code of conduct, its community guidelines to regulate all these other aspects of unprotected speech. But when you have one political side that is deliberately spreading a lot of misinformation and disinformation with no basis in fact, then what type of moral responsibility do they have? I think there are a lot of different ways in which these companies have tried not to say that they're a publisher, but in instances like the election, they do realize that there are some implications there. So this is a battle that's going to continue to play out, and certainly a lot of different grievances have been coming up around this. But when it comes to inciting violence and hate speech and things like that, well, there are limits to free speech. And so a lot of the ways in which one political side is getting censored is because they're often falling into these categories of unprotected speech. Now these tech companies have so much power over our culture, and their different algorithmic decisions can potentially change the outcome of elections.
And whether or not we actually even want to have companies with that much power: should we be decentralizing and breaking up some of these platforms, to have more diversity of competitors, and what Cory Doctorow calls adversarial interoperability? That's something John Robb went back to again and again in this conversation, the fact that we need to own our own data. If we have that type of data sovereignty, then that could encourage adversarial interoperability, where we're able to interoperate between these big social networking platforms. So, you know, this is a dynamic that's starting now with technology, but it will continue to unfold when it comes to the future of immersive technologies. Do we want to live in a world where the same problems that we have now are amplified even worse, when we have even fewer big tech companies in charge of the underlying architecture of our communication platforms? One of the things that John Robb says is that it's a civilization killer if you have too much alignment on one side and the elimination of this natural dialectic that happens through the expression of freedom of speech. But I'd also say that there's another extreme, which is when you have complete utter chaos, and people with bad intentions are deliberately putting out misinformation and disinformation. That's also a civilization killer.
So I guess, you know, going back to these dialectics between order versus chaos, or dissent versus consensus: there needs to be, one way or another, a balance between the two, where you're able to have a core agreement about the nature of reality and the nature of the facts, but also allow dissent to come in and shift it; and when things get too chaotic, with no order at all, then you move back to having some level of agreement across the different factions of political parties. There's a healthy balance of not having complete phase alignment, but allowing for that dissent to come in. And I guess the fear is the ways in which that dissent is being eliminated from the overall discourse. So there's a series of sensemaking talks that John Robb gave at the STOA, which is a YouTube channel featuring a lot of philosophers talking about contemporary issues. On October 3rd they published the whole series that John Robb had given back in August, talking about this networked tribalism and going through his theory about all of this. And if people are interested in hearing more, I recommend checking that out, and also checking out his Patreon to get some of the reports that he's been writing up on all this stuff as well. In that whole series, John Robb goes into a lot more detail on this framework of networked tribalism. A key paper that he seems to be referencing is David Ronfeldt's paper from 1996, "Tribes, Institutions, Markets, Networks: A Framework About Societal Evolution," where he talks about the kinship-based tribes, the hierarchical institutions, the competitive-exchange market, and the collaborative network. So that's the framework that John Robb's looking at. I look at a similar framework that you could sort of map over.
Lessig talks about, in his pathetic dot theory, the technology, architecture, and code, which are probably reflected by these communications networks, whether centralized or decentralized. Either way, you're having these technologically mediated networks that are able to connect people and cultures together. There's also the market, which maps to the competitive-exchange markets. There are similarities, certainly, between the laws and the institutions and the bureaucracies and the different orders that they have there. And then there's the kinship-based tribe, which I think probably maps over most closely to the culture, to the different cultural aspects of people coming together and different ways of creating cultural artifacts and whatnot. The word tribe sort of implies more of this in-group mentality. There have been a lot of reflections on even just the use of the word tribe, by historians like Christopher Lowe, who wrote an article in Tolerance Magazine in 2001 called The Trouble with Tribe, where he goes into all the different baggage associated with a term like tribe. So in my mind, I tend to just prefer culture, because it covers any sort of affinity group and community coming together, rather than adding this troublesome language around tribe. I'm not sure I have an alternative word that actually encompasses all the nuances of that, but I just wanted to point that out as well. It certainly goes back to Marshall McLuhan and a lot of the work that he was doing with tribalism; he talked about that as well. But, you know, really differentiating what the nature of those tribes is and what it is that evolves through these different phases. Is it a more exalted version of collective action? What is that called? Deliberative democracy, or communities, affinity groups? You know, trying to really isolate the boundaries of this behavior that he's seeing and what it's really exhibiting.
But I think you could also just reflect upon the existing levels of political polarization and the filter bubbles. There is this larger political context and culture of political polarization. But also, from a technology policy level, how do you come up with ways, on the technology side, to try to mitigate this larger cultural issue? A cultural issue that has other vectors of influence driving these types of behavior. For me, I think it's important to open up the dialogue and have conversations like this, trying to really listen to the different perspectives and sort through where there's overlap, where there's agreement, and where you disagree. There are different nuanced details in the conversation that I had with John Robb where I disagree with some of his assessments. One small example would be him saying that he sees the left as a lot more advanced when it comes to their networked tribalism behaviors. But I'd say, well, the right has had a lot of politically polarized media for decades now, and their more conservative media has been there for a lot longer. So I'm not sure that you can point to one side or the other and say they're better or worse, more advanced or less advanced. It seems to be, if anything, equal, or potentially the opposite: the right has been cultivating this type of partisanship for a lot longer, especially when you go back to 1994 and see the different strategies of political partisanship that were implemented, and the research that other academics have been doing going back to 1973 showing an increasing level of political polarization on both sides.
So there are lots of other little nuanced things that I could go through, every little detail where I disagree with John, but I think the larger point is this framework for trying to understand the dynamics that are happening at the cultural and political level, how that's going to start to influence technology policy, and thinking about the algorithms that you're creating as this technological artifact that reflects the society itself. And if you're trying to moderate a number of different conversations, then how can you ensure that you're not unduly eliminating perspectives that may be completely valid? There's the concern that he brought up: what about the buildup to the war in Iraq? What if anything deviating from the claim that Saddam Hussein had weapons of mass destruction had been algorithmically eliminated? It was functionally eliminated by the media already, but in cases like that there's a real need for dissent to have a calibrating impact on whatever consensus is made. To what degree do you try to prevent the cultivation and development of something that at the beginning may seem completely insignificant, like QAnon or Flat Earthers, but may potentially be pointing towards a larger cultural movement of distrust over the basis of what the common facts are that we need to manage a society as we move forward? It's a deeper question in terms of, you know, how do you start to bridge this polarized gap? But I think generally it's going to require people to be somewhat comfortable with paradox and not knowing, to be able to handle conflicting information, and to triangulate many different pieces of information to try to see what the most likely theory is that can help you navigate all this information, because it's impossible for one person to believe all truths and avoid all falsehoods. One of the things I said was that we can rely upon the processes of science to understand the nature of truth and epistemology, and how that works through a dialectical process. John said that's great, but that's not how people make their social decisions. But I think we're actually moving into a world where we're going to have to get a lot more sophisticated with our own methods of epistemology, of trying to discern for ourselves what the truth is and trying to weigh all this different information. It's sort of this folk epistemology that's already happening within the context of being online. But how do you have it in a way where you have a balanced ingest of information? It goes back to the 2011 TED talk that Eli Pariser gave about filter bubbles, talking about how it's not just about feeding us exactly what we want to hear based upon our past behaviors. It's almost like the metaphor of eating a lot of junk food: because you've eaten junk food in the past, you only get delivered more and more junk food. How do we start to pull in more of these aspirational aspects, to get information that's relevant, important, uncomfortable, challenging, and has a diversity of alternative points of view and perspectives? I think that's the real challenge: to not just look at our past behaviors and rely upon these filter bubbles to feed us all this information. And John Thune has the Filter Bubble Transparency Act, which tries to, number one, have the companies disclose when that type of algorithmic manipulation is happening, but also offer the option to turn it off, like Twitter does, where you can just have the chronological timeline rather than the algorithmically driven timeline.
What would it be like if you could turn that off on Google, Facebook, Twitter, or YouTube, and just get the ingest of whatever is most recent that you've requested, not relying upon all these recommended posts that the algorithm thinks you should be looking at, without really understanding the deeper, more aspirational aspects of yourself, so that you could have a healthy diet of different information? With Twitter, you have the ability to make lists, and I find that to be one of the most effective ways for me, because I can start to isolate: okay, now I'm going to go into this list of different conservatives, or of people who are political reporters, just to get different slices of different perspectives and start to triangulate from there. I find that to be a really helpful strategy for myself. But to really have that level of individual responsibility on this issue: there are cultural dynamics, yes, but there are also ways that each of us can try to break out of our own filter bubbles. And yeah, I find myself often fighting against the algorithms in that way, just because I'll create different lists and then, on my Twitter timeline, Twitter will start to put information from these different lists into my main timeline, which just sort of confuses the different ways in which I'm able to filter all this information. So anyway, I think this is going to be a more and more important issue as we move forward, this concept of networked tribalism, or whatever you end up wanting to call it: the kind of mob mentality and group behaviors online that are driven by these different filter bubbles. And yeah, I think it's important to help make sense of what's already happening, and then how to potentially use the different vectors of the culture, as well as your personal responsibility, the technology policy, and the technological algorithms themselves, to help shift some of this as well.
So that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com/voicesofvr. Thanks for listening.
