Tech: Existential Threat or Life Support?

Tech totalitarianism – is that our future? In Covid times, Big Tech is accruing more and more of our money, time and attention. Along the way, the world’s most rapacious corporations are gobbling up our data and our privacy too. In this conversation on The Laura Flanders Show, three tech activists debate the threats, the possibilities and the power that still lies in our hands to break up the monopolies, resist the big brothers, and harness anti-human technology for human ends. 

As Doug Rushkoff (Team Human) says on the show, “It’s not that this is what the technology can do, rather this is what the shareholders of this company are allowing this technology to do.” 

Guests:

  • Douglas Rushkoff, Author, Team Human
  • Nabil Hassein, Technologist, Educator and Organizer; Co-organizer, Code Ecologies, School for Poetic Computation (SFPC)
  • Chancey Fleet, Fellow, Data & Society; 2017 Library Journal Mover and Shaker

This conversation is even more timely now than it was when it was recorded in March 2019.

Transcript

Laura Flanders:             Days of wonder. Days of rage. We are living through an information revolution as profound as any since Gutenberg and the printing press. But are we the people, the so-called users, using the tech, or is the tech using us, and for what ends? Can we harness this particular transformation for good? That’s the question. And we have some answers this week on The Laura Flanders Show. The place where the people who say it can’t be done take a back seat to the people who are doing it. Welcome.

Laura Flanders:             Alexa, Siri, are you out there? Just who is in our audience? By 2021, the research firm Ovum reports, there will be almost as many voice-activated assistants on the planet as there are people. While it took three decades for mobile phones to outnumber humans, Alexa and her ilk look likely to get there in less than half that time. That’s what they say. So what happens to us? New tools have always ushered in new economic and social orders, from the printing press to the pilotless drone. But does today’s smart tech pose a threat to us or a promise? As one of my guests today has written, what if we thought of the future as a verb? It’s ours, at least until it’s theirs.

Laura Flanders:             Joining me today is Douglas Rushkoff, host of Team Human and the aforementioned writer; he is the author of a new book by the same name. Also, Nabil Hassein, technologist and abolitionist, who’s worked with the School for Poetic Computation, about which more later. And last, Chancey Fleet, disability justice advocate and fellow with Data & Society, working at the intersections of disability and technology. Welcome, all.

Laura Flanders:             So, Doug, you lay out a pretty grim scenario of technology being, as you put it, anti-human by design. How so?

Doug Rushkoff:             Well, I don’t know that people come up with technologies in an anti-human way knowingly. Maybe even Mark Zuckerberg and the Google boys were first thinking, we’re going to make something that helps people accomplish this or that. But once they really have to make a whole lot of money with these technologies, once they’ve taken on investment and they’re expected to deliver a 1,000x return, what happens is the technologies end up really playing the people instead of people playing the technology.

Laura Flanders:             You’re talking about embedded in the actual code?

Doug Rushkoff:             Yeah.

Laura Flanders:             Wow.

Doug Rushkoff:             Well, if you think of an algorithm, an algorithm on Facebook is using data from your past to figure out which psychological, statistical bucket to put you in, in order to know which behavioral finance techniques to use to get you to do what it has been told to get you to do.

Laura Flanders:             And how different is that from other technology? I mean, as I’ve said, we’ve had lots of new technology. It’s all changed our lives.

Doug Rushkoff:             Well, I guess the main difference with these computational technologies is that they learn from what you do and adapt to your defenses. So this is the first time. I mean, with a television ad, I guess they could see if it worked or not, and then next quarter try something else. The algorithm is going to try something else the next second, and the next second, and the next second. And then once it finds something that works, it’s going to tell all the other algorithms: oh look, this works on this human, you try it on your humans. What these technologies learn to exploit are our painstakingly evolved social mechanisms. All the mechanisms that we’ve developed to establish rapport with one another, our reciprocal altruism.

Doug Rushkoff:             If they can figure out a way to make an algorithm cry like a little baby and elicit my maternal instinct to heal it, they’ll use it.

Laura Flanders:             That’s bad?

Doug Rushkoff:             That is bad. The reason why it’s bad is because instead of helping me connect with other people, it’s helping corporations extract data from me, extract money from me and get me to behave more predictably. Right? So as the algorithms and smartphones get smarter about us, we get dumber about them. And our behavior becomes more programmed, more automatic, less human.
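[A sketch to make the exchange above concrete: the bucket-then-adapt loop Rushkoff describes, written as a toy Python program. Everything here, the bucket logic, the tactic names, the update rule, is invented for illustration; it is not the code of Facebook or any real platform.]

```python
# Toy illustration of the loop described above: sort a user into a
# statistical bucket based on past behavior, pick the persuasion tactic
# that has worked best on that bucket so far, then feed the outcome back
# so the "lesson" transfers to similar users. All names are hypothetical.
from collections import defaultdict

TACTICS = ["scarcity_banner", "social_proof_popup", "streak_reminder"]

# success_rate[bucket][tactic] -> running estimate of click-through rate
success_rate = defaultdict(lambda: {t: 0.0 for t in TACTICS})

def bucket_for(user_history):
    """Crude 'psychological bucket': the user's dominant past activity."""
    return max(set(user_history), key=user_history.count)

def choose_tactic(user_history):
    """Pick whichever tactic currently scores best for this bucket."""
    bucket = bucket_for(user_history)
    return max(TACTICS, key=lambda t: success_rate[bucket][t])

def record_outcome(user_history, tactic, clicked):
    """Update the shared estimate: what works on one human in a bucket
    is immediately favored for every other human in that bucket."""
    bucket = bucket_for(user_history)
    old = success_rate[bucket][tactic]
    success_rate[bucket][tactic] = 0.9 * old + 0.1 * (1.0 if clicked else 0.0)
```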

Laura Flanders:             All right. So you’ve heard his case. Chancey and Nabil, who wants to jump in here? Nabil?

Nabil Hassein:               Yeah, I mean, I think technology has a class character. I mean, when I call myself a technologist, I use the term very broadly; I consider writing a technology, for example. And I trust this technology that I’m using right now a lot more than I trust my smartphone or my laptop. Different cultures have had, and still have, different technologies. And the specific computer technology that we have today emerged from imperialism, from the military-industrial complex of the US and UK and other related imperialist powers. They still retain great control over the infrastructure that underlies this, whether that’s the undersea cables that transmit data back and forth across the world, the physical supply chains of manufacturing, and so forth. So I would say this technology is very much under the control of people who don’t have the interests of me or my communities at heart.

Laura Flanders:             All right. So that still speaks to control. Chancey, to you. What about this technological existential threat?

Chancey Fleet:             I’m checking this out through the lens of accessibility. I work with communities of disability, many of whom, like myself, are happy to use all kinds of platforms, whether it be computers, smartphones, Braille displays, or slates and styluses, which are how Louis Braille would have written Braille: analog and digital technologies. Many of us are comfortable with a diversity of those.

Chancey Fleet:             But there is this mainstream narrative, and there is some grounding for it, that conversational interfaces like Siri and Alexa are gateways for people who have disabilities that impact motor interaction, impact typical workflows, or impact someone’s desire to avoid complexity.

Laura Flanders:             So you mean like a gateway in, not a gatekeeper out.

Chancey Fleet:             A gateway in, exactly. And I’ve seen folks with motor and speech impairments and memory impairments have a great deal of success with these interfaces, but I’m troubled in a couple of ways. For one thing, that mainstream media narrative misses the fact that it’s a small curated feature set that really hangs on habits of consumption more than habits of creation or even habits of communication.

Chancey Fleet:             It would be awfully hard to write a novel with Alexa, and by suggesting that folks in our community who are challenged by the limitations in contemporary technology design should just gravitate to these conversational assistants, they are suggesting that because of the way we are embodied, we ought to do less. The other thing that I find a little bit problematic is that we don’t actually need to submit our conversations to a server in order to have workable conversational interfaces. And I think we need to disentangle the ability to use a different workflow from the necessity to submit ourselves and our lives to inspection by the developers in the cloud.

Nabil Hassein:               Definitely. From a technical perspective, there’s no reason that these technologies have to work in this way. It’s that Google and Amazon want them to work in this way for their own data collection and surveillance, resource extraction, and exploitation of us as consumers, as workers.

Laura Flanders:             So that’s where Team Human comes in?

Doug Rushkoff:             Right. I mean, it’s really easy, especially for those of us who are not data scientists or computer engineers, to accept these technologies at face value. It’s really difficult for us not to. We think, “Oh, this is what the technology can do,” rather than, “Oh, this is what the shareholders of this company are allowing this technology to do.” If anything, most of us look at the tech and figure, “Oh, well, it’s just a technology, so it’s neutral.” It’s like, “Guns don’t kill people. People kill people.” Or, this technology is just what it is.

Doug Rushkoff:             But as we look at the algorithms and the way they work, what we find out is, oh no, they’re not neutral at all. These algorithms are racist. They don’t feel racism, but they have certain value systems embedded in them, often extractive ones, and they’re going to just do whatever we tell them to do. I mean, sadly or not, it’s true of anything. It’s true even of speech, of text: the language we use to speak is embedded with values, but those don’t change as dramatically or as intentionally.

Laura Flanders:             One thing that I’ve been thinking about is our kind of basic myth around human evolution, the sort of survival of the fittest model that is now being questioned at the level of evolutionary science. How could we flip the script on how this technology develops going forward? Could we embed cooperation, not competition?

Doug Rushkoff:             Yeah, I mean, absolutely. If anything, the networking technology emerged as a collaborative effort. It was a way for terminals to share computing resources. That was how we started with it. And certainly we can flip the script on how we understand evolution. All we have to do is read Darwin and find out, “Oh, this isn’t the story of competing individuals. This is the story of how species learned to collaborate and cooperate.”

Laura Flanders:             Is that poetic computation? Is that what the institute is all about?

Nabil Hassein:               Yeah, maybe I’ll just speak a little bit about that very briefly. So the School for Poetic Computation is an independent, artist-run tech school here in New York. One of the things that we try to do at SFPC, as we call it, is to think about the other possibilities for technology, for our communities, and for what could be done as opposed to only what has been done. For example, I took two weeks of evening classes there one summer, in 2017, and I made a rhyme generator based on the rhymes of my favorite rapper, MF Doom. And I used a software library.

Nabil Hassein:               A library is code that someone else has basically packaged up, free to use in your own program. This one is called pronouncing, and it’s based on a pronouncing dictionary that was funded by the US government, by DARPA, like the Internet and many other important projects in computing history.

Laura Flanders:             Defense Department?

Nabil Hassein:               Mm-hmm (affirmative). Yeah. And this pronouncing dictionary only represents general American English in its list of pronunciations for a given word, which is not how this artist pronounces words, for the most part. And so biases that were embedded in this technology before I was born, by the military, ended up playing out in my art project. There are a lot of ways that these things tend to accumulate over time. But these decisions could have been made differently, and even now it’s not too late.
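[For readers who want to see what this looks like in practice, here is a minimal sketch of a rhyme generator in the spirit of the one Nabil describes. It uses the real pronouncing library, which wraps the CMU Pronouncing Dictionary; the sample line and the word-handling are our own illustration, not his actual project code.]

```python
# Minimal rhyme-generator sketch using the `pronouncing` library
# (pip install pronouncing). The library looks words up in the CMU
# Pronouncing Dictionary, which records general-American pronunciations,
# which is exactly the bias Nabil describes above.
import random
import pronouncing

def rhyming_words(word, limit=10):
    """Return up to `limit` dictionary words that rhyme with `word`."""
    rhymes = pronouncing.rhymes(word.lower())
    return random.sample(rhymes, min(limit, len(rhymes)))

# Hypothetical sample line; a real project would read in a lyrics corpus.
line = "operating on a higher plane"
print(rhyming_words(line.split()[-1]))
# Rhymes that depend on a non-general-American accent won't be found,
# because the underlying dictionary never recorded those pronunciations.
```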

Laura Flanders:             And Chancey, you and your working dog who just wanted to be part of the conversation there. What would you like to see?

Chancey Fleet:             What we need are strong, solid allies. We need developers to really know something about accessibility and to challenge the notion that, for example, some things are too visual. I’ll give you an example. At the New York Public Library where I work, we have a program called Dimensions and we can teach blind and sighted people to work together to create maps and art and infographics with raised line graphics. And anyone can come and learn and we do it collaboratively. We’re troubling that notion that some things are just too visual. Any spatial information we can render. But if I spoke to a random person in a Best Buy or a graphic design school that I wanted to enter, I think that they would tell me that I might be happier doing something else. And I’d like that to stop.

Chancey Fleet:             The other thing that we really need is a strong core of people who can help you when your abilities are changing, or when your ambitions are changing, to understand how to use the technology you want rather than passively receive the technology you’ve been prescribed. When you walk into a Best Buy, or into a computer lab in an average place, and the way you use your tools is not average, folks will tell you that they don’t know how to help you and that your path is probably hard. But if you can find people who understand the workflows, who have had the problems and knocked down the barriers, they can show you how to do the same.

Chancey Fleet:             Another thing we do at the library is one-on-one coaching that’s powered by peers who are native users of the technology. And we take the mystery away. So when you take the mystery away, you can get on with your work. This goes back to actual people.

Doug Rushkoff:             Right. When I used to teach at NYU’s Interactive Telecommunications Program, we started an assistive technologies course there, and what we very quickly learned was that as you make technology more friendly to potentially disabled people, you make it more friendly to everybody.

Laura Flanders:             We are all potentially disabled.

Doug Rushkoff:             Exactly, that too. But you actually make the technology better. And it’s not just because of better user interfaces, but because your orientation has changed from “what can I do to this person with this technology?” to “what am I allowing this person to do with the technology?” And that’s not the bias that Silicon Valley has right now.

Laura Flanders:             Last word from you, Nabil.

Nabil Hassein:               Yeah. I just want to highlight some of the work that’s going on towards these ends. I don’t know how many folks are familiar with a group called Cooperation Jackson, but they’re a bunch of organizers down in Jackson, Mississippi, which is a poor, probably 80% Black city in the US Deep South. They have a whole range of initiatives around this, including, what is it, the community production initiative?

Laura Flanders:             Maker Lab.

Nabil Hassein:               Yeah. They’re definitely in need of funding in order to basically take a greater portion of this technology’s development away from Silicon Valley, and away from some of these other entities, and put it into the hands of their community, to serve their own goals. And I believe technology can be used for this. It’s obviously not the way that the capitalists who predominantly control society’s resources choose to invest them. But there’s nothing that prevents us from doing it to the extent that we can, and basically expropriating them, taking their wealth, and then re-allocating it towards a different end.

Laura Flanders:             Perfect. We’re going to have to close it there, but clearly we’ve just started this conversation. Nabil, Doug, Chancey, thanks so much for coming in.

Laura Flanders:             If you want to find out more about Cooperation Jackson, you can at our website. We were pretty much there at the birth and have done a lot of coverage. That’s lauraflanders.org. So, thank you all. It’s great to have you.

Nabil Hassein:               Thank you.

Laura Flanders:             I appreciate it.

Laura Flanders:             As we’ve heard, the new digital age poses as much a threat as a boon to our liberty and sense of self and society. We’re all suffering from what our guest Doug Rushkoff calls present shock, an onslaught of anti-human tech. Fortunately, he says, there is a solution, and an opportunity lies before us to create more space for people to be people by embracing and re-vivifying the present. The choice is ours. Here’s Doug.

Doug Rushkoff:             When I first encountered the Internet, I thought it was going to set us all free. We were all going to work at home in our underwear, trading things back and forth. It was a world where anything was possible, and we were going to do it in our own time. But instead, we strap these devices to ourselves and have them ping us every time somebody texts or messages us, as if we’re supposed to respond to everything as if it’s a real-time emergency crisis, and that’s what’s put us in this state of present shock. The only kinds of people who used to be interrupted this frequently and this incessantly were 911 operators and air traffic controllers, and they would only do it for two or three hours during the day. And they would be medicated in order to live that way.

Doug Rushkoff:             Present shock, present shock, present shock, is the human response to living in a world where everything happens now. It’s a real-time, always-on existence without any sense of beginning, middle or end. It’s just now. On the one hand, you could just go into the moment. You could have this kind of Tao-like sense of peace: here we are, I’m in the present. But most of us aren’t there. Most of us are, instead, chasing this kind of false now of our Twitter feeds and our email inboxes, trying to catch up with the moment as if the present wasn’t something we live in, but something we had to grasp at. And the problem for us is the inability to be in touch with any of the natural rhythms that underlie our human experience.

Doug Rushkoff:             Okay. The ancient Greeks had two words for time. Chronos, which means time on the clock, and Kairos, which means human timing. So you could ask, what time did you crash the car? I crashed it at 4:01. But what time do you tell dad you crashed the car? 4:17? No, you tell dad you crashed the car after he’s had his drink and before he’s opened the bills. That’s Kairos. It’s the sense of readiness, or human timing, something that only people can understand as we move through the temporal landscape of human experience.

Doug Rushkoff:             The industrial age was all about Chronos. Time is money, do things faster, increase your production over time. What we’ve ended up doing, really, is taking a 21st-century technology and using it to reinforce a 13th-century operating system. This is really the central problem of this age: this conflation and confusion between Kairos and Chronos is the use of technology to take people out of the time that only people can understand.

Doug Rushkoff:             We’re spending an increasing amount of our time on digital landscapes about which we know little or nothing, and we treat the web and these devices as if they’re preexisting conditions of nature, but they’re not. They’re platforms that were designed by people in corporations with very specific designs on who we are and what we do. Now, if you ask a kid, “What’s Facebook for?” they’ll tell you, “Oh, Facebook is here to help me make friends.” If you go to the boardroom at Facebook, I promise you they’re not sitting there thinking, how are we going to help little Johnny maintain his friendships? No, they’re looking at how they’re going to monetize Johnny’s social graph and its big data. This is why we’re getting such unpredictable results with our technology. We’re incorporating these platforms into our lives without any real sense of who made them and what they made them for. And if you don’t know what a program you’re using is for, then chances are it’s using you instead.

Doug Rushkoff:             So I think the easiest way to contend with present shock is to embrace the present: find the present, explore and reify the rhythms that are informing who you are as a person and as a human organism just living on the planet. Most people don’t realize that each phase of the moon corresponds to a different neurotransmitter in the body.

Doug Rushkoff:             The first week of a new moon, a body tends to be dominated by acetylcholine, which is a very specific neurotransmitter associated with new ideas and making new friends and being open minded.

Doug Rushkoff:             If you’re in the second week of a moon, you tend to be dominated by serotonin, which is all about getting things done and being industrious and reaching conclusions. If you’re in the third week right after the full moon, you’re dominated by dopamine. It’s really the party neurotransmitter. You want to relax and enjoy people not work, right? You’re not about getting done.

Doug Rushkoff:             If you’re in the last week of a moon, you tend to be dominated by norepinephrine. It’s a very analytical chemical. One that’s associated with organizing things, with moving above the situation and seeing what happens when, how do I sequence this? How should I plan my life? Where do I put everything? It’s kind of a colder state that’s not really about bonding with other people and much more about figuring out the structures underlying things.

Doug Rushkoff:             The simplest things, day and night, seasons of the year, phases of the moon. These are how human beings grew up. This is what predated civilization. This is what makes us feel at home on planet Earth and it’s what can give us coherence to help find the rhythms by which people really live, to connect to the rhythms that everybody else is living and no longer be victims of present shock.

Doug Rushkoff:             Present shock is disorienting, I get that. Whenever we move from one technological age to another, there’s bound to be a bit of wobble. But this moment is also an opportunity. It’s an opportunity to use technology and humanity to embrace both kinds of time, to bring them together to create, if anything, a new synthesis in our understanding of time. What are the rhythms, what are the patterns underlying human experience, and how can we live in harmony with them rather than constantly trying to work against them?

Doug Rushkoff:             I think that we have a choice here. We can use these technologies in the wrong way, I’d argue, to create ever more conformity, more schedules, more restrictions, more regimentation, more robotic activity for people, so that we’re more efficient and get more done. Or we can use these technologies instead to create more time, more space for people to be people, to free ourselves and write programs that allow us to return to human time and let our machines take care of Chronos for us.

Doug Rushkoff:             To restore the rhythms that give us coherence and that help us to rediscover one another culturally, socially, as really living organisms rather than just cogs in a machine. This could be the moment that we release ourselves from really 2,000 years of understanding time as a burden, understanding time as something that contains human beings, and instead see it as a partner. We have Chronos to keep track of what’s going on, and we have Kairos to actually live it. If we can do that, then this moment of present shock will be the moment we remember as the time that we set ourselves free.

Laura Flanders:             It’s that time of year again when we’re encouraged to celebrate Black history, and soon it will be Women’s History Month, the time when people like me ask, what about the rest of the year? This season, Black History Month coincides with the start of the presidential primary campaigns. On the Democratic side, we’re already seeing journalists stretching for their pencils to divvy the candidates up. So far, the main divides they’ve identified seem to stem from which of the contenders lead with race and gender justice and which want to stick it to the corporations.

Laura Flanders:             But those social-versus-economic distinctions aren’t going to hold up for long when every last Democrat, for all of their faults, is a civil rights paragon in contrast to the Klan-endorsed guy in the White House. No, this primary campaign is not going to be waged over where Democrats stand on things like abortion and marriage and voting rights, but rather over where they stand on property, and public ownership, and workplace democracy, and taxes. Much as they are out of practice, journalists are just going to have to grapple with economics. So far this hasn’t gotten more granular than asking candidates if they call themselves Democratic Socialists; we’re going to have to do a whole lot better than that.

Laura Flanders:             Just as we need to get beyond the superficial celebration of a few abolitionist heroes to take a long look at the ideas and assumptions that underpin white supremacy and patriarchy, so too we need to look at the lens through which we think about wealth. Conveniently, the questions aren’t all that different. Are our life choices and outcomes determined by ourselves alone, in an objective world free of bias? Are our successes affected only by our individual character, or also by social structures, things like systemic privilege? If the former, a few reforms will do. If the latter, we need to shake things up.

Laura Flanders:             Donald Trump is already throwing around invective about socialists. At the very least, we need a socialist history month, or an economic history month, to sort things out. That would do. Much better than that, though, we need to take a long, hard look at the ideology that underpins the system we call capitalism. On that, like it or not, the clock is quite clearly ticking, so if we have to have such a long campaign season, let’s make it a season of great debates.

Laura Flanders:             You can hear more of these commentaries and get them weekly in your inbox by signing up to be a subscriber to this program. You can do it at our website.
