Tech dystopia: 7 ways to help swing the pendulum back
Written by Anne Collier
Technology is neither all good nor all bad. Neither is it neutral, as technology historian Melvin Kranzberg famously said. But that’s another story. This one’s about how we view technology as a society (and as a planet being connected by it). With national elections, human civility and data privacy seemingly out of our control, the societal pendulum has swung from utopian (remember when “connecting the world” sounded so positive?) to a much darker, dystopian view. Thanks to the news and our negativity bias, when we hear of an airline crash, for example, we extrapolate that air travel is really dangerous – when it’s actually one of the safest modes of travel.
“Our view of the world has a fundamental tendency to tilt toward the negative,” says social psychologist Alison Ledgerwood in a TEDxUCDavis talk. And especially now and where tech’s concerned, we’re all up in our lizard brains. We’re walking amygdalas.
We need to get the pendulum back down to that middle point. That’s why I like the name “All Tech Is Human” – because humans, like the tech they create, are flawed, and that’s okay – when we create and occupy space for learning, humility and listening to one another. We know consensus building is good and diversity is good – we need all perspectives and stakeholders in the same room, because both our problems and their solutions are more complex than ever.
“The Valley is right now like a patient who’s just received a grave diagnosis,” former Obama White House tech policy adviser Aden Van Noppen told New Yorker writer Andrew Marantz. She said there’s a type of person who reacts to that grave diagnosis by staying in denial – trying to keep anyone from knowing there’s a problem. “Then there’s the type who wants to treat the symptoms – quickly and superficially” – hoping the problem will go away on its own. “And there’s a third group that wants to find a cure.” She’s suggesting that the third group is the right one, right? “Finding a cure.”
But here’s what I want to suggest: that, actually, is the problem. People frantically seek a cure only when they’ve bought into the assumption that the situation is grave – when we’re thinking from our amygdalas. The amygdala is not the center of our creativity.
I suggest that we need to step back and examine that assumption together. That’s never easy, but I think it’s even harder in a filter bubble where the lights are out – where people in tech are listening to very dark, one-sided news about their work and hearing fellow people in tech (or former colleagues) share their own fears about being on the “wrong side of history.” I suggest that even the third group, the one that sincerely “wants to find a cure,” can drink too much of the dystopian Kool-Aid.
So what is the middle way between utopia and dystopia? For one thing, it includes thoughtful tech use and thoughtful tech development. On the user side, it’s very individual – a big reason why we need to listen to each other a lot. It’s also situational (in time) and contextual (or environmental). No matter how hard we try, we won’t make it perfectly humane for everybody. So there are actually a number of middle grounds. For now, as talking points, here are 7:
1. Humility and mindfulness: modeling them as well as calling for them – recognizing that there is no single solution, such as prohibition or digital detox (what about judgment detox?), and no one place to assign responsibility or single source of understanding. This is a social media environment we’re talking about. We need to stop with the shaming and blaming, because it breaks down civil, productive engagement. So the next one, logically, is...

2. Listening, not just messaging: from tech to policymakers to advocates, listening to the people we’re serving. Prevention education has its limits, especially if it fails to check in with the learners, its so-called beneficiaries. As a youth advocate, I’m biased: I feel that HAS to include people under 18.

3. Balancing ALL 3 categories of digital rights for all users, not just children. Even though the three categories of rights are laid out in the UN Convention on the Rights of the Child (which has been ratified by every country on the planet except the United States), they apply to all users: rights of provision (education) and participation (conscience, expression and association) as WELL as protection – the “3 Ps.” Too often, societies throughout the planet focus almost exclusively on protection rights, to the detriment of people’s participation rights and excluding the views of those whom they would protect. Having input on policy that would affect you is a fundamental human right of everyone, regardless of age.

4. Inclusion, not separation: including all stakeholders, skill sets and perspectives in solution development, not blaming, vilifying or demonizing any one of them. The adversarial approach no longer serves. What is the value of demonizing tech and perpetuating moral panic, except for opportunistic policymakers to gain votes or NGOs to raise funds? But they do need to be in the room, so they can hear the views of those whom they intend to serve and those with alternative solutions.

5. Education, or prevention, plus intervention: it’s neither all cops, lawyers and content moderation on the intervention side nor all user education on the prevention side. It’s all of the above. And education means giving equal weight to all three literacies of our very social digital media environment: social literacy, digital literacy and media literacy.

6. Humanity PLUS technology: that doesn’t let platforms off the hook. It’s just that, in a very social media environment, we all have to take responsibility for reducing hate speech and other problems online and offline – for making tech truly useful and safe for everybody involved.

7. A middle layer of user care: I mean a middle layer between help in the cloud, such as abuse reporting to the platforms, and help on the ground, from professional specialists such as 911, the Suicide Prevention Lifeline, Crisis Text Line, etc. This middle layer helps users as well as the professionals both on the ground and in the cloud. It knows both landscapes. It offers credibility that corporations, by definition, can’t enjoy. It reduces false positives in content moderation, and more – I can explain any of that if you’re interested. It’s growing all over the planet in an ad hoc way, in the forms of Internet helplines, Article 19’s proposed Social Media Councils, Facebook’s developing Oversight Board, and new Internet-related government agencies and quasi-governmental entities such as Europe’s deletion centers, Australia’s eSafety Commissioner’s Office and New Zealand’s Netsafe. This, I suggest, is what needs more news coverage – so we can start working together to build it out in a conscious way.
The middle way has never been particularly exciting or sexy, but it seems to be gaining traction as it challenges current assumptions and stands out in ever starker contrast to adversarial attitudes, obsolete “solutions” and dark predictions. It definitely seems worth exploring, don’t you think?
Writer and youth advocate Anne Collier has been chronicling the public discussion about youth and digital media since 1997. She is founder and executive director of national nonprofit organization The Net Safety Collaborative (TNSC), whose main project is a social media helpline for schools. Anne is an advisor for All Tech Is Human.