🔑 Key Takeaways from the All Tech Is Human Library Podcast Series

The All Tech Is Human Library Podcast is a special 16-part series of rapid-fire, intimate conversations with academics, AI ethicists, activists, entrepreneurs, public interest technologists, and integrity workers, who help us answer: How do we build a responsible tech future?

Key Takeaways

  • The more open institutions and organizations are to change, the more collaboration can move us toward an internet that offers opportunity to all users: one where people do not fear voicing their opinions and concerns about societal issues, and where people have the option to create independence for themselves.

  • Technology is developing faster than it can be regulated. Institutions and companies have to do a better job of adapting and of creating digital infrastructures that can anticipate emergent technologies. People need to be at the heart of tech development.

  • Education is a big factor in preparing new generations for the impact they are going to have on the market and on culture at large. Specifically, education about the systems and structures that created our current tech ecosystem is imperative to building a stronger foundation for what comes next, and to avoiding the pitfalls of yesterday. This comes further into play when young people building new, exciting companies and startups put their employees' and their users' interests above anything else by centering their culture on digital responsibility and ethical participation.

  • Mainstream social media companies need to re-center human values. The commodification of interaction as a market force, and the resultant data economy behind it, must be reconsidered and redefined to allow for a true sense of connection and presence with peers. Consistent collaboration on impactful societal goals and issues is key to the development of social media platforms and all tech futures.

Quotes

I think we still need to maintain hope that we are moving in a good direction, and people still need to have a strong vision in terms of the things that they're accomplishing and the problems that they're solving. That should never go away because that really drives people. It motivates people to push forward on where the current status quo of things is. But I think equally what we need is if we do see that things are veering in a direction that it seems like there are some either intentional consequences and risks that are occurring or unintended consequences, that there are good systems of checks and balances and governance structures to have people be able to raise their hand and say, "Hey, let's think about this a little bit." [What are] the consequences of what we're doing. And people being able to have those hard conversations may reroute what you were thinking that you were gonna do. Because it has these costs, right? And we're just not willing to take on those costs because it's the process of doing business. So I think again, those fundamentals of thinking that way are really important. Of course, I [hope] certain existential threats and certain types of technology don't happen either...but hopefully, if you have the people there to say, Hey, this isn't the type of future we want. Let's think about an alternative. And that's actually taken seriously. [A culture] of questioning and again of alternatives...it's hard to say that, "Oh, the status quo has always made us all this money, and so we need to pursue it that way." Maybe it could be different. Why does it have to be that way if it's coming at these risks and these costs? So hopefully we'll build more cooperation, more unity amongst people to solve and tackle some of these big issues and these preexisting systems that push things in a certain direction. -Erika Cheung

================

I want people to - it sounds so silly but it's so deep - meet your neighbors. Share stuff with them. Again, another great Cory Doctorow quote: Rather than having a minimum viable product drill in every single house on your block, why don't we all buy one good metal, hard-hitting, fucking nice drill, keep it in Joe's garage, and that's where you go when you need a drill. I promise you there'll be enough drill power on your block, but there'll be more effective drill power on the block... and you're gonna have an excuse to go to Joe's house. You're gonna have an excuse to say, "Hey Joe, where's the drill? And he goes, "Oh, Seymour's got the drill now. Okay, see you, Joe. Oh yeah, have some cheese danish." It's such an easy, fun, wonderful life. The only obstacle to that, the only problem with that is, "Oh, what about the drill company? They're not gonna sell as many drills, uh-oh. What? And what does that mean? Oh." So we are in service to this economy that should be serving us, right? And digital helps create the illusion that this symbolic realm matters more than this one. And it doesn't. This is the realm where it matters. I think it's gonna be as easy as it starts with that...put one barbecue pit at the end of the block that people use instead of everyone having their own. And then a barbecue becomes this part of your community. Play cards with people, make love with people, [and] make eye contact with people. -Douglas Rushkoff

================

The world I believe in, [the world I want] to live in, is a world where the sum total result of the shapes that we've created are more of those. Like you look at this [artwork] here, and there's a beauty and wonder and it looks chaotic. It doesn't necessarily make sense, but it's within this ecosystem and this framework that makes sense of it with its boundaries. And there's that respect of the edges so that this can beautifully exist. That...I believe if we can just commit to that....I don't know what the policy needs to be, I don't know what bill needs to be signed. I don't know. All I know is that nature itself has done a pretty good job of making a great place for us to live and exist in. And if we can continue to support that reciprocal relationship with what we build and what we create and what we say and how we live, that's a future I'm down for. -Marshall Davis Jones

================

When we listen to the many voices in our own personal lives, it's hard because sometimes you just wanna talk. You want to…you just wanna say what you need to say…There's so many people you wanna be, right? So many people want power. I had these conversations with another…peer mentor and he reminded me people want power, but they don't want accountability. And so you have to be careful what you say. And the only way that you could be careful and cautious and intentional about what you say is if you know how to listen. -Dorothy Santos

================

For me, that world would look like using data to reduce pain and sadness and hurt and trauma because the world doesn't need any more of that. But what we do need would be an inclusive and equitable and justice-oriented future where people can get up every day and know that they belong and they are making a contribution that is going to be respected, celebrated, valued, and they are definitely going to be considered as someone worthy of this life as well. -Renee Cummings

================

[The future of technology] should inspire us to human creativity. It should let us do the things that bring us joy, whether that's art or literature. It should inspire us to create more and better and augmented things. It should create shared economic opportunity, which means that as these technologies are creating new models for how we might build economic systems, we need to move to a world where there's a cornucopia of bounty and that everybody shares in it equally. We need to live in a world where technology transforms political power. Where individuals understand what's happening [and] are able to use technology to influence policy because I have a deep hope and a confidence that if we empowered communities to be a part of shaping our shared destiny, we'd live in a world where all of these things I've just described would happen naturally. So really, to me, the best form of a technology future is one where people are at the center and have the ability to make decisions that create a dignified future for all of us. -Vilas Dhar

================

I think the better future online is in a lot of ways paralleled by a better future offline, right? A better future in the real world. I think tech companies need to be thinking more carefully about how they're designing for positive outcomes and for positive interventions when there are problems like violence, extremism, or bullying, or child sexual exploitation. How do I proactively design my system to enhance the good behaviors we want when it comes to the offline? Governments need to be addressing these really core problems that we see at the fundamentals of society. People who are feeling disengaged, who are feeling excluded, who are feeling lonely, who are feeling that they don't belong in society, but those are the people that need to be reached. Those are the challenges that need to be addressed. And until we address those in the offline world, we are gonna see these reflections of these problems and the amplification of these problems in the online world. -Tom Thorley

================

I'm definitely more of the anarchy, burn it down type person and start again. I think in order for us to really build a responsible and healthy tech future, even just an information ecosystem, we can't continue in the structures that we have. It doesn't allow for it. I think they're too rigid and I think we need to start, maybe it is a public interest internet that I know has been talked about and pushed forward. I think that may be the better route to take versus trying to change the system from within. -Diara J. Townes

================

And so what I see, if things got better...corporations wouldn't just be looking at...AI ethical risks...or just privacy or big data. They'd be looking at what I like to call "digital ethical risks," and what a robust, comprehensive program looks like that takes care of or addresses all the ethical risks, each of the ethical risks of those technologies, and the ways in which they relate. So the better future looks like one in which there are widely deployed, robust digital ethical risk programs [in] corporations around the world. -Reid Blackman

================

I think it's a tech future that doesn't feel like it's a necessary evil, or a lesser evil, or something you have to do. It's amazing that you know how social media, what we compromise with our data, what we compromise by watching those ads and feeling manipulated because we feel like we have to, right? And I feel like a tech future where we don't feel at all that we're making those compromises, that it feels organic to us, that it feels natural, that it's something that we would never feel and say is unnatural. And I know that sounds crazy, but when you connect with humane tech, you never for a second feel that it's not something that shouldn't exist in the world. And that's what I would hope for. -Gabo Arora

================

Well, the two things I brought up in the report [are] one, focus on accountability. I think for me it is we can hold these systems accountable, but we know who's accountable. I think right now we are all pointing fingers in different ways. We're like, "I'm accountable for protecting my data and I might not know how to do that." I think the government's accountable, the state's accountable, and companies are accountable...We want accountability, but we don't actually know how to drive that and where that comes from. So I think we learn who we're gonna point to in terms of accountability and how we get there. And I think a responsible tech future is dependent on us figuring that out. And then, the other part is...power to the people...The people are the voice, they are the heart and soul of all these movements. And I have so much faith in the people. There's no coincidence that the people changed id.me. There's no coincidence that so many of the things we're seeing right now, people have invested time and effort into moving these movements forward, and I believe they can have so much power when they are together.  And so I think the responsible tech future can look like many different things, but the two things that I really feel strongly say...understanding how to hold who's accountable and holding accountability, and then us leveraging our own power in that space to actually bring the actions and the future we want. -Amira Dhalla 

================

My best…near term vision is... before we even get to, some of these values, integrity, respect, like those are important and we should consider them. But before we even get to that point, we should be thinking about ‘What do we want out of society and the environment of people before we even decide to build technology, right? Are we trying to make healthcare access [easier] for society?’ If that's the case, there are probably two dozen different ways to do that off the top of my head. And maybe technology is one or two of those things, but technology is not always the answer. And then once we decide, ‘Okay, technology is going to help us achieve this kind of societal benefit, how do we do that in the best, most fair, most respectful way,’ and so forth. But it's very conscious, very thoughtful about why we are doing tech in the first place before we get to that kind of set of ethical questions. -Chris McClean

================

I would say that when people can start to look at things from a fresh, clear mind without being influenced or psychologically manipulated by technology, I would like to see a world like that. Where you're in control of your data, and we're gonna do that with your own social graphs on this new platform. You're gonna be in control of your own social graph. It's not something that big tech's going to take from you, or big tech can no longer leverage you and make money off of you. You spend your lives building your friend groups, building businesses that you have, trying to make a living, and that's something you should own. So I would love to be a part of a situation where my children can go online to really have experiences with friends, not feel bad about themselves, where people can control their data and information, not have the privacy of their lives infiltrated by big tech, and to really put the control back into the people, more of a democratic kind of environment in all respects. Probably the biggest thing is [to] get tech out of the way so people can begin to connect and interact again. In the past, we used to be able to have honest and open discussions about anything, and today we can't. Today if you're on one side of the fence or on the other side of the fence, there are arguments, there are fights, there's violence, there's craziness. And people should have their say in all cases, but let's just go back to a world where it's more rooted in common decency. If we're fighting among ourselves, then we're not accomplishing anything. -Jeffrey Edell

================

I think we need to increase trust and safety on every platform. I think we need internet literacies to be spread amongst the public and to feel a little less distant from the magic of the Internet. We need to stop thinking about it as something that's distinctly separate from us and understand that the internet and real life are intertwined in a way that could be inseparable. So trust and safety departments have to start integrating people who understand that, and see that so we could start having social networks appreciate the way that trust and safety operates. So internet literacy, transparency, and [an] increase in trust and safety. -Jamie Cohen
