Fireside chat with Yaël Eisenstat

Interviewed by cyber law & policy expert Betania Allo

Yaël Eisenstat being interviewed by Betania Allo at the Responsible Tech Summit: Improving Digital Spaces held at the Consulate General of Canada in New York on May 20, 2022.

Fireside chat from our Responsible Tech Summit: Improving Digital Spaces held at the Consulate General of Canada in New York on May 20th.

Key Takeaways

  • As a society, we have a responsibility to address the social problems, inequalities, and complex conflicts that existed prior to digital spaces - those are not products of social media companies. However, this does not absolve companies that employ a business model intentionally designed to sustain these oppressive systems and systemic issues.

  • There needs to be a shift in technology development from “scale now, fix later” to giving builders the space to create tech in a better way, ensuring the product isn’t going to hurt people, without the pressure to scale immediately.

  • There is hope in new technologists who take an interdisciplinary approach to ensure technology serves us better.

Quotes

“It’s not about whose fault it is. It’s about every single piece of the puzzle that contributed to what happened.” - Yaël Eisenstat

“These companies were birthed here [United States] and these companies scaled to the degree that they scaled and exported their ideology to the rest of the world because of our legal permissive environment here and it’s on us to fix this.” - Yaël Eisenstat

“The reason we need transparency of some sort into how these systems work is so that then we can build the accountability mechanism, if in fact, they did actually push someone into these radicalized groups.” - Yaël Eisenstat


Q&A | Transcript

Betania Allo: How would you define the current status of the use of technology in America, specifically?

Yaël Eisenstat: You know, I think technology is a really big word. Actually, I think Greg [Greg Epstein] mentioned this earlier: what is tech vs. what is technology? So I’m going to narrow it a little bit to social media, not because I think social media is the end-all, be-all, only thing that has ever gone wrong in the world (I always have to say that because it’s crazy how quickly people will lump you into a bucket you never actually said you were in), but it is what I have worked on the most, both intimately within Facebook and before I went there. And I do think it is the biggest catalyst for some of the issues that we see right now. You said in the US in particular, so it is interesting: I spent my whole career focused overseas, and then I did a dramatic turn to focusing on the US around 2015, specifically because I had spent my whole career on counter-extremism issues, right?

Betania Allo: It would be great if you can maybe pose like a contrast, how you see it abroad given your experience.

Yaël Eisenstat: But it was very clear by 2015 (for some people, earlier; I was in no way the first person to see this) that there was something happening in the US. There had always been anger; there had always been hatred; there had always been polarization. All these things have always existed, but something was accelerating them in a way that really did feel new, to the point where it wasn’t just that we had more and more divisions being exposed; it was that those people who really were sort of here, in terms of how they felt about any given issue, were being further and further pulled into the extremes.

And so I’ve been digging into this for years, including a short stint at Facebook. I laugh because anyone who knows me knows that I left pretty abruptly and have been public about why, but the bottom line is this: do I blame social media for the problems that we as a society have yet to truly grapple with or yet to truly address? Absolutely not. That does not absolve particular companies of intentional design and business decisions that they have made to intentionally ensure that…I’m not going to give the attention economy speech right now, but it is important to really remember this: many people would have you shift the discussion to talk about content moderation. Why? Because content moderation is an unsolvable problem in the United States. You know what is solvable? A business model that intentionally ends up surfacing the most extreme voices and often silencing those people who actually want to have truly engaged, nuanced conversations.

And that’s what I care about. I care about the intentional decisions, not about, “should this piece of content come down? Should that piece of content be up?” And I don’t think we’ve gotten anywhere on that yet in the US. And it’s really frustrating because I think many of us don’t realize that Facebook in particular wants us to talk about content moderation. They do not want us to talk about their business model and so they make sure that we continue to talk about content moderation.

Betania Allo: This is her opinion, okay?

Yaël Eisenstat: That is my opinion.

Betania Allo: You were talking about, of course, content moderation. What are the challenges that you see beyond Facebook, in other tech companies? And let’s expand this to other countries: how do you see these challenges being overcome through the years?

Yaël Eisenstat: Yeah, so I want to be clear on the reasons why I use Facebook most often: A) because I worked there, so I actually have intimate knowledge, but also B) because I do still find that, to this day, they are the most egregious. So it’s interesting, we like to talk about how they’re starting to lose their relevance in the US and maybe we should be thinking more about TikTok, or maybe we should be thinking more about Discord or (name your platform). Yes, I think we should be thinking about all of them, but let's not forget that Facebook recklessly, relentlessly, and intentionally scaled to dominate the entire world’s information ecosystem, minus maybe China, minus maybe Russia now. It was an intentional decision. They went into countries that did not have a robust media landscape to begin with. They struck deals with national governments and telecoms to become the app on people’s phones that became the gateway into the internet.

I bring this up for a reason: we are trying to move on in the US and say, “oh, we should talk about…they’re not that powerful anymore.” Yes, they are. You go to the Philippines; you go to India. I went to India on a Facebook research trip when I worked there. Every single person we spoke to, when we asked, “how do you get onto the internet?”, said through Facebook. So I just want to be really clear that we often have a very US-centric lens when we’re talking about this, and rightfully so.

These companies were birthed here and these companies scaled to the degree that they scaled and exported their ideology to the rest of the world because of our legal permissive environment here and it’s on us to fix this. It really bothers me that we’re counting on Europe and Australia and Canada to fix something that we helped create. That said, I will help Europe and Australia and Canada since I actually have more hope that they will get there before we will, but shame on us. I just want to be very clear about that. But we have very much exported Mark Zuckerberg’s ideology to the entire world.

And he did it intentionally and has never invested in building up the protective measures for the rest of the world. And so when people say, “why do you still talk about Facebook? Nobody under 70 uses Facebook”…I’m under 70, but whatever. Really? Have you stepped outside of the United States? And Instagram is Facebook too; and WhatsApp is Facebook too. So anyway, that was very lengthy, but my point is just this: if your goal is to dominate the entire world’s public square, if that’s what you want to call it, why on earth do you think you don’t have the responsibility to be a good steward of that? And that is why I always talk about the responsibility side of it, not the “did they make the right decision on any given day?” Because, as we heard the last panel say, I actually do give space for mistakes. Mistakes happen. It’s how you choose as a leader to deal with that mistake. That tells me what I need to know.

Betania Allo: I want to take you back to your counterterrorism experience. What comes to your mind when you hear news like the ones that we received from Buffalo [New York] last Saturday [May 14, 2022]?

Yaël Eisenstat: So, what happened in Buffalo is a product of so many different things. And what’s interesting in the conversation I’m hearing now is there’s already people immediately saying “well it’s not Facebook’s fault that this guy was a racist.” It’s not about whose fault it is. It’s about every single piece of the puzzle that contributed to what happened. Here’s the biggest problem: we have systemic issues in the United States of America that we have never come to terms with and that is not Facebook’s fault; that is not Discord’s fault; that is not Twitter’s fault.

However, in my counter-extremism days, I spent a few years in particular leading what some call our hearts-and-minds work along the Somalia border. If you’ve heard my story before, you’ve already heard this, but I’m going to repeat it now. The number one thing I learned during those few years, and this was pre-Facebook, like 2004-2006, is that I was spending time in vulnerable communities that were specifically susceptible to extremist messaging, and all I was doing was trying to build trust. And most of that came from me listening, not speaking, not arguing, just listening. Understanding and building actual trust. That sounds so old school, I get it. But the other thing I learned is what made people vulnerable to extremist messaging. There were certain traits, and it wasn’t whether you’re rich or poor. It was usually that you feel disaffected, powerless, marginalized. You don’t believe your government is there to actually help you, protect you, or take care of you, and this outsider has come in and started exploiting your vulnerabilities to make you feel like you belong, like you’re part of something, and then starts to radicalize you. And that was the core of my many years in the counter-extremism world.

So what do we see online now? What did we see with the guy from Buffalo? What did we see with the woman who was shot and killed when she entered the Capitol [United States] on January 6th? I did a deep dive into her Twitter feed. We saw a US veteran who came back after a few tours overseas, who was disaffected, who was marginalized, who was having a hard time coping with how to readjust to civilian life. And then you watch her social media feed: she starts getting fed more and more extreme content. And here’s why I bring all of this up, and I’ll wrap up…I know I’m going on for a very long time here. In the days pre-social media, radicalizing an individual was a process. You had to understand their vulnerabilities, and then you had to feed into that and make them trust you, and then go down this path of radicalization.

Today, we have recommendation engines that have figured out what makes you tick. And don’t just take it from me. Read the research that came out of these so-called Facebook Papers; read the article on Carol’s journey to QAnon. It proves exactly what I’ve been saying for years…look it up. It was a fake account that a Facebook researcher made who went online. She liked like three things; she was a fake midwestern woman in her 40s, a mom. She followed one or two politicians and one or two outlets. Within two days, she was being recommended conspiracy theories, and within, I think, four days, she was actually being recommended into QAnon groups. Why does that matter? Because Facebook would have you believe they’re a mirror to society, and that they’re just showing you what you are looking for. And I would say that as long as we have never had actual transparency into how the recommendation systems work, we will never know if any of these people actually went looking for QAnon or these hate groups, or whether they were pushed into them. And that’s why, too, I think it was probably Renée’s [Renée Cummings] panel that talked about transparency for transparency’s sake. One hundred percent, I want to double click on that.

The reason we need transparency of some sort into how these systems work is so that then we can build the accountability mechanism, if in fact, they did actually push someone into these radicalized groups. So the guy from Buffalo, I mean, his manifesto is one thing, but now they’re starting to release some of his Discord chats from while he was there. He had a toothache; apparently that was his number one grievance. His Jewish doctor didn’t fix it, so then he started blaming Jews for it, and then all the things. And you can say that’s not social media’s fault. I agree - that’s many problems.

Betania Allo: Yeah, and also double clicking on the accountability issues: due to the lack of designations of these groups, I would say, and also the existing narrow focus of the hate crime legislation under which these perpetrators are often tried, it’s quite challenging for states in general to effectively prosecute these people who are so clearly driven by xenophobia, racism, and other forms of intolerance. And I would like to take a step back. I’m curious if we can zoom in on what you mentioned about how people were radicalized prior to social media, and maybe what you think are some of the dangers that have developed over the years?

Yaël Eisenstat: This one, I’ll try to make a little shorter. I know I can be very long-winded. You know, when I was in my more counter-extremism days in the early to mid-2000s, we would talk about the lone wolf. That was the issue we were dealing with, right? Believe it or not, I joined government before September 11th, so I went through this whole process and path of different kinds of work in this space, but then we started having what we called the lone wolf. Take the Fort Hood [Texas] shooter [November 5, 2009]...this might be before many of your time, but this man was apparently radicalized by Anwar al-Awlaki, an American-born cleric in Yemen.

And what happened here, yes, the internet was definitely a factor, because he started watching al-Awlaki’s sermons, but then they started emailing each other. So why did we call him a lone wolf? Because a lone wolf is an individual who was radicalized and acts on his own. We struggled with that in government, in part because we only want to believe that terrorists are foreigners. We don’t want to believe that terrorists can be Americans; that is a huge other issue in and of itself. But here’s what’s interesting about what’s happening now: online, that same one-to-one process of radicalizing someone is not even necessary. We can’t call these lone wolves anymore. These are people who find their crew online, who find their fellow white supremacists, and they all start to go down a further and further path of grievance together.

You don’t need that one terrorist somewhere off in the world to really hands-on recruit you anymore. So lone wolves in and of themselves were very challenging to tackle, and now we have a situation where it’s entire groups. And let’s not sugarcoat it (we like to be politically correct and sugarcoat it): groups of white men with this sort of victimhood grievance going on, and the internet is really helping them find all the reasons to believe that their grievances are legitimate. And yes, I believe that Facebook actually does not want this on their platform. I will be clear about that. I don’t blame them for everything in the world, and I believe they’re trying to tackle some of that, but they will not touch their business model; they will not touch the piece of the puzzle that is actually helping identify your vulnerabilities and sucking you into the platform to keep you engaged. And that’s what bothers me, and there’s a few things that bother me.

Betania Allo: Feel free to rant. We were focusing a lot on big platforms vis-a-vis Facebook. What’s your take on smaller platforms? Because the migration that happens from bigger platforms to smaller ones is undeniable, and those smaller ones may not have the resources to change their business model or to moderate content more effectively. So what is your take on those?

Yaël Eisenstat: I actually think it’s often the other way. It starts on the smaller platforms and then migrates to the bigger platforms. But this is why…I know that people think we can’t regulate our way out of this, but that doesn’t make it okay to not have any government-imposed guardrails on all the systemic parts of platform design, whether it’s surveillance capitalism or whether it’s surveillance advertising. Let’s be clear: here’s why I have a hard time…I know what’s coming next, “how do we fix this?” I’m jumping ahead, but still on this question: I get asked this all the time, and everyone in this room is the type of person who is trying to figure out how to fix this, right? But the one thing we’re never allowed to talk about is unfettered capitalism and how that plays into all of this, and asking me to figure out how to fix this in an environment where…no, but it’s really important. Normally, there are certain levers for accountability, right? This is what our free market economy is about, five general levers: 1) government - that is absent in this space; 2) shareholders - that doesn’t apply to a company like Facebook because, with Mark Zuckerberg’s dual-class structure, shareholders can’t hold him accountable; 3) the markets - I’m sorry, but Wall Street has absolutely rallied behind Facebook after a $5 billion FTC fine. That leaves two groups: 4) employees, who hopefully, more and more, are going to stand up and demand more or quit; and, for these companies, I would say 5) advertisers, as opposed to consumers. Those are the only two groups, in my opinion, who right now can hold any sort of flame to the fire.
As long as Wall Street, venture capitalists, all of the money behind it, continue not to care, then asking someone like me or anyone in this room, “how can we fix it?”…I’m amazed by so much of what so many people in this community are working on, but it’s an uphill battle when the power structures still remain such that one Silicon Valley white guy, possibly two if Elon Musk succeeds, continues to hold the power over the entire way the world connects, with zero accountability baked in. Okay, I’m not always a downer. Ask me a positive question.

Betania Allo: I’m going for that, exactly. Right now, I want to focus on the bright side, if we can. What kind of policies in the tech industry can you mention that actually worked?

Yaël Eisenstat: Policies or you mean like self-regulation?

Betania Allo: Incentive structures.

Yaël Eisenstat: Okay, so incentive structures is the perfect way to frame it, actually, because it is all about incentive structures, right? I am very happy to see how many of, sort of, the next generation of technologists are asking much deeper, more critical questions. I am really hoping we get to the point where we stop holding up the tech founder as a god who cannot be questioned because his vision is more important than his execution. I am encouraged by it as long as it is not window dressing and it becomes real: by companies understanding what it means to have the right people at the table to discuss how you should be developing your product.

I am so encouraged by a lot of that. I’m not totally encouraged by the venture community and who they continue to back; I’m hoping that we can make some dents there. I find it a bit unfortunate, though, that a lot of folks in this community will call me and ask, “hey, can you help us think through”...again, I’m going to keep quoting Renée, the unintended consequences, or potential unintended consequences. And it’s only unintended if you don’t feel like discussing it before you make a product. But in their defense, it’s always, “we haven’t raised enough to be able to pay you to help us with that, because the people who have invested in us are expecting these returns first.” So the move fast thing still exists: move fast and scale and deal with the cleanup later. We really, really need to fix that, because for all the really amazing things that many of you are trying to build, we have to have the space to build in a better way, without the pressure to scale before we are even sure that what we are trying to build is not going to hurt people.

Betania Allo: That’s great. I think we are almost on time, so I would like to end on a positive note, if it’s possible. Given all the colleagues that are joining us here and the people watching online, I think, without a doubt, forums like this and building a responsible tech community are a great step forward, and effective tech policy also needs a whole-of-society approach. I was wondering if you can end this conversation by telling us what else should be done, what else we can actually do to create a healthier internet. You touched a little bit upon which stakeholders you would sit at the table, but who else should be involved, and how?

Yaël Eisenstat: I love how much she keeps trying to get me to be positive, and then I keep going. Yes, I am holding out a lot of hope. I don’t mean to say I’ve given up on my generation, sorry folks, but I am holding out hope for this newer crop of technologists who will be steeped in a more interdisciplinary approach. Because it’s funny, right? I’m not a technologist, and yet I saw things years before people started exposing them at Facebook. It was just kind of obvious, and back then the response to my voice was, “who are you to talk about what’s happening in technology?” Today, I think we understand that there are sociologists and anthropologists and risk people and, you know, humanists, chaplains apparently, and all sorts of people who have a stake in ensuring that technology serves us all better. So hopefully we’ll continue to use their voices and they will seek roles within tech companies…that’s another thing, people think I’m anti-tech. No, I want tech to do better. And I want to make sure that people who come from a wide variety of backgrounds help ensure that happens.

Betania Allo: Great, excellent, perfect way to end it. Thank you so much! It’s been such a pleasure, thank you.

Speaker Bios

Yaël Eisenstat is the founder of Kilele Global, a strategic advisory firm that helps companies, governments, and investors align technology to better serve the public. In 2018, she was Facebook’s Global Head of Elections Integrity Operations for political ads. Previously, she spent 18 years working around the globe as a CIA officer, a national security advisor to Vice President Biden, a diplomat, and the head of a global political risk firm. Eisenstat’s commentary has been featured in the New York Times, BBC World News, CNN, CBS News, ABC News, and many more.

Betania Allo is a cyber law and policy expert, and former Programme Management Specialist at the United Nations Security Council Counter-Terrorism Committee Executive Directorate. She graduated as a lawyer in Argentina, and holds a Master of Laws (LL.M.) with an academic focus on cybersecurity and counterterrorism law and policy from Syracuse University. In addition, she holds a Master's in International Relations with a Graduate Certificate in International Security from Harvard University, and a postgraduate degree in Cybersecurity and Compliance from Universitat Internacional de Catalunya. She’s a director at the International Counter-Terrorism Youth Network (ICTYN), the South America Coordinator at NextGen5.0, and an editor at the International Counterterrorism Review (ICTR). Betania lectures at several universities worldwide on cybersecurity and counter-terrorism. Moreover, she is regularly a keynote speaker at conferences about terrorism, emerging technologies, innovation, and feminism. Find out more at www.betaniaallo.com

