Yaël Eisenstat to speak at upcoming Responsible Tech Summit in NYC on May 20th

All Tech Is Human is thrilled to announce that renowned democracy advocate Yaël Eisenstat will speak at its upcoming Responsible Tech Summit, to be held on Friday, May 20th in NYC. This gathering of 120 leaders focused on improving digital spaces will take place at the Consulate General of Canada in New York, our event partner. Yaël Eisenstat is the founder of Kilele Global, a strategic advisory firm that helps companies, governments, and investors align technology to better serve the public. In 2018, she was Facebook’s Global Head of Elections Integrity Operations for political ads. Previously, she spent 18 years working around the globe as a CIA officer, a national security advisor to Vice President Biden, a diplomat, and the head of a global political risk firm. Eisenstat’s commentary has been featured in the New York Times, BBC World News, CNN, CBS News, ABC News, and many more.

This cannot just be left to technologists to solve and lawyers to debate. Social media has a profound impact on all of society, and we are all stakeholders in the ultimate solutions.
— Yaël Eisenstat, Democracy Activist & Founder of Kilele Global

The Responsible Tech Summit on May 20th seeks to unite a diverse range of stakeholders to build on each other’s work and co-create a tech future aligned with the public interest. The all-day event (9am to 4:30pm) will feature panels, fireside chats, and plenty of networking and collaboration.

This gathering will be in person, while the talks on stage will be livestreamed for a global audience. To find out more about the upcoming Responsible Tech Summit, read here.

All Tech Is Human specializes in bringing together a diverse range of stakeholders to tackle thorny tech & society issues. Previous summits, livestreams, and reports from our organization have featured individuals from Aspen Institute, Berkman Klein Center, World Economic Forum, Data & Society, Mozilla, IEEE, DataKind, Center for Humane Technology, IBM, Salesforce, New_Public, Deloitte, Accenture, the New York Times, Avanade, Facebook, Microsoft, Twitter, TikTok, Discord, Sesame Workshop, Consumer Reports, Google, the FCC, Hulu, Roblox, Partnership on AI, Web Foundation, Omidyar Network, Tony Blair Institute, and many more. All Tech Is Human held a virtual Responsible Tech Summit on September 15, 2020 that drew over 1,200 registered attendees across 60 countries. Pre-Covid, the organization held summits in NYC, San Francisco, and Seattle. Its inaugural summit was held in NYC in the Fall of 2018.

Two of our recent reports dealt specifically with improving digital spaces. The most recent is the HX Report: Aligning Our Tech Future With Our Human Experience. All Tech Is Human is a member of the HX Project, alongside organizations such as Aspen Institute, Data & Society, Project Zero, and Headstream, which promotes an “approach to talking about, engaging with, and designing technology in a way that is aligned with our needs as humans — not users.” In our HX Report we took a holistic approach to improving digital spaces, looking at product design, business models, content moderation, digital citizenship, tech augmentation, and tech & wellbeing.

Prior to the HX Report, our organization released Improving Social Media: The People, Organizations, and Ideas for a Better Tech Future. Together, these two reports featured resources from over 150 organizations doing valuable work in the ecosystem and included profile interviews with over 80 leaders focused on improving digital spaces.

Yaël Eisenstat was profiled in our Improving Social Media report, released in February 2021. The interview is below.


Tell us about your career path and how it led you to your work’s focus.

After spending 18 years in the national security and global affairs world – from countering extremism abroad to being a national security advisor at the White House – I began to view the breakdown of civil discourse here at home as the biggest threat to democracy. I became increasingly concerned with how the Internet was contributing to political polarization, hate and division. I set out both to publicly sound alarm bells and to see what role I could play in helping reverse this course.

This led me to Facebook, where I was hired to head the company’s new Global Elections Integrity Operations team for political advertising. Realizing I was not going to change the company from within, I am now a public advocate for transparency and accountability in tech, particularly where the real-world consequences affect democracy and societies around the world.

In your opinion, what are the biggest issues facing social media?

We are being manipulated by the current information ecosystem, entrenching so many of us so far into absolutism that “compromise” has become a dirty word. That is because, right now, social media companies like Facebook profit from segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking strong emotions to keep us engaged, often incentivizing inflammatory, polarizing voices, to the point where finding common ground feels impossible. Unless they are willing to reconsider how the entire machine is designed and monetized, no amount of “whack-a-mole” content moderation will fix the divisive nature of the biggest social media platforms today. They will never truly address how the platform is contributing to hate, division and radicalization. But that would require fundamentally accepting that the thing you built might not be the best thing for society and agreeing to alter the entire product and business model.

In your opinion, what area do you think needs the most improvement?

Every one of these is part of the larger puzzle. There is no one magical solution – we need a whole-of-society approach. I focus on the government's role in defining responsibility and accountability for the externalities and threats to society caused by current social media business models. This goes hand-in-hand with civic education, media literacy, public awareness and healthier media in general.

What people and organizations do you feel are doing a good job toward improving social media? Why/how would you say their work is helping?

Civil Rights leaders, academics, journalists, advertisers, legislators, employees and activists all play a critical role in this movement. Many organizations help educate the public, raise awareness and push the government to step up and address these issues. Every one of these voices is important. This cannot just be left to technologists to solve and lawyers to debate. Social media has a profound impact on all of society, and we are all stakeholders in the ultimate solutions.

What do you see as the risk of doing nothing to address the shortcomings of social media?

We have already seen the risks that I (and so many others) have been trying to highlight for years play out: People are using social media tools, exactly as they were designed, to sow division, hatred and distrust. We saw where that can lead when followers of conspiracy theories tried to launch an insurrection at the U.S. Capitol.

Part of our mission at All Tech Is Human is to diversify the people working in the tech industry. In your opinion, what academic or experience backgrounds should be more involved in improving social media?

This cannot be left to just technologists to fix. In addition to the need for racial, socio-economic, religious and geographic diversity, this will require true diversity of thought, experience and background. If we desire to create a healthier, more equitable information ecosystem, the people who are most affected by the negative side of social media must be incorporated into the decision-making processes moving forward.

Will we ever be able to solve the conundrum around whether the platforms are policing too much or too little (de-platforming heads of state vs. not protecting the vulnerable enough)? Can governments solve this conundrum?

I do not believe the government should be regulating what speech is ok and what speech should be taken down, except where it breaks the law. But I do think government should figure out how to regulate the tools the platforms use (and sell to advertisers) for curating, recommending, amplifying and targeting. And that comes down to the fact that there is no transparency into how those tools work. By insisting on real transparency around what these recommendation engines are doing, how the curation, amplification, and targeting are happening, we could separate the idea that Facebook shouldn’t be responsible for what a user posts from their responsibility for how their own tools treat that content. I want us to hold the companies accountable not for the fact that someone posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms steer people towards it, and how their tools are used to target people with it.


