Tech and Democracy Profile: Nathaniel Lubin

All Tech Is Human’s Tech & Democracy report addresses key issues and best practices in the field, and highlights a diverse range of individuals working across civil society, government, industry, academia, and entrepreneurship. Like all of our reports, it was assembled by a large global working group spanning multiple disciplines, backgrounds, and perspectives.

As part of the Tech & Democracy report, our team interviewed more than 40 people working to create a brighter tech future. This week, we’ll be highlighting select interviews.

Today, we hear from Nathaniel Lubin, RSM Fellow at the Berkman Klein Center and Founder of The Better Internet Initiative. To read more profile interviews, click below to download the Tech & Democracy report now.

Q: Tell us about your role and what it entails.

I'm currently a Fellow at the Berkman Klein Center where I'm working on a framework for incorporating public health metrics into product evaluation decisions. This work stems from experiences in several of my recent projects, specifically non-profit work with Fellow Americans, the Better Internet Initiative, and Survey 160.

Q: How did you build your career in the tech & democracy field? What advice would you give others looking for a similar career?

I started my career working in digital and technology teams for Barack Obama. I began as a volunteer and ended up running the digital strategy office in the White House. If you want to work in those roles, the best way to get started is to find a leader you like and find a way to work for them early. Since then, I have transitioned to work with technology and media companies, and philanthropy.

Q: How is democracy defined in your line of work? How does it influence your approach to technology and democracy?

As someone who used to work for the Federal Government, my orientation is around functioning institutions that operate in the interest of the public. A functioning democracy requires not just participation from the public, but meaningful feedback between the interests of the people and the actions of leaders. Technology and media tools that obfuscate those relationships, or that make it harder for straightforward incentives to direct decision-making, result in the undermining of democratic systems.

Q: What do you think are the key issues at the intersection of technology and democracy?

As long as attention-based business models remain central, we need meaningful constraints that protect the public from structural harms. That is distinct from content moderation challenges: we need to differentiate harmful effects that happen to individuals from harmful effects on populations. Systemic reductions in interpersonal trust, for example, produce real long-term challenges for democratic practices. I believe that appropriate limits will be placed on abuse if we have strong, demonstrable evidence, but at the moment we do not have good enough methods for understanding these effects.

At the same time, the current incentive structures of large platforms favor enabling the small minority of users who are the loudest, since those tend to be the ones who drive the most engagement (and revenue). Because those loudest users also tend to be the most abusive and the most likely to spread hate and misinformation, we need to reorganize incentives so that large platforms have a stronger interest in limiting the distribution of, and exposure to, the most objectionable content.

Q: What are the key challenges for democracy that technology can exacerbate?

The best forms of democratic institutions often are slow, thoughtful, and deliberate. Social media tends to prioritize the opposite. When the most strident and divisive messages are the most likely to generate broad reach, finding common ground is disincentivized. For example, we see this in practice when the most objectionable candidates tend to be among the most prolific fundraisers using small-dollar donations fueled by social media.

Q: What are the key challenges for democracy that technology can ameliorate?

Meaningful connections can be and are sourced using digital technology tools. We see this in the best versions of communities with moderation, like Wikipedia or well-monitored niche communities, or even some subreddits. The more that tools foster connection and meaningful communication oriented around positive relationships rather than the promotion of outrage, the more they will foster democracy.

There are also direct opportunities to use tools in support of democratic practices, such as organizing tools, event building, and direct feedback in government. We have seen cases where engagement with comments for public rulemakings, such as some of the currently pending rules by the FTC, can be greatly fostered by technology tools.

Q: What actions can government institutions and/or media companies take to rebuild trust with civil society?

This requires a longer answer, but meaningful connections between product development and regulation need to be the end result. To get there, greater access to companies' internal decision-making processes, including more access to data for researchers, would go a long way. Audits tied to duty-of-care principles are interesting approaches now being explored in some other countries.

Q: What are the roles and responsibilities of the key players in the tech & democracy ecosystem like industry, government and/or civil society?

I think the industry's longer-term interests would be well served by engaging more directly with painful choices that might reduce short-term revenue. Those are hard choices to make in the abstract without shifted incentives provided by competition from new entrants and direct engagement by government and civil society. More competition among products would help, as would clearer red lines for which kinds of activities are out of bounds. Academia and civil society are best positioned to advocate for those standards, but the standards must be implementable in product development and/or regulation.

Q: What are the responsibilities of government and/or media companies when social technologies are used to exacerbate social tensions, threaten democracy, misinform, and destabilize society? How can we hold each of these groups accountable?

I'm working on a framework for this right now at Harvard's Berkman Klein Center and would be happy to go into more detail in March/April when it goes public. The short version is that I believe we need to build assessments of social harms into the product development cycles of the largest products/platforms, leveraging the evaluation systems those platforms already employ for growth. I published a piece about this last year in MIT Technology Review, which we are now expanding, taking inspiration from the world of public health.

Q: There has been a lot of discussion around increasing multi-stakeholder collaboration to reduce some of the issues related to tech and democracy. In your opinion, how can we increase multi-stakeholder collaborations?

I'm interested in some of these approaches, such as the Bridging Systems work. At the end of the day, though, I think we need to rely on existing systems of governance more than building brand-new ones, and we need to do that by empowering those systems with better levers to make choices. I don't think there is a silver bullet for making the broader public feel a part of this -- the best way would be for them to feel that institutions are looking out for them.

Q: Looking five years into the future, how would you hope the conditions have changed related to tech and democracy?

I hope that digital spaces feel calmer and slower. The drive toward reducing friction in all digital spaces is not always productive, and I think many people are coming around to that view. I hope the dominant systems increasingly take that to heart, with near-total freedom and very limited restrictions for small communities and conversations, and more measured, limited reach for societal-scale feeds and systems absent protections against real structural harms.

Q: Which people, organizations, or institutions are doing impactful work at the intersection of technology and democracy?

I've been impressed by many of the groups represented in the Council for Responsible Social Media, of which I've been lucky enough to be a member. I've also gotten to see many great academics through Berkman Klein and am excited to see what is coming. And of course, in government, the FTC has really stepped up enforcement in exciting ways.

Q: From your perspective, what does a better tech future look like?

A better tech future looks like one where creativity and meaningful connection are prioritized.
