A Spotlight on Tech & Democracy in the Global Majority

By Faisal Lalani

The heart of the responsible technology movement has always been well-intentioned. Today, the rallying cry to establish an inclusive and safe digital information realm can be heard around every corner, especially as the influence of emerging technologies penetrates democratic processes at an unprecedented scale.

But the impact of this cataclysm is not equal; technology is and will always be the great amplifier of the status quo, which means that communities that propagate violence and hatred offline will only find new ways to do so online. This logic, while simple, is often obscured by a bias: an instinct to isolate the effects of technology to the West. This is understandable; the present-day challenges in the West itself are all-consuming. But when the only world we think about is our own, the solutions we build become just as exclusionary as the problems we hope to vanquish.

The frameworks and policies being crafted for AI are no exception. Calls for global governance homogenize an immensely vast and diverse “Global South,” new technical solutions assume universal digital competency, and countries at risk of violating freedom of expression online are treated as outliers rather than the norm.

Not only must the discourse shift to acknowledge these disparities, but the voices of activists, journalists, lawyers, and civil society in these geographies must be placed front and center. Digital ministers in Ghana and Rwanda have already been working diligently to implement strategies for AI literacy and capacity building. Sri Lanka has a robust civil society coalition working together on codes of practice for online safety. If the efforts being made in communities outside the normative scope of responsible AI discourse were incorporated into our traditional “multi-stakeholder” models of knowledge production, true multilateral AI governance might actually become a reality.

Thuley, a new global foundation that challenges orthodox thinking around social change, recently launched a series of Global Majority spotlights in conjunction with All Tech is Human and Tech Global Institute. The aim of these spotlights is to create space for key players in tech policy in regions like South Asia to informally discuss the threats to democracy their countries are facing.

The first of these conversations was hosted and facilitated by Sabhanaz Diya, founder of the Tech Global Institute. She was joined by Shmyla Khan, a researcher and campaigner based in Pakistan, and Saritha Irugalbandara, head of advocacy at Hashtag Generation in Sri Lanka. With more than 2.5 billion voters in low-to-middle-income countries participating in elections this year, Diya, Khan, and Irugalbandara focused on the pivotal risks within their respective purviews:

First, a history of political and economic instability has eroded trust, which both undermines efforts to promote online safety and fuels desperate attempts to consolidate power through oppressive legislation. This is reinforced by political systems and courts that exist within a manipulated state machinery, meaning that the adversaries in tech battles are not just private tech companies or foreign actors, but the state itself. Coordinated disinformation campaigns that reinforce ethno-nationalism and prevent voters from meaningfully participating in political processes further deepen this distrust.

Second, proposed legislation tends to pathologize online behavior and categorize it in a vacuum, independent of offline systemic failings. This results in purely reactive measures that punish rather than create any sort of sustainable, systemic change. Due diligence by tech companies only goes so far when most of their moderation tools are tailored to an English-speaking, Eurocentric consumer base. Furthermore, legislation in these regions is often more than just the law and demands nuance: who holds power before and after its implementation? Who gets to participate in defining terms within the space, and what does the history of similar legislation tell us about the priorities of the implementer? Is the content of this law just an expression of an autocrat’s desire for more control?

Third, attempts at multi-stakeholder consultation often fall flat and appear performative because they are mediated by bureaucratic intergovernmental structures far removed from the lived experiences of Global Majority communities. Guidelines for this type of collaboration, defined by agencies like UNESCO, lack depth and mean little when people at the top of the hierarchy already have set agendas. Attempts to adhere to these guidelines also produce tokenistic representation and superficial recommendations that prevent any meaningful change. When Global Majority actors speak out, they are treated as obstacles or exceptions to a pre-determined consensus on how to move forward.

While each of these risks conveys a clear and present danger to upcoming elections, together they point to a far more important lesson for any individual or group hoping to save the future of the digital world: technology exists within ecosystems, each with its own history of unity and discrimination, truths and falsehoods, pursuits of hope and desires for control. When we begin to perceive technology as an isolated evil or an anthropomorphic savior, we ignore the fact that disinformation is not just happening now, in this election, but has always been propagated by institutions, even and especially through non-technical means. We overlook the individuals actively upholding corrupt systems of power and oppression. In short, we seek the easy way, the path that paints the world as less complex and easier to understand.

Instead, we must pave a different path, one that Diya, Irugalbandara, and Khan offer substantive insight into: create spaces for Global Majority tech and democracy advocates to communicate among themselves, independent of colonial proxies. If Western institutions want to offer support, they should listen and learn rather than preach and prescribe. Consultation should be more than a buzzword - examine your reports and outputs and question where each recommendation came from. Reconsider the temptation to create large-scale global models over smaller, locally sourced and facilitated frameworks. Global governance may sound exciting and easy to sell, but each community is its own world - a one-size-fits-all solution will only obfuscate grassroots nuance.

Initiatives embodying these principles are already underway. The Tech Global Institute, for instance, recently hosted several Tech Policy Circles in countries like South Africa, Bangladesh, and Sri Lanka, in which local advocates brainstormed transparency and accountability frameworks. In another example, Thuley is working on building partnerships between tech and democracy leaders in Global Majority countries. We encourage you to support these efforts by insisting that the voices at your roundtables and panels come from communities in the Global Majority. This is not a favor: your own strategies and solutions will only benefit from understanding the impact of emerging technologies outside your typical purview. For it is in these contexts that social change is truly tested.
