Tech and Democracy Profile: Mardiya Siba Yahaya

All Tech Is Human’s Tech & Democracy report addresses key issues and best practices in the field, and highlights a diverse range of individuals working across civil society, government, industry, academia, and entrepreneurship. Like all of our reports, it was assembled by a large global working group spanning multiple disciplines, backgrounds, and perspectives.

As part of the Tech & Democracy report, our team interviewed more than 40 people working to create a brighter tech future. This week, we’ll be highlighting select interviews.

Today, we hear from Mardiya Siba Yahaya, Data and Digital Rights Researcher and Policy & Africa Community Lead at Team CommUNITY. To read more profile interviews, click below to download the Tech & Democracy report now.

Q: Tell us about your role.

I refer to myself as a digital sociologist. My role examines the impact of technology on societies and people within the Global South, especially how minoritized communities respond to and experience technologies at the intersection of gender, sexuality, location, race, and ethnicity. Most of my work and interests focus on surveillance, datafied societies, gig communities, and education technology. I am also a community movement builder who collaborates on and facilitates processes and spaces that allow us to build sustainable and meaningful relationships as digital rights and security practitioners.

Q: How did you build a career in the tech and democracy field?

I began my journey as a person who was simply interested in the social impact of technology. I came to it from a very optimistic place, until my very first internship required me to lead a campaign on the effects of online violence in 2016. It became increasingly clear that social media spaces, which many young people were excited to use and be a part of, were inaccessible because of the harms they reproduced and for other exclusionary reasons. So, based on a person’s socio-political and economic positionality, they experienced either an affordance or a disaffordance.

However, my initial interest in surveillance was in gendered and religious surveillance, which I explored during my undergraduate thesis. This gave me a foundational understanding of the subject, which I studied further, tailoring it to research on technology and society through visual media technologies, for a Master’s in Sociology. Collectively, these provided me with subject matter expertise. A lot of my professional experience, however, required me to continuously research and manage technology innovation projects within different African countries. This gave me the opportunity to directly use my subject expertise to inform on-the-ground decisions for research and experience design.

My career in tech and democracy thus far has also been the outcome of the constant support and mentorship of many African women and feminists within the field. In addition to my own effort to build subject and skills expertise, a large community of women have contributed to this career journey by providing me with opportunities to practice and grow simultaneously.

Q: What do you think are the key issues at the intersection of technology and democracy?

The first issue I see is violence. I consider violence as the systemic harmful actions and practices against civil society actors, individuals, and users of technology. Violence allows people in power to shrink and limit meaningful engagement within civic and public spaces, both of which are important to the fully democratic design, deployment, and use of technology. People who constantly experience violence through technology are prevented from enjoying the pleasures of the space and their personhood within said environment. Thus, for technology to be truly democratic, we would have to interrogate the different layers of violence it facilitates, reproduces, and enacts on people’s bodies and lives. 

The second issue is capitalism. I often wonder if we can create technologies that are inclusive and do not further harm marginalized groups or facilitate harm while operating within white supremacist capitalist institutions. When design teams are made up of a socio-economically privileged group, key aspects of harm and exclusion start from the initial phase, forcing us to engage with technologies created with exclusionary core designs. Harm also happens when companies center profit over inclusivity, security, and safety. At different levels, we realize that democracy requires meaningful civic participation where decisions are not swayed by people in power. It also protects the interests of the people ‘at the bottom’ and nurtures communities. Yet, with how violence and capitalism collaborate, technology in its current design and use is very anti-community: it centers the needs and wants of a privileged few and enacts violence on the most vulnerable and marginalized groups.

Q: What are the key challenges for democracy that technology can ameliorate?

An important principle of democracy is meaningful and equitable participation, which has consistently been a significant challenge. Technology, despite its current issues and risks, has provided diverse spaces to boost community building, engagement, and participation. For example, in Zimbabwe, a group of young people designed a tool that allowed people to track election trends, read about candidates within their constituencies, and learn various ways they too could participate in the entire process, while holding the candidates accountable. In a similar way, various communities of digital rights and security workers organize through Slack, Signal, and Mattermost. By providing well-managed, centralized and decentralized spaces for participation, technologies have allowed more people and interest groups to work together, hold power accountable, and learn and build cross-regional communities in a way that was not easily accessible or implementable previously.

Q: What actions can government institutions and/or media companies take to rebuild trust with civil society?

What would a participatory action or community-based approach to engaging civil society actors look like? The lack of trust is the result of years of constant violence against minoritized communities whom civil society groups are either a part of or advocate for. Perhaps government institutions may have to begin by not creating more policies and laws that facilitate harm and threats against civil society. This would also mean institutions not only refrain from threatening civil society groups and their work, but also work to protect them against interpersonal and systemic harm and violence. My suggestion may seem ambitious. However, we should consider it a starting point where no new harmful and deeply exclusionary policies or system designs are created or implemented, allowing us to work on historic biases and oppression as well.

On the other hand, transparency would be another step toward building trust. Much of the current mistrust between government institutions or media companies and civil society stems from the decision-making and algorithmic black boxes that exist. This also intersects with policy and design decisions: when governments and media companies are not transparent, they are able to create and deploy harmful technologies into people’s lives, furthering the lack of trust. Meanwhile, we would also have to consider trust not as something outsourced to a third party, but as a process where the people who use and are affected by the technology are involved in the research and decision-making processes.
