Responsible AI

The capabilities of artificial intelligence (AI) are developing rapidly and in multiple directions. Recent breakthroughs in Generative AI are the most visible, but AI technologies are shaping decision making and automation across a wide range of fields, with implications for industry, government, and civil society.

This rapid pace of evolution and deployment is resurfacing important and complex questions about AI ethics. A diverse community of responsible technology advocates has long worked to provide practical thought leadership to guide AI development and deployment, with a focus on equity, societal benefit, harm reduction, and environmental sustainability.

The technical and social complexity of AI systems requires many voices to explore what AI can do, what it should do, and what it could do in the future. The responsible tech ecosystem is a venue where these issues are examined, value propositions are defined, trade-offs are explored, and guardrails are proposed.

For the Responsible AI Knowledge Hub, the All Tech is Human team has curated relevant resources from our publications and from our community. This is a “living document” and will continue to grow with your input.


COMMUNITY | EVENT

Last year’s MozFest focused on Trustworthy AI, “harnessing our collective power to better our digital landscape, build transformative systems, and sustain momentum within our community towards positive human and digital rights progress… As we investigate Trustworthy AI and the ways in which we can move from opaque, closed systems towards open, transparent ones, we center the people: builders, users, whistle blowers, and all those affected. MozFest is the gathering place for human intelligence that weaves in ancestral, ecological, technical and spiritual knowledge, to move the needle in shaping Artificial Intelligence.”

Help strengthen the Responsible Tech movement and elevate new and diverse voices.