Type 1: Diffusion of responsibility
Societal-scale harm can arise from AI built by a diffuse collection of creators, where no one is uniquely accountable for the technology's creation or use, as in a classic tragedy of the commons.
ENTITY
2 - AI
INTENT
2 - Unintentional
TIMING
3 - Other
Risk ID
mit01
Domain lineage
6. Socioeconomic and Environmental
6.5 > Governance failure
Mitigation strategy
1. Establish a comprehensive, legally enforceable governance framework that mandates clear, non-diffuse accountability across the entire AI system lifecycle, from research to deployment. The framework must explicitly assign the roles responsible for monitoring, risk mitigation, and redress in the event of societal harm.
2. Mandate full architectural transparency and data provenance across the AI supply chain so that harmful outcomes can be rigorously traced. This includes detailed documentation of data sources, algorithmic logic, and contributing agents, which is essential for ex post facto attribution of causality and responsibility.
3. Implement focused, strategic accountability and incentive mechanisms within development and oversight teams. Design organizational structures with clearly delineated individual roles and goals, complemented by group incentives that foster collective risk monitoring and leverage social pressure against the diffusion of safety and ethical responsibilities.
ADDITIONAL EVIDENCE
Automated processes can cause societal harm even when no one in particular is primarily responsible for their creation or deployment (Zwetsloot and Dafoe, 2019), and perhaps even because of that absence of responsibility. The infamous “flash crash” of 2010 is an instance of this: stock-trading algorithms from many different companies interacted in a way that erased over $1 trillion of US stock market value within minutes. Fortunately, humans were able to intervene afterward and reverse the damage, but that may not always be possible as AI technology becomes more powerful and pervasive.
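The cascade dynamic described above can be sketched in a few lines of code. This is a toy model, not a reconstruction of the actual 2010 event: it assumes a hypothetical population of independent stop-loss agents, each individually harmless, whose sales jointly depress the price and trigger further sales. All parameters (agent count, thresholds, market impact) are invented for illustration.

```python
# Toy cascade model: no single agent is responsible for the crash,
# yet the interaction of all agents produces one.

def simulate_cascade(n_agents=100, start_price=100.0,
                     impact_per_sale=0.5, shock=1.0):
    """Each agent sells once when the price falls below its personal
    stop-loss threshold; every sale depresses the price further."""
    # Thresholds spread just below the starting price (illustrative).
    thresholds = [start_price - 0.1 * i for i in range(1, n_agents + 1)]
    price = start_price - shock  # a small external shock starts things off
    active = set(range(n_agents))
    sales = 0
    while True:
        triggered = [i for i in active if price < thresholds[i]]
        if not triggered:
            break  # no more agents are forced to sell; cascade ends
        active.difference_update(triggered)
        sales += len(triggered)
        price -= impact_per_sale * len(triggered)  # aggregate market impact
    return price, sales

if __name__ == "__main__":
    final_price, total_sales = simulate_cascade()
    print(f"final price: {final_price:.1f} after {total_sales} forced sales")
```

With these default parameters a one-point shock cascades until every agent has sold, roughly halving the price, while a zero shock (`simulate_cascade(shock=0)`) triggers no sales at all: the harm emerges from the interaction, not from any single algorithm's design.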