
Responsibility

HLI-based systems such as self-driving drones and vehicles will act autonomously in our world. For these systems, a challenging question is: who is liable when a self-driving system is involved in a crash or failure?

Source: MIT AI Risk Repository (mit599)

ENTITY

2 - AI

INTENT

2 - Unintentional

TIMING

2 - Post-deployment

Risk ID

mit599

Domain lineage

6. Socioeconomic and Environmental

262 mapped risks

6.5 > Governance failure

Mitigation strategy

1. Establish a comprehensive, risk-based legal framework that explicitly assigns liability based on the level of autonomous control (SAE Level 0–5), ensuring clear lines of responsibility transition from the human operator (Levels 0–2) to the manufacturer/system provider/fleet operator (Levels 4–5).

2. Institute a strict liability or specialized insurance compensation mechanism for high-automation accidents (Levels 4 and 5) to overcome the burden of proving negligence or a design/software defect and to ensure swifter victim compensation.

3. Mandate the installation and standardization of "black box" Event Data Recorders (EDRs) in all autonomous systems, and establish clear regulations for the ownership, access, and public disclosure of system logs to enable objective post-incident investigation and causal attribution.
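The level-based liability assignment in point 1 could be sketched as a simple rule table. This is a hypothetical illustration, not an actual legal framework: the treatment of Level 3 (conditional automation) is an assumption on our part, since the strategy above only specifies Levels 0–2 and 4–5.

```python
def liable_party(sae_level: int) -> str:
    """Return the presumptively liable party for a given SAE autonomy level.

    Illustrative sketch of a risk-based liability framework; the Level 3
    treatment is an assumption (handover disputes are often case-by-case).
    """
    if not 0 <= sae_level <= 5:
        raise ValueError("SAE level must be between 0 and 5")
    if sae_level <= 2:
        # Levels 0-2: the human operator retains supervisory control
        return "human operator"
    if sae_level == 3:
        # Level 3: conditional automation; liability contested at handover
        return "shared (case-by-case)"
    # Levels 4-5: high/full automation
    return "manufacturer/system provider/fleet operator"


print(liable_party(1))  # human operator
print(liable_party(5))  # manufacturer/system provider/fleet operator
```

A real framework would of course attach many more conditions (maintenance status, software version, misuse), which is exactly the causal-attribution gap that the EDR mandate in point 3 is meant to close.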