5. Human-Computer Interaction
2 - Post-deployment

Overreliance on AI system undermining user autonomy

AI systems can undermine human autonomy if they encourage users to habitually trust the AI's suggestions without sufficiently exercising their own agency. Over time, a user may develop unjustified trust in or dependence on the system, or rely on its advice for tasks outside the system's domain of expertise [205, 42]. In particular, less confident users (or users in emotional distress) can be more prone to "overtrust" a system [219].

Source: MIT AI Risk Repository (mit1174)

ENTITY

1 - Human

INTENT

2 - Unintentional

TIMING

2 - Post-deployment

Risk ID

mit1174

Domain lineage

5. Human-Computer Interaction

92 mapped risks

5.2 > Loss of human agency and autonomy

Mitigation strategy

1. Establish Human-in-the-Loop (HITL) and Human-on-the-Loop (HOTL) Governance Models: Implement mandatory human review, validation, and the capability to override all AI-generated outputs within critical decision-making processes. This ensures human oversight serves as the final line of defense against consequential errors stemming from uncritical overreliance (Source 2, 11, 14).

2. Design for Calibrated Trust via Enhanced Transparency and Explainability: Integrate mechanisms that actively communicate the AI system's operational boundaries, probabilistic uncertainty (e.g., highlighting low-confidence outputs), and a simplified explanation of its reasoning. This enables users to develop accurate mental models, fostering appropriate reliance and promoting algorithmic vigilance (Source 3, 6, 10, 11).

3. Mandate Continuous Critical Skill Development and AI Literacy Training: Institute ongoing educational programs designed to enhance user-side critical thinking, domain expertise, and independent problem-solving skills. The goal is to counteract cognitive offloading and dependency, ensuring the workforce maintains the necessary competencies to effectively monitor and challenge AI recommendations (Source 2, 5, 7, 18).
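The HITL escalation in strategy 1 and the low-confidence flagging in strategy 2 can be sketched as a simple confidence-gated routing function. This is an illustrative sketch only: the names, the `0.85` threshold, and the `human_review` callback are assumptions for the example, not part of the repository entry or any specific framework.

```python
from dataclasses import dataclass
from typing import Callable

# Assumed policy threshold; in practice this would be tuned per deployment
# and per decision criticality.
CONFIDENCE_THRESHOLD = 0.85


@dataclass
class AIOutput:
    prediction: str
    confidence: float  # model-reported probability in [0, 1]


@dataclass
class Decision:
    value: str
    source: str                    # "ai" or "human"
    flagged_low_confidence: bool   # surfaced to the user (strategy 2)


def route_decision(output: AIOutput,
                   human_review: Callable[[AIOutput], str]) -> Decision:
    """Gate an AI output: low-confidence results are escalated to a human
    reviewer (HITL); high-confidence results pass through but remain
    overridable downstream (HOTL)."""
    if output.confidence < CONFIDENCE_THRESHOLD:
        # Escalate: the human reviewer makes the final call, and the
        # low-confidence flag is recorded for transparency.
        value = human_review(output)
        return Decision(value=value, source="human",
                        flagged_low_confidence=True)
    return Decision(value=output.prediction, source="ai",
                    flagged_low_confidence=False)


# Example: a low-confidence output is routed to the human reviewer.
reviewed = route_decision(AIOutput("approve", 0.55),
                          human_review=lambda o: "deny")
# reviewed.source == "human", reviewed.flagged_low_confidence == True
```

A real deployment would also log both branches, so that override rates and flag frequencies can feed the monitoring and training programs described in strategy 3.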