5. Human-Computer Interaction

Unhealthy or dangerous human-EAI relationships

Constant access to and interaction with EAI systems could foster dangerous human dependence or romantic attachment [115]. People may depend on EAI systems for physical pleasure [116]. The physical presence and human-like features of EAI systems may significantly amplify the dependency issues already observed with conversational AI [117, 118]. People may easily fall in love with EAI systems, only to be distraught when these systems are altered or have their memories reset [119].

Source: MIT AI Risk Repository (mit1434)

ENTITY: 3 - Other

INTENT: 2 - Unintentional

TIMING: 2 - Post-deployment

Risk ID: mit1434

Domain lineage: 5. Human-Computer Interaction (92 mapped risks) > 5.1 Overreliance and unsafe use

Mitigation strategy

1. Establish Transparency and Disclosures: Mandate clear, persistent disclosures regarding the non-human nature, limitations, and operational life cycle of EAI systems to proactively manage user expectations, specifically addressing potential system alterations or memory resets that could cause significant emotional distress.

2. Implement Human-Centered Design for Autonomy: Design EAI systems and their interaction models to preserve and promote user autonomy, avoiding design features that intentionally or unintentionally foster unhealthy psychological dependence or romantic attachment.

3. Integrate Social and Psychological Risks into AI Governance: Incorporate risks related to synthetic relationships and overreliance into the organization's AI governance and safety strategy, requiring rigorous pre-deployment ethics and psychological risk assessments to prevent the deployment of emotionally exploitative or harmful interaction patterns.