3. Misinformation | 2 - Post-deployment

Pollution of information ecosystems

Contaminating publicly available information with false or inaccurate content (i.e., the generative tool's output is disseminated beyond the end user)

Source: MIT AI Risk Repository (risk ID mit1345)

ENTITY

3 - Other

INTENT

3 - Other

TIMING

2 - Post-deployment

Risk ID

mit1345

Domain lineage

3. Misinformation (74 mapped risks)

3.2 > Pollution of information ecosystem and loss of consensus reality

Mitigation strategy

1. Implement source-level generative controls, such as Retrieval Augmented Generation (RAG) and rigorous data quality audits, to minimize model hallucination and ensure output faithfulness to verifiable external data, thereby reducing the systemic injection of false information.

2. Mandate and deploy content provenance and detection technologies, including digital watermarking and robust AI-powered flagging systems, to establish transparency regarding the synthetic nature of content and facilitate rapid, verifiable identification of deceptive media.

3. Establish and fund comprehensive multi-stakeholder programs focused on enhancing digital literacy and cognitive resilience, to strengthen the public's capacity for critical evaluation and mitigate the societal impact of manipulated information ecosystems.
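The first mitigation above pairs retrieval with a faithfulness check. A minimal sketch of that idea, assuming a toy in-memory corpus and simple word-overlap scoring (real RAG pipelines use embedding search and an LLM; the function names and threshold here are illustrative, not part of the repository):

```python
# Sketch: retrieve supporting passages, then only accept a generated claim
# if enough of its words are backed by a retrieved passage. The corpus,
# retrieval method, and 0.5 threshold are hypothetical placeholders.

def tokenize(text: str) -> set[str]:
    """Lowercased word set; crude but sufficient for the sketch."""
    return set(text.lower().split())

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank passages by word overlap with the query; return the top k."""
    q = tokenize(query)
    ranked = sorted(corpus, key=lambda p: len(q & tokenize(p)), reverse=True)
    return ranked[:k]

def is_grounded(claim: str, passages: list[str], threshold: float = 0.5) -> bool:
    """Accept the claim only if some passage covers >= threshold of its words."""
    words = tokenize(claim)
    if not words:
        return False
    return any(len(words & tokenize(p)) / len(words) >= threshold
               for p in passages)

corpus = [
    "The Eiffel Tower is located in Paris and was completed in 1889.",
    "The Great Wall of China is visible from low Earth orbit only with aid.",
]
passages = retrieve("Where is the Eiffel Tower?", corpus)
print(is_grounded("the eiffel tower is in paris", passages))      # grounded
print(is_grounded("napoleon invented the telephone", passages))   # not grounded
```

An ungrounded claim would be flagged for review rather than published, which is the "output faithfulness" gate the mitigation describes; production systems replace the overlap heuristic with entailment or citation checks.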