3. Misinformation

Widespread use of persuasive tools contributes to splintered epistemic communities

Even without deliberate misuse, widespread use of powerful persuasion tools could have negative impacts. If such tools were used by many different groups to advance many different ideas, we could see the world splintering into isolated “epistemic communities”, with little room for dialogue or transfer between communities. A similar scenario could emerge via the increasing personalisation of people’s online experiences—in other words, we may see a continuation of the trend towards “filter bubbles” and “echo chambers”, driven by content selection algorithms, that some argue is already happening [3, 25, 51].

Source: MIT AI Risk Repository (mit903)

ENTITY

1 - Human

INTENT

2 - Unintentional

TIMING

3 - Other

Risk ID

mit903

Domain lineage

3. Misinformation > 3.2 Pollution of information ecosystem and loss of consensus reality (74 mapped risks)

Mitigation strategy

- Implement mandatory platform design constraints, such as 'algorithmic choice' or minimum diversity requirements, to actively counteract personalization effects and ensure users are minimally exposed to high-quality, decision-relevant information outside of their existing epistemic communities.
- Launch comprehensive, multidisciplinary media literacy and critical thinking campaigns to inoculate the public against manipulative persuasion techniques and to foster the behavioral capacity for seeking and evaluating diverse viewpoints outside of algorithmic filter bubbles.
- Establish institutional and technological mechanisms for the transparent signaling of trustworthy information sources, thereby increasing the cost for bad actors to spread unsupported content and facilitating citizens' ability to assess the reliability of information.
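The "minimum diversity requirement" mitigation above could be realized as a re-ranking constraint in a content selection pipeline. The sketch below is a minimal, hypothetical illustration (not from the repository or any named platform): given a relevance-ranked feed where each item carries a community label, it guarantees that at least a fixed number of the top-k items come from outside the user's own epistemic community. All function and parameter names here are illustrative assumptions.

```python
from typing import List, Tuple


def rerank_with_diversity(
    ranked_items: List[Tuple[str, str]],  # (item_id, community_label), best-ranked first
    user_community: str,
    k: int,
    min_outside: int,
) -> List[str]:
    """Return up to k item IDs, guaranteeing that at least `min_outside`
    of them come from communities other than the user's own (when such
    items exist in the candidate pool)."""
    inside = [item for item, comm in ranked_items if comm == user_community]
    outside = [item for item, comm in ranked_items if comm != user_community]

    # Reserve slots for out-of-community items, capped by availability and k.
    n_outside = min(min_outside, len(outside), k)
    picked = inside[: k - n_outside] + outside[:n_outside]

    # Backfill from the original ranking if either pool ran short.
    remaining = [item for item, _ in ranked_items if item not in picked]
    picked += remaining[: k - len(picked)]
    return picked[:k]
```

In a real system the quota would interact with relevance scoring, freshness, and policy constraints; the point of the sketch is only that a diversity floor can be enforced as a post-hoc re-ranking step rather than by redesigning the ranking model itself.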