
Incomplete usage definition

Since foundation models can be used for many purposes, a model’s intended use is important for defining the relevant risks of that model. As the use changes, the relevant risks might correspondingly change.

Source: MIT AI Risk Repository (mit1323)

ENTITY

1 - Human

INTENT

2 - Unintentional

TIMING

1 - Pre-deployment

Risk ID

mit1323

Domain lineage

6. Socioeconomic and Environmental

262 mapped risks

6.5 > Governance failure

Mitigation strategy

1. **Establish and Enforce a Constrained Intended Purpose.** Mandate a legally and technically precise definition of the **Intended Purpose** of the foundation model (FM) at the design stage. This definition must explicitly delineate the scope of acceptable downstream tasks, the required operating environments, and a comprehensive list of known prohibited uses. For models whose weights are openly released, this necessitates restrictive licensing or embedded technical mechanisms to prevent fine-tuning that is likely to circumvent established safety guardrails.

2. **Implement Continuous Use-Case Monitoring and Marginal Risk Assessment.** Establish ongoing mechanisms to monitor the model's practical application in diverse, real-world settings and to detect use-case drift or the emergence of unassessed capabilities. Couple this monitoring with regular **Marginal Risk Analysis** to systematically evaluate the incremental risk introduced by new applications or model adaptations relative to the initial safety baseline, and to inform timely mitigation responses.

3. **Institute Cross-Value-Chain Governance and Accountability.** Embed responsibility for adhering to the Intended Purpose and for mitigating use-drift risks across all actors in the AI value chain, including model providers, adapters, and application developers. This requires formal **AI governance frameworks** that define clear roles, responsibilities, and escalation protocols for addressing the discovery of unapproved or high-risk uses.
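The monitoring loop described above can be sketched in code. The following is a minimal, hypothetical illustration (all class and field names are assumptions, not a real API): observed use cases are checked against an approved Intended Purpose baseline, and anything prohibited or unassessed is flagged for marginal risk review or escalation.

```python
from dataclasses import dataclass, field

@dataclass
class IntendedPurpose:
    """Hypothetical baseline: approved downstream tasks and known prohibited uses."""
    approved_tasks: set
    prohibited_uses: set

@dataclass
class UsageMonitor:
    """Classifies observed uses against the baseline and flags drift."""
    purpose: IntendedPurpose
    flagged: list = field(default_factory=list)

    def record(self, task: str) -> str:
        if task in self.purpose.prohibited_uses:
            self.flagged.append(task)   # escalate per governance protocol
            return "prohibited"
        if task in self.purpose.approved_tasks:
            return "approved"
        self.flagged.append(task)       # use-case drift -> marginal risk analysis
        return "unassessed"

monitor = UsageMonitor(IntendedPurpose(
    approved_tasks={"summarization", "translation"},
    prohibited_uses={"biometric identification"},
))
print(monitor.record("summarization"))   # within the Intended Purpose
print(monitor.record("medical triage"))  # unassessed drift, queued for review
```

In practice the classification step would draw on telemetry and deployment reporting rather than exact string matching, but the control flow (approved / prohibited / unassessed) mirrors the escalation protocol the strategy calls for.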