Brief

"On December 18, 2024, the European Data Protection Board issued an update regarding EDPB opinion on AI models: GDPR principles support responsible AI. The EDPB adopted an opinion on using personal data for developing and deploying AI models, focusing on anonymity, legitimate interest, and unlawfully processed data."

Brussels, 18 December – The European Data Protection Board (EDPB) has adopted an opinion* on the use of personal data for the development and deployment of AI models. This opinion looks at 1) when and how AI models can be considered anonymous, 2) whether and how legitimate interest can be used as a legal basis for developing or using AI models, and 3) what happens if an AI model is developed using personal data that was processed unlawfully. It also considers the use of first- and third-party data.

The opinion was requested by the Irish Data Protection Authority (DPA) with a view to seeking Europe-wide regulatory harmonisation. To gather input for this opinion, which deals with fast-moving technologies that have an important impact on society, the EDPB organised a stakeholders’ event and had an exchange with the EU AI Office.
EDPB Chair Anu Talus said: “AI technologies may bring many opportunities and benefits to different industries and areas of life. We need to ensure these innovations are done ethically, safely, and in a way that benefits everyone. The EDPB wants to support responsible AI innovation by ensuring personal data are protected and in full respect of the General Data Protection Regulation (GDPR).”

Regarding anonymity, the opinion says that whether an AI model is anonymous should be assessed on a case-by-case basis by the DPAs. For a model to be anonymous, it should be very unlikely (1) to directly or indirectly identify individuals whose data was used to create the model, and (2) to extract such personal data from the model through queries. The opinion provides a non-prescriptive and non-exhaustive list of methods to demonstrate anonymity.
With respect to legitimate interest, the opinion provides general considerations that DPAs should take into account when they assess if legitimate interest is an appropriate legal basis for processing personal data for the development and the deployment of AI models.
A three-step test helps assess the use of legitimate interest as a legal basis. The EDPB gives the examples of a conversational agent to assist users, and the use of AI to improve cybersecurity. These services can be beneficial for individuals and can rely on legitimate interest as a legal basis, but only if the processing is shown to be strictly necessary and the balancing of rights is respected.
The opinion also includes a number of criteria to help DPAs assess if individuals may reasonably expect certain uses of their personal data. These criteria include: whether or not the personal data was publicly available, the nature of the relationship between the individual and the controller, the nature of the service, the context in which the personal data was collected, the source from which the data was collected, the potential further uses of the model, and whether individuals are actually aware that their personal data is online.
If the balancing test shows that the processing should not take place because of the negative impact on individuals, mitigating measures may limit this negative impact. The opinion includes a non-exhaustive list of examples of such mitigating measures, which can be technical in nature, or make it easier for individuals to exercise their rights or increase transparency.
Finally, when an AI model was developed with unlawfully processed personal data, this could have an impact on the lawfulness of its deployment, unless the model has been duly anonymised.
Considering the scope of the request from the Irish DPA, the vast diversity of AI models and their rapid evolution, the opinion aims to give guidance on various elements that can be used for conducting a case-by-case analysis.
In addition, the EDPB is currently developing guidelines covering more specific questions, such as web scraping.
Note to editors: *An Article 64(2) opinion addresses a matter of general application or produces effects in more than one Member State.


Purpose:

The European Data Protection Board (EDPB) has adopted an opinion on the use of personal data for the development and deployment of Artificial Intelligence (AI) models. This opinion aims to provide guidance on when and how AI models can be considered anonymous, whether legitimate interest can be used as a legal basis for developing or using AI models, and what happens if an AI model is developed using personal data that was processed unlawfully.

The purpose of the opinion is to support responsible AI innovation by ensuring that personal data are protected in full respect of the General Data Protection Regulation (GDPR). To gather input for the opinion, which deals with fast-moving technologies that have an important impact on society, the EDPB organised a stakeholders’ event and had an exchange with the EU AI Office.

The opinion was requested by the Irish Data Protection Authority (DPA) with a view to seeking Europe-wide regulatory harmonisation. EDPB Chair Anu Talus emphasised that “AI technologies may bring many opportunities and benefits to different industries and areas of life,” but that it is essential to ensure these innovations are carried out ethically, safely, and in a way that benefits everyone.

Effects on Industry:

The adoption of this opinion will have significant effects on the AI industry. Firstly, it provides clarity on when and how AI models can be considered anonymous, which is crucial for ensuring the protection of personal data used in AI development and deployment. Secondly, it offers guidance on whether legitimate interest can be used as a legal basis for processing personal data for AI development and deployment.

The opinion also highlights the importance of considering the use of first- and third-party data in AI models. This has implications for companies that develop or use AI models, as they must ensure that their practices comply with GDPR requirements. The EDPB’s three-step test for assessing the use of legitimate interest as a legal basis gives companies a framework to follow, as sketched below.
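As a purely illustrative aid (not an EDPB artefact), the Python sketch below records the three steps of the test (a legitimate interest, necessity of the processing, and the balancing of rights) as an internal checklist; the class, field names, and example values are assumptions made for demonstration.

```python
from dataclasses import dataclass, field

@dataclass
class LegitimateInterestTest:
    """Illustrative record of the EDPB three-step test for one processing purpose."""
    purpose: str
    interest_is_legitimate: bool        # step 1: a lawful, clearly articulated interest
    processing_is_necessary: bool       # step 2: no less intrusive means would achieve the purpose
    balancing_favours_controller: bool  # step 3: individuals' rights do not override the interest
    mitigating_measures: list = field(default_factory=list)  # documented measures limiting impact

    def legal_basis_available(self) -> bool:
        # All three steps must be satisfied for legitimate interest to serve as the legal basis.
        return (self.interest_is_legitimate
                and self.processing_is_necessary
                and self.balancing_favours_controller)


# Hypothetical example inspired by the opinion's cybersecurity illustration.
test = LegitimateInterestTest(
    purpose="AI-assisted detection of malicious traffic (cybersecurity)",
    interest_is_legitimate=True,
    processing_is_necessary=True,
    balancing_favours_controller=True,
    mitigating_measures=["pseudonymisation before training", "enhanced transparency notices"],
)
print(test.legal_basis_available())  # True only when every step passes
```

A record of this kind can only support documentation; the substantive assessment remains a legal judgement made case by case.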

Additionally, the opinion emphasizes the need for mitigating measures when processing personal data in AI development and deployment. This will require companies to implement technical measures or increase transparency to limit the negative impact on individuals. The non-exhaustive list of examples of such mitigating measures provides guidance for companies to consider.

Relevant Stakeholders:

The stakeholders affected by this opinion include:

  1. Companies that develop or use AI models, including those in various industries such as healthcare, finance, and technology.
  2. Individuals whose personal data are used in AI development and deployment.
  3. Data Protection Authorities (DPAs) in the European Union, who must assess whether legitimate interest can be used as a legal basis for processing personal data for AI development and deployment.
  4. The Irish DPA, which requested the opinion with a view to seeking Europe-wide regulatory harmonisation.

Next Steps:

To comply with this opinion, companies that develop or use AI models should take the following steps (an illustrative tracking sketch follows the list):

  1. Assess whether their AI models can be considered anonymous and take steps to ensure anonymity if necessary.
  2. Consider using legitimate interest as a legal basis for processing personal data for AI development and deployment, but only after conducting the three-step test provided by the EDPB.
  3. Implement mitigating measures when processing personal data in AI development and deployment to limit the negative impact on individuals.
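As a rough, hypothetical illustration of how these steps might be tracked internally, the sketch below derives a list of open compliance actions from an assumed status record for one AI model; the field names and action wording are assumptions for demonstration, not terminology taken from the opinion.

```python
def open_actions(status: dict) -> list:
    """Return outstanding compliance actions for one AI model.

    `status` is an assumed internal record, for example:
    {"anonymity_assessed": False, "legal_basis_documented": True, "mitigations": []}
    """
    actions = []
    if not status.get("anonymity_assessed", False):
        actions.append("Assess whether the model can be considered anonymous (case-by-case)")
    if not status.get("legal_basis_documented", False):
        actions.append("Document the legal basis, e.g. via the EDPB three-step test")
    if not status.get("mitigations"):
        actions.append("Record mitigating measures limiting the negative impact on individuals")
    return actions


# Example: a model whose legal basis and mitigating measures are still undocumented.
print(open_actions({"anonymity_assessed": True, "legal_basis_documented": False, "mitigations": []}))
```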

The EDPB will continue to develop guidelines covering more specific questions, such as web scraping. Companies should stay informed about these developments to ensure compliance with relevant regulations.

Any Other Relevant Information:

This opinion is part of a broader effort by the EDPB to provide guidance on AI-related issues, and the stakeholders’ event and exchange with the EU AI Office reflect the importance of considering the impact of fast-moving technologies like AI on society.

The adoption of this opinion demonstrates the EDPB’s commitment to supporting responsible AI innovation while ensuring the protection of personal data in line with the GDPR requirements. The opinion will have significant effects on the AI industry and relevant stakeholders, as outlined above.

European Data Protection Board
