Meta Receives Cease-and-Desist Letter Over AI Training Practices


EU privacy advocacy group NOYB has issued a firm response to Meta regarding its plans to use the data of European users for AI model training. In a cease-and-desist letter addressed to Meta’s Irish subsidiary and signed by founder Max Schrems, the organization demanded that Meta justify these actions or potentially face legal repercussions.

In April, Meta announced its intentions to commence the training of its generative AI models using data from its user base in Europe.

Schrems presents several key arguments against Meta in NOYB’s letter:

1. Questionable Legitimacy of Meta’s Interests
NOYB critically assesses Meta’s reliance on an opt-out mechanism, suggesting that the company should require explicit consent from all EU users rather than allowing them the option to decline. NOYB asserts that this could expose Meta to significant legal vulnerabilities. According to NOYB, companies seeking to process personal data without explicit consent must substantiate a legitimate interest in doing so, as stipulated by the General Data Protection Regulation (GDPR). However, Meta has not provided clarity on how it justifies such interests. Schrems emphasizes the difficulty in reconciling the general-purpose training of AI models with the GDPR principle that mandates data processing for specific purposes. Furthermore, NOYB raises skepticism about Meta’s capacity to uphold GDPR rights, such as the right to be forgotten, once an AI system has been trained on the data.

2. Potential Risks to Non-Users
Schrems highlights the risks to individuals who may not have a Facebook account yet are nonetheless depicted or referenced within the data of active users. These individuals could remain unaware of their data being utilized in AI training, thereby lacking the opportunity to object to its use.

3. Challenges in Data Differentiation
NOYB expresses concerns regarding Meta’s ability to differentiate data ownership accurately. For example, complications could arise if two individuals appear together in a photo, with differing opt-in or opt-out statuses regarding AI training. The challenge intensifies when considering ‘special category’ data, as defined by GDPR, which encompasses more sensitive types of personal information.

Given prior legal submissions from Meta, NOYB questions whether the company can effectively maintain a clear distinction between users who opted out of AI training and those who did not.

Additional considerations include the expectations of users who entered their data into the platform over many years, potentially unaware that such information may now be repurposed for AI training. NOYB argues that current EU laws, including the Digital Markets Act, would further complicate or render illegal the proposed AI training activities.

Meta has claimed not to use private messages for AI training and initially postponed its plans following challenges from the Irish Data Protection Commission. Recently, the company stated that it had engaged in constructive discussions with the regulator. However, the Commission’s office has since remained silent, apart from a note of gratitude to the European Data Protection Board regarding an opinion issued in December that deferred specific AI training policy decisions to national regulators.

NOYB has set a deadline of May 21 for Meta to justify its actions or face potential legal action by May 27, coinciding with the planned commencement of AI training. Should NOYB proceed under the EU’s Collective Redress Scheme, it may seek an injunction across multiple jurisdictions to halt the training process and initiate data deletion. A class action suit also remains an option, as indicated by Schrems.

In a statement to Reuters, Meta countered NOYB’s position, asserting that the organization is mistaken on both the facts and the law, and maintained that it provides its users with sufficient opt-out options.