In response to the developments surrounding Meta's plans to use user data for training its AI systems, Stephen Almond, the ICO's Executive Director for Regulatory Risk, indicated that the ICO will continue to closely monitor major developers of generative AI, including Meta. The aim is to review the safeguards these companies have put in place and ensure that the information rights of UK users are protected. This highlights the importance of transparency and trustworthiness in the development and deployment of AI technologies, as well as the need for companies to comply with data protection regulations.
Meta relied on a GDPR provision called "legitimate interests" to justify its actions as compliant with the regulations. This is not the first time Meta has invoked this legal basis: it previously used it to justify processing European users' data for targeted advertising. The company maintains that this legal basis strikes the most appropriate balance for processing public data at the scale necessary to train AI models while respecting people's rights.
The main concerns raised by the privacy activist organization NOYB (None of Your Business) regarding Meta's planned privacy policy changes are related to the use of user data for AI training and the lack of proper consent from users. NOYB argues that Meta's plans to use years of everyone's posts, including images, to develop and improve AI at Meta, contravene various facets of the General Data Protection Regulation (GDPR).
One of the main issues is the opt-in versus opt-out approach. Under GDPR, when personal data is processed for the purpose of improving AI systems, users should be asked for their permission first, rather than being required to take action to refuse. Meta, however, relied on the "legitimate interests" basis, which does not involve obtaining explicit user consent.
Another concern is that Meta made it difficult for users to opt out of having their data used. Users had to complete an objection form setting out their arguments for why they didn't want their data processed, and it was entirely at Meta's discretion whether the request was honored. The objection form was also not easy to find, hidden behind multiple clicks and buried in the privacy settings.
NOYB founder Max Schrems stated that Meta's approach is the opposite of GDPR compliance, as it allows the company to use any data from any source for any purpose and make it available to anyone in the world, as long as it is done via AI technology. This lack of specificity and control over how user data is used and shared raises significant privacy concerns.