None of Your Business
Meta tried to rely on the ‘legitimate interest’ clause to target users with advertising, but the European Court of Justice ruled against it, so a number of privacy advocacy groups are now filing cease-and-desist letters with Meta, setting the stage for a return to the courts. Should Meta lose the case, it would not only be barred from using the personal information but could also be liable for damages, which are quite severe under GDPR. Worse, once personal data is included in Meta’s training data, the company would be unable to comply with other GDPR rules, such as those that require user data to be forgotten upon request, or those that give users access to their individual personal data and the ability to correct inaccurate data. Because Meta’s models are open-source, once the personal data is incorporated into the training material it cannot be called back and removed.
What makes this unusual is that there is a simple solution: just ask users for their consent. Instead, Meta is making the case that any difficulty in using this data would strip the ‘local flavor’ from its training data and hinder the development of AI in the EU, letting the US and China take the lead. In fact, if Meta made it clear that the data would be anonymized, it would most likely receive enough opt-ins to capture meaningful local trends. The claim that it must absorb all personal data from Facebook and Instagram users in the EU is hard to credit, as other AI companies, such as OpenAI (pvt), the leader in the AI space, and France’s own Mistral (pvt), do not have any personal information from social media in their training data yet still perform in line with or better than those that do.
Meta’s ‘legitimate interest’ ploy is forcing a number of privacy groups to seek injunctions before the proposed date on which Meta will begin accessing user personal information. Filings are being made in Ireland, where Meta’s EU headquarters resides, and in other jurisdictions. If Meta were to lose in the EU, not only would it have to stop processing user personal information, it would also have to delete any AI systems trained with that data, and if EU data were mixed with non-EU data during training, the entire model would have to be deleted. The injunctions also stop the statute-of-limitations clock, and every day that Meta continued to use what would then be illegal data would increase the potential damage claims by users or through the EU equivalent of class-action suits. The privacy groups estimate that even if damages were only €500 per user, with ~400 million monthly active Meta users in Europe the aggregate liability would exceed €200 billion. It would be a lot easier to just ask users to opt in.
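To make the privacy groups’ back-of-envelope estimate concrete, here is a minimal sketch of the arithmetic. The €500-per-user figure and ~400 million monthly active users come from the estimate above; the other per-user amounts in the sensitivity sweep are purely illustrative assumptions.

```python
# Back-of-envelope estimate of aggregate GDPR damage exposure.
# Assumptions (from the privacy groups' figures cited above):
#   - ~400 million monthly active Meta users in Europe
#   - €500 in damages per affected user
eu_monthly_active_users = 400_000_000
damages_per_user_eur = 500

aggregate_eur = eu_monthly_active_users * damages_per_user_eur
print(f"Aggregate exposure: €{aggregate_eur / 1e9:.0f} billion")  # ~€200 billion

# Illustrative sensitivity to the per-user amount (hypothetical values):
for per_user in (100, 500, 1_000):
    total = eu_monthly_active_users * per_user
    print(f"€{per_user}/user -> €{total / 1e9:.0f} billion")
```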