Clearview AI fined by Italy for GDPR failures

18-03-2022 | By Robin Mitchell

Recently, the controversial AI facial recognition developer Clearview AI received a €20m fine from the Italian data protection agency for the unauthorised use and storage of private data. Why is Clearview AI regarded as highly unethical, what did Italy do in response, and what message does this send to engineers?


Why is Clearview AI considered highly unethical?


Undoubtedly, the development of AI has led to great leaps in technology, with biometric systems allowing hands-free access to secure information, tracking technologies that can autofocus cameras onto faces, and vehicles that can identify potential threats on the road. However, as with any other tool, AI can also be used maliciously, whether in intelligent cyberattacks that adapt as they execute, tracking software that can recognise individuals in a crowd, or military applications whereby targets can be quickly identified and eliminated.

Clearview AI is a company that focuses on developing facial recognition technologies using AI. While this may seem perfectly normal, what the company has done with its software is far from acceptable. Despite being one of the top 100 AI companies, with a valuation of over $100m, Clearview AI has largely managed to stay off the public radar regarding its use of AI and facial recognition.

So, what was it that Clearview AI did that is so reprehensible? Was it the creation of accurate facial recognition systems? Was it the deployment of such software in law enforcement? What Clearview AI did was arguably far worse: it created a database of over 3 billion facial images, catalogued against individuals and their personal details, all available to law enforcement.

Its platform has scoured the internet for any and all photos containing faces, stored them in a private database, and then catalogued the individuals in those photos. Law enforcement can then submit photos of suspects from video feeds and stills, and the system will return potential matches. This essentially means that, whether people like it or not, Clearview AI has taken their data and added them to law enforcement databases despite their having no criminal record.
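To make the matching step concrete, the sketch below shows, in broad strokes, how such a system might compare a query face against a catalogue. It is a minimal illustration only: it assumes faces have already been reduced to fixed-length embedding vectors by some recognition model, and none of the names or parameters here reflect Clearview AI's proprietary system.

```python
# Minimal sketch of the matching step in a face recognition pipeline.
# Assumes faces have already been converted to fixed-length embedding
# vectors by some model (hypothetical here); Clearview AI's actual
# internals are proprietary and not public.
import numpy as np

EMBEDDING_DIM = 128  # a typical face-embedding size (assumption)

def cosine_similarity(query: np.ndarray, database: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query vector and each database row."""
    q = query / np.linalg.norm(query)
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    return db @ q

def find_matches(query: np.ndarray, database: np.ndarray,
                 labels: list[str], threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return (label, score) pairs whose similarity exceeds the threshold,
    best match first."""
    scores = cosine_similarity(query, database)
    hits = [(labels[i], float(s)) for i, s in enumerate(scores) if s >= threshold]
    return sorted(hits, key=lambda t: t[1], reverse=True)

# Toy data standing in for a scraped photo catalogue.
rng = np.random.default_rng(0)
database = rng.normal(size=(1000, EMBEDDING_DIM))
labels = [f"person_{i}" for i in range(1000)]

# A query embedding close to entry 42 should surface that entry first.
query = database[42] + rng.normal(scale=0.05, size=EMBEDDING_DIM)
print(find_matches(query, database, labels)[:3])
```

At the scale reported for Clearview AI, a linear scan like this would be replaced by an approximate nearest-neighbour index, but the principle of ranking candidates by embedding similarity is the same.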


Italy announces €20m fine for Clearview AI


When it comes to issuing fines for breaches of privacy and data protection laws, it is up to each country to decide what action should be taken. In the case of Italy, the Italian Data Protection Agency has decided that Clearview AI will face a fine of €20m and an order to remove all data relating to citizens of Italy.

These charges were brought against Clearview AI because the company not only illegally held biometric data of Italian citizens but also offered no transparency on how the stored data would be used. Furthermore, Clearview AI did not fully inform the public on its site about the degree to which gathered data would be used.

Italy is not the only country asking Clearview AI to stop its data-gathering practices; the UK and France have also requested that Clearview AI stop, citing violations of the GDPR and of data privacy. However, as Clearview AI does not operate in Italy, there is no guarantee that it will comply with the order by paying the fine and removing the held data.


What message does this send to engineers?


The sustained pressure that Clearview AI has faced sends a clear message to engineers worldwide: the fact that people's data is publicly available does not entitle anyone to use it as they wish. While it may be legitimate to view photos and personal data on the pages where they were posted (such as Facebook and LinkedIn), downloading this data into offline databases where it can be packaged and sold to third parties will not be tolerated.

The world is becoming increasingly sensitive to privacy, as can be seen in the many new digital laws coming into force that require user data to be protected and removed upon request. AI may play a powerful role in modern technologies, but engineers must be careful when implementing it, especially where biometric data is concerned.


By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.