Clearview AI facing £17m fine in the UK for breach of data protection laws

08-12-2021 | By Robin Mitchell

American company Clearview AI is currently fighting a battle with the UK’s ICO after being issued a £17m fine for multiple data rights violations. Who is Clearview AI, why are they facing penalties in the UK, and what does this tell engineers about the dangers of personal information?


Who is Clearview AI?


Clearview AI is an American technology company that develops facial recognition systems for law enforcement, universities, and private clients. The company, founded in 2017, gathers photos of individuals online and compiles them into a large database. Submitted photos of faces can then be matched against this database, with matching results returning the associated database entry, including the source of the image. According to Clearview AI, photos of faces are sourced from a wide range of public sites, including news media, mugshot websites, and social media.
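To illustrate the general idea of such a service (this is a hypothetical sketch, not Clearview AI's actual implementation), a face-matching system typically converts each gathered photo into a numerical embedding and compares a submitted photo against the stored embeddings to find likely matches. The class and variable names below (FaceDatabase, add_face, match) are assumptions made for illustration, and random vectors stand in for the output of a real face-recognition model.

```python
import numpy as np

class FaceDatabase:
    """Minimal sketch of a face-matching index (illustrative only).

    Embeddings are fixed-length vectors that a face-recognition model
    would produce for each photo; the model itself is out of scope here.
    """

    def __init__(self):
        self.embeddings = []  # one vector per indexed photo
        self.sources = []     # where each photo was gathered from

    def add_face(self, embedding, source_url):
        # Store a unit-length copy of the embedding alongside its source
        self.embeddings.append(embedding / np.linalg.norm(embedding))
        self.sources.append(source_url)

    def match(self, query, threshold=0.8):
        """Return (similarity, source) pairs whose faces resemble the query."""
        query = query / np.linalg.norm(query)
        results = []
        for emb, src in zip(self.embeddings, self.sources):
            similarity = float(np.dot(query, emb))  # cosine similarity
            if similarity >= threshold:
                results.append((similarity, src))
        return sorted(results, reverse=True)

# Example: two indexed faces and one query (random vectors as placeholders)
db = FaceDatabase()
db.add_face(np.random.rand(128), "https://example.com/news-photo")
db.add_face(np.random.rand(128), "https://example.com/social-profile")
print(db.match(np.random.rand(128), threshold=0.5))
```

The key point is that every match returns the source of the original image, which is exactly the kind of linkage between faces and personal profiles that regulators have taken issue with.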

Of all the applications Clearview AI claims to offer, its service is primarily geared towards law enforcement. According to Clearview AI, law enforcement “should have the most cutting-edge technology available to investigate crimes, enhance public safety, and provide justice to victims”.

Clearview AI is of particular interest because it operated in near secrecy while its system was being used by law enforcement agencies. Once its activities became public, many of the sources of its images, including Google, Twitter, and Facebook, sent cease-and-desist letters demanding that Clearview AI delete all content gathered from their sites.


Why is Clearview AI facing fines in the UK?


Recently, the UK Information Commissioner’s Office (ICO) raised significant concerns regarding the data gathered by Clearview AI and issued the company with a provisional fine of £17m. The ICO has also ordered Clearview AI to stop processing UK data and to delete any data it already holds.

This case is interesting because Clearview AI claims it only obtains photos from publicly available sources accessible to anyone with a browser (including law enforcement). From that point of view, the only difference between Clearview AI and Google is that Clearview AI builds a database of faces linked to the sites they came from, which speeds up the task of finding individuals.

However, there is a clear difference between Clearview AI and Google. While Google allows image searching, it does not keep a database of faces linked to personal profiles whose sole purpose is to serve law enforcement. Furthermore, Google’s index is refreshed frequently, and photos that sites mark as hidden or removed are also removed from Google’s results in line with data protection rules. Clearview AI may not have the same capability, and may store data for long periods in a form that remains inaccessible to the original data sources.

Owners of public data (such as profile pictures) may also not know how their data is being used. Even if data is made public, those who produced the data remain its rightful owners, and it cannot be used without their explicit permission. Companies such as Facebook have repeatedly been in trouble for misusing user data. Clearview AI is doing something similar by taking user photos, compiling them into a database without permission, and charging law enforcement for access to that data. Even if this does not breach privacy or data protection laws, it is arguably a violation of copyright.

Other claims made by the UK ICO against Clearview AI include failing to process the data of UK citizens fairly, having no process to prevent data from being retained indefinitely, lacking a lawful basis for collecting the data, and not providing individuals with a way to have it erased.

Currently, the fine is only provisional, and Clearview AI intends to contest it. The results of the investigation and the final ruling are expected next year.


What does this case teach engineers about the importance of personal data?


Privacy and data protection are becoming increasingly important in modern life, and this case clearly demonstrates that data being public does not give anyone the right to use it. This is an interesting argument, because one could counter that the whole point of public data is to make it available to the world; if such data were truly personal and private, it would not be accessible to outside computers, nor would it be posted on sites such as Facebook and Twitter.

This is where the concept of copyright comes in: public images of the Hollywood sign and of the Eiffel Tower at night are freely viewable, but they remain copyrighted, meaning they cannot be redistributed without permission. Because Clearview AI has downloaded publicly posted photos, bundled them into a database, and then sold access to that database, it could be argued that Clearview is charging for the redistribution of publicly available but copyrighted content.

Whatever the outcome, engineers designing products or services that use personal data must be extremely careful about how that data is stored and how those who generated it can access it. Customers must be able to request that their data be deleted, ask what data is gathered, and understand how it will be used.

To help cope with this, engineers can consider creating data inventories that list every piece of data their system will generate, where it comes from, and whether it is owned by someone outside of the device. Appropriate measures can then be implemented to allow users to control that data, as the sketch below illustrates.
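As a rough illustration of that advice (an assumption about how such an inventory might be structured, not a prescribed standard), the sketch below records each piece of data, its origin, its owner, and its purpose, and supports the access and deletion requests mentioned above. The names DataRecord, DataInventory, describe, and erase are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DataRecord:
    """One entry in the system's data inventory."""
    name: str      # e.g. "profile_photo"
    source: str    # where the data comes from (sensor, user upload, web)
    owner: str     # who the data belongs to (user ID, "device", etc.)
    purpose: str   # why the data is collected and how it will be used

@dataclass
class DataInventory:
    records: list = field(default_factory=list)

    def add(self, record):
        self.records.append(record)

    def describe(self, owner):
        """Answer 'what data do you hold about me, and why?'"""
        return [(r.name, r.source, r.purpose)
                for r in self.records if r.owner == owner]

    def erase(self, owner):
        """Handle a deletion request by removing everything that owner owns."""
        self.records = [r for r in self.records if r.owner != owner]

# Example usage
inv = DataInventory()
inv.add(DataRecord("profile_photo", source="user upload",
                   owner="user-42", purpose="account avatar"))
print(inv.describe("user-42"))   # answer an access request
inv.erase("user-42")             # honour a deletion request
```

Keeping ownership and purpose alongside every stored item makes it far easier to answer access requests and honour deletions, which is precisely where Clearview AI is alleged to have fallen short.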


By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.