10-03-2021 | | By Robin Mitchell
Recent reports from Singapore reveal how the police are utilising contact tracing apps in attempts to identify suspects in serious crimes. What are COVID tracing apps, how do they work, and why are they a terrible idea?
The COVID-19 pandemic has done nothing but make headlines since the first recorded death in the UK back in 2020, and a year later vaccines are being released in the hope that the pandemic will be over in the coming months. But before a vaccine was available, the only real way to protect the vulnerable was to prevent transmission of the virus.
While many will disagree (whether through personal experience or scaremongering), the virus is mostly harmless to the vast majority of the population. But the danger of the virus lies precisely in its ability to do harm unseen: since many carriers show no symptoms at all, the virus can rapidly tear through a population and attack the vulnerable without warning.
Since the virus is nearly impossible to detect in the healthy population, contact tracing apps became of interest. Such apps use the Bluetooth and Wi-Fi radios in smartphones to estimate the distance between nearby devices. If two users, both running the app, remain in close proximity for too long, their devices exchange unique IDs, which are then either submitted to a global database (centralised) or kept locally (decentralised).
If a person is found to have COVID, their phone publicly announces this, and any device holding that phone's UID alerts its own user to self-isolate. The idea is that tracing whom people have had contact with can help prevent the spread of the virus.
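The exchange-and-match idea can be sketched in a few lines of Python. This is a hypothetical, highly simplified illustration of the decentralised scheme described above (real deployments, such as the Apple/Google Exposure Notification framework, use rotating cryptographic identifiers rather than a single fixed UID):

```python
import secrets

class TracingDevice:
    """Hypothetical sketch of one handset running a tracing app."""

    def __init__(self):
        self.uid = secrets.token_hex(16)   # this device's unique ID
        self.seen_uids = set()             # IDs collected from nearby devices

    def encounter(self, other: "TracingDevice") -> None:
        """When two devices stay in close proximity for too long,
        each records the other's UID locally (decentralised storage)."""
        self.seen_uids.add(other.uid)
        other.seen_uids.add(self.uid)

    def check_exposure(self, published_infected_uids: set) -> bool:
        """Compare locally stored UIDs against the publicly published
        list of infected users' UIDs; any overlap means an alert."""
        return bool(self.seen_uids & published_infected_uids)

alice, bob, carol = TracingDevice(), TracingDevice(), TracingDevice()
alice.encounter(bob)                    # Alice and Bob were in contact

infected = {bob.uid}                    # Bob later reports infection
print(alice.check_exposure(infected))   # True  -> Alice told to self-isolate
print(carol.check_exposure(infected))   # False -> Carol is unaffected
```

Note that in this model the contact lists never leave the handsets; only the infected user's own UID is published.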
The reason why tracing apps have struggled with adoption lies in the name: tracing. Generally speaking, most people value their privacy, and the idea of being tracked by an app or a government usually brings about feelings of unease.
It was feared that if large portions of the population used such tracing apps, data such as GPS location could be recorded to track each individual. Many of the apps developed stated that they would not store this data, but even then there was a major disagreement over whether to use a centralised or a decentralised system.
A centralised tracing system stores everyone's details on a single server accessible to the system's developers, whereas a decentralised system keeps all data on-device, making it public only when the user chooses to submit their COVID infection and UID. The use of a centralised system caused concern when the UK government said that medical professionals would use the gathered data to work on pandemic models; once one department has access to such tracking data, it is very easy for that data to be shared with others.
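The privacy difference between the two models comes down to where the contact log lives. A hypothetical sketch (class and method names are illustrative, not taken from any real tracing app):

```python
class CentralServer:
    """Centralised model: every contact event is uploaded as it occurs,
    so whoever runs the server can reconstruct anyone's contact graph."""

    def __init__(self):
        self.contact_log = []          # (uid_a, uid_b) pairs for ALL users

    def report_contact(self, uid_a: str, uid_b: str) -> None:
        self.contact_log.append((uid_a, uid_b))


class DecentralisedDevice:
    """Decentralised model: contact history never leaves the handset;
    only the user's own UID is published, and only if they opt in
    after a positive test."""

    def __init__(self, uid: str):
        self.uid = uid
        self.local_contacts = []       # stays on-device

    def record_contact(self, other_uid: str) -> None:
        self.local_contacts.append(other_uid)

    def publish_infection(self) -> str:
        return self.uid                # the only data that ever goes public


server = CentralServer()
server.report_contact("alice", "bob")
# Centralised: the operator now knows Alice and Bob met, infection or not.

device = DecentralisedDevice("carol")
device.record_contact("dave")
# Decentralised: only Carol's phone knows she met Dave, unless Carol
# tests positive and chooses to publish her own UID.
```

The concern raised above follows directly from the first class: a centralised operator holds the full contact graph of every user, whether or not anyone was ever infected.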
Recently, reports from Singapore revealed that the police have been using COVID tracing apps to track and find people in criminal cases. The tracing app used in Singapore, called TraceTogether, was stated not to take GPS readings, not to violate personal rights, and to be used only for COVID tracing.
But as soon as the police realised the system's capabilities, it didn't take long for them to be abused. A murder inquiry in Singapore led the police to use the tracing app to find those involved; notably, while the victim had the tracing app installed, the suspect did not.
A one-dimensional argument would say that such tracing apps could be useful for finding the perpetrators of crimes. But apply basic logic to that argument and the use of tracing apps in criminal cases quickly becomes a slippery slope: the inaccuracy of tracing apps, combined with coincidence, could see passers-by of a crime scene marked as potential suspects despite having nothing to do with the situation.
To make matters worse, false suspects can be fingerprinted, identified, and saved to police databases, making them more likely to be targeted by police in the future. Furthermore, the police can form biases against people and, as a result, cause grief for those under the police's microscope, all because they happened to walk by and two tracing apps exchanged keys.
If this example of extreme abuse teaches anything, it's that you can always count on someone to misuse data. It also demonstrates how current society may not be ready for smart cities utilising thousands of sensors and tracking systems.
For smart cities to become popular, there needs to be an element of trust whereby those living in such cities know that their personal data and privacy are protected. Smart street lights can be good for saving power, but authorities should not use authorities to track an individual, even if they can. Simultaneously, just because an AI can change all the traffic lights on-the-fly to save you minutes on a car journey does not mean that some government can look at the results of the traffic data to find out where you drove and why.