Amid the Covid-19 pandemic, whose implications we are all facing, enter the ‘Test and Trace’ app, introduced in the hope of controlling and containing the virus. With it, however, come questions of both privacy and data protection, of what these mean for the future, and of whether a dangerous precedent of surveillance is being set. In addition, the shift to working from home brings its own privacy problems, with apps such as ‘Zoom’ and ‘Microsoft Teams’ being used on an unprecedented scale to prevent a total halt of business and education. The question for both of these problems is this – to what extent is the user’s data protected, and does that protection comply with the General Data Protection Regulation (GDPR)?
Test and Trace: The Facts
When two phones that both have the app installed come close to each other, they connect via Bluetooth. If one of those users later records a positive coronavirus test in the app, the other user is alerted. There is also a check-in option using QR codes at many venues, such as pubs, cafés, and restaurants, allowing you to be notified if someone at that venue later receives a positive test. Both of these alerts, however, will only happen if the person who has tested positive for Covid-19 has input the result into the app.
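The decentralised design described above can be illustrated in miniature. The sketch below is a deliberately simplified, hypothetical model – the real app uses the Apple/Google Exposure Notification framework with rotating cryptographic keys, and all class and method names here are invented for illustration. The key point it demonstrates is that phones exchange anonymous identifiers directly, only a positive user’s own identifiers are ever published, and the exposure match is computed locally on each handset rather than on a central server.

```python
import secrets

class Phone:
    """Hypothetical, simplified model of decentralised exposure
    notification. Not the actual NHS app implementation."""

    def __init__(self):
        # A random anonymous identifier; the real system rotates
        # these frequently so users cannot be tracked over time.
        self.my_id = secrets.token_hex(8)
        # Identifiers observed from nearby phones via Bluetooth,
        # stored only on this handset.
        self.seen_ids = set()

    def bluetooth_contact(self, other):
        # Each phone records the other's identifier; no central
        # server learns who met whom.
        self.seen_ids.add(other.my_id)
        other.seen_ids.add(self.my_id)

    def report_positive(self):
        # Only the user's own anonymous identifier is published.
        return [self.my_id]

    def check_exposure(self, published_ids):
        # The match is computed locally on the handset.
        return bool(self.seen_ids & set(published_ids))
```

In this model, a user who was never near the positive case has no matching identifiers, so their check returns no exposure – which is why the rejected centralised design, where a server held the full contact graph, raised sharper privacy concerns.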
When the app was in its developmental stages, it was not designed to work in this way. The European Union did supply guidance on the app’s compliance with the GDPR, but the United Kingdom originally wanted to establish a centralised database of contacts, a design rejected by Apple and Google, whose devices have privacy settings ingrained into the hardware.
So, what are the problems with privacy?
According to a Sky News article from July, the Government failed to conduct a full data privacy impact assessment before the app was released in May. A more recent assessment is now available on the gov.uk website, although it still does not cover every aspect of the app. The section on ‘necessity and proportionality’ is interesting, as it addresses the aforementioned problems with Apple and Google, explaining that the original design was dropped due to functionality problems: the old version of the app failed to recognise iPhones because of the phones’ privacy settings, and so it had to be changed to the system we experience now.
Even members of the Government have petitioned the Information Commissioner, who is responsible for overseeing data protection and prosecuting breaches, criticising the failure to hold the Government accountable and asking the Information Commissioner’s Office (ICO) to consider fining the Government ‘if it fails to adhere to the standards the ICO is responsible for upholding’. This is perhaps reassuring, as it suggests that where the Government may have bypassed some data protection laws, this was not a deliberate intention, and that it hopes to rectify any problems once notified of them.
There have been additional problems: stored data has been used by those in the hospitality industry to send unsolicited messages, and contractors for the app have shared some users’ data on social media. This suggests that the vast amount of data being collected is not being protected as adequately as it should be, and that further restrictions must be put in place. Given that two of the key principles of the GDPR are transparency and confidentiality, the lack of clarity over how the collected data is stored and processed leaves the public with little confidence as to exactly what happens when they check in to a venue through the app.
Problems Posed by Remote Working
The problems with remote working contrast with those posed by Test and Trace: with the app, consent to data processing is given by the act of downloading it. During ‘lockdown’ in particular, apps such as ‘Zoom’ were subject to cyber-attacks referred to as ‘Zoom-bombings’ – instances where uninvited users joined private meetings. This is a particular problem where those meetings were between members of a business, as it risks confidential information being leaked.
The educational sector has also been hit by these events, with lectures taking place over the app and people unlawfully joining to direct hurtful, inappropriate language at students and staff. Given that Zoom itself has purported to offer ‘end to end encryption’, this raises the question of how such intrusions are possible. Where the system is hacked, users’ private data has been accessed, and so their data has been collected. Although Zoom is an American company, it is still subject to the GDPR where the app affects EU and UK nationals, meaning Zoom could become liable if the problem persists.
Conclusion
In conclusion, the inherent data protection problem with the Test and Trace app is primarily the lack of clarity about how it operates. The less the public know about it, the greater the risk that breaches of data protection law are simply being hidden from view. As it is not clear exactly what happens to the massive amount of data being received, it is impossible to say for certain whether the app’s operation is in itself a breach of data protection law.
With regard to remote working, many confidentiality problems arise, but there is no option other than video equipment to keep businesses, schools and universities continuing in the wake of Covid-19. As a result, far stronger protections need to be in place, not just for the user’s privacy but for the privacy of any clients. Where clients of law firms benefit from legal professional privilege, that privilege may not survive continuing hacks, so the question that needs to be answered is how we can best reduce this risk.