Data Blast: UK ICO gives COVID-19 guidance, Cathay Pacific fined, Facebook suspends dating app, French Court prohibits facial recognition in schools…
ICO issues guidance on data protection implications of COVID-19
The UK Information Commissioner has issued guidance on the data protection implications of responding to the COVID-19 coronavirus pandemic, answering important questions for organisations, businesses and employers.
The guidance is aimed at public bodies and health practitioners, and whilst helpful to everyone, we urge some caution in assuming your business and interests can override compliance with the data protection laws. The ICO guidance covers the following topics:
- Data protection compliance: The ICO will not seek to penalise organisations that divert resources away from usual data privacy compliance measures in responding to the pandemic. However, we stress the phrase ‘responding to the pandemic’ – diverting resources for an ordinary business need may not be seen as an acceptable reason. Furthermore, while statutory timeframes will not be extended, the ICO makes clear that people making information rights requests can expect to experience delays, and justified missed deadlines may therefore be treated with some leniency.
- Contacting individuals: Data protection law does not prohibit the Government or the National Health Service (NHS) from sending public health messages, as these are not for direct marketing purposes. Public bodies may also require additional collection and sharing of personal data in order to protect against public health threats.
- Staff working from home: Data protection law does not prohibit working from home or on different devices. However, businesses are warned to ensure that proper security measures have been put in place.
- Telling employees that co-workers may have contracted the virus: Staff should be informed about possible cases in the workplace; however, employers should not provide any more information than is necessary, and should likely refrain from naming potentially exposed individuals.
- Collecting health data from employees and visitors: While you are obliged to safeguard employee health, this likely does not require gathering large amounts of their personal information. However, it is reasonable to ask whether they have visited, or plan to visit, particular countries, or whether they are experiencing symptoms.
- Sharing information with the authorities: Yes, this is allowable in relation to specific individuals, but only where necessary.
You can read the full ICO guidance via their website https://ico.org.uk/.
ICO announces fine for Cathay Pacific Airways
On March 4th, the ICO announced that it had fined Cathay Pacific Airways £500,000 as a result of a data breach that exposed the personal data of 9.4 million people.
The breach was first suspected to have taken place in March 2018, when Cathay Pacific’s database was hacked, and the breach was confirmed in May of that year.
The data breach occurred between October 2014 and May 2018, and exposed the names, dates of birth, passport details, postal and email addresses, phone numbers and travel history of passengers, as well as 430 credit card numbers, only 27 of which were active.
The ICO’s subsequent investigation found that Cathay Pacific’s database was compromised by malware, which automatically harvested the passenger data. The investigation also found several other security flaws, including non-password protected file backups, an out-of-support operating system, and insufficient antivirus protection.
The £500,000 fine is the maximum penalty that the ICO may issue under the Data Protection Act 1998, which has since been replaced by the GDPR and the Data Protection Act 2018. Cathay Pacific was fortunate that, given the timing of the breach, it was investigated and ultimately fined under the old law. The new law, which applies to data breaches occurring after May 25th 2018, affords the ICO the power to fine companies £17 million or 4% of global turnover, whichever is greater.
This new fining power was displayed last summer when the ICO announced plans to fine British Airways £183 million (initially covered here). However, the ICO has extended the time within which it can issue a fine to British Airways until March 31st of this year, a situation which we will be monitoring closely.
Facebook suspends dating feature after Irish DPC raid
On February 12th, it was announced that Facebook Ireland had suspended plans to roll-out a dating feature in the EU, after the Irish Data Protection Commission (DPC) raided its Dublin offices.
Facebook Dating, a feature already available in the US, allows users who are over the age of 18 to create separate dating profiles on their Facebook account. However, the planned roll-out sparked concerns at the DPC.
A DPC statement said ‘we were very concerned that this was the first we had heard from Facebook Ireland about this new feature, considering that it was their intention to roll it out [tomorrow, February 13th]….our concerns were further compounded by the fact that no information or documentation was provided to us on February 3rd in relation to the Data Protection Impact Assessment (DPIA) or the decision-making processes that were undertaken by Facebook Ireland.’
The GDPR requires companies to carry out a DPIA for any new processing projects in order to minimize data protection risks. After failing to provide the requested DPIA documentation, DPC officers conducted an inspection of Facebook Ireland’s offices on February 10th, and the roll-out has now been postponed.
French court prohibits proposed use of facial recognition in schools
On February 27th, a French court cancelled a series of tests using facial recognition at the entrance to two high schools, finding the tests to be illegal and in violation of the GDPR, marking the first time that a French court has applied the GDPR to facial recognition technology (FRT).
The proposed use of FRT was approved in December 2018 by the PACA regional authority, in order to grant or refuse access to students at two high schools in Nice and Marseilles. The FRT was put in place in February of 2019, supposedly on the basis that the students had provided their consent, but was subsequently challenged by a group of concerned parents.
In October 2019, the French data protection authority (CNIL) published a notice expressing its concerns over the FRT deployment, particularly using such intrusive biometric mechanisms in relation to minors. The notice reiterated that the GDPR requires that measures be necessary and proportionate, and that school security and access could likely be achieved by less intrusive means. CNIL concluded that the FRT program was contrary to the GDPR principles of proportionality and data minimisation.
The February 27th decision confirmed CNIL’s position, overturning the PACA region’s decision and banning the proposed FRT scheme on three main grounds. First, in relation to competence, it was held that the schools’ heads, and not the Region, are tasked with school safety, and therefore the decision was ultra vires. Second, in assessing whether student consent was a proper legal basis, the court held that receipt of consent by simple signature, from a student or their guardian, is insufficient for GDPR purposes. Specifically, given that the students are under the direct authority of their school heads, the court did not believe that there was a sufficient guarantee that consent was given in a free, specific and informed way.
Lastly, regarding proportionality, the court reaffirmed the need to assess processing proportionality strictly, following CNIL’s ‘less intrusive means’ requirement. The court held that the Region had not shown why a badge/ID card approach, coupled with video surveillance, was insufficient in achieving the purpose of processing for access and control of the schools.
The court concluded that the FRT test violated Article 9 of the GDPR, and did not fall under any of the exceptions listed in paragraph 2 of that article. As a result, it was held that the decision to authorise the test must be invalidated.
The decision follows on from the European Commission’s digital chief’s statement regarding how the use of FRT is problematic for the purposes of obtaining proper consent, which we covered here.
For more information please contact Partner, James Tumbridge at email@example.com.