See below for the latest Data Blast from our legal team: Cybersecurity a growing concern amid high profile breaches; French court enforces limits on Covid monitoring and reverses regulator’s attempt to ‘make new law’ on cookies; Uber drivers use GDPR to seek access to profiles used to allocate rider requests; Still no full risk assessment for UK Test & Trace program...

UK, US and Canadian universities suffer data breach from hackers

On July 23rd, the BBC reported at least 10 universities in the UK, US and Canada had student and alumni data stolen during a hack of Blackbaud, a provider of education administration and finance management software, in May of this year.

Blackbaud has received criticism for failing to disclose the hack until now, and also for paying the hackers an undisclosed ransom. While most of the stolen data pertained to alumni who had been contacted regarding donations, some related to current students and staff. The institutions impacted include Oxford University, the Rhode Island School of Design in the US, and Ambrose University in Alberta, Canada.

Blackbaud declined to identify all of its institutional customers which may have been affected by the hacking. A statement on the company’s website confirmed that ‘in May of 2020, we discovered and stopped a ransomware attack. Prior to our locking the cyber-criminal out, the cyber-criminal removed a copy of a subset of data from our self-hosted environment.’ The stolen data included donation history, contact numbers and event attendances, but does not appear to have included credit card or payment details.

Under the GDPR, companies must report significant data breaches within 72 hours of learning of the breach, or face potential fines. Whilst the breach occurred and was dealt with by Blackbaud in May, the UK ICO was only informed in mid-July, when those potentially affected were also notified by email. An ICO spokesperson has stated that ‘Blackbaud has reported an incident affecting multiple data controllers to the ICO. We will be making enquiries to both Blackbaud and the respective controllers, and encourage all affected controllers to evaluate whether they need to report the incident to the ICO individually.’

Twitter users suffer security breach – hackers take control of some high profile accounts

Barack Obama and Kanye West were among the high profile users of Twitter to have control of their accounts briefly taken over by hackers this month. Whilst the precise details of the security breach remain under investigation, Twitter has said it appeared the hackers used ‘social engineering’ tactics to take advantage of employees with administrative privileges over user accounts; possible methods used include ‘phishing emails or phone calls’ in order to gain access.

The attack was played out in real time on Twitter, as the attackers issued tweets from the A-list accounts calling for readers to send bitcoins to illegitimate addresses; one bitcoin exchange reported having blocked over 1000 attempted transactions at the time. Whilst control over the accounts appears to have been regained promptly, concerns remain over the nature of data that the hackers may have been able to access, including direct messages between the account owners and other Twitter users.

Cybersecurity has been thrust into the spotlight since much of the world adopted remote working in response to Covid-19, with a marked rise in attempted hacking by those seeking to take advantage of security protections that are less robust than those organisations are able to implement in their centralised workplaces. Organisations remain responsible for ensuring the security of their networks whilst their employees work remotely, and increased training, in particular, is highly recommended in order to help employees recognise illicit attempts to gain access to their work environment.

Further temperature testing initiatives halted in France

Both public and private temperature testing responses to Covid-19 have been held to be illegal under the GDPR in France, serving as a reminder that data protection law continues to apply despite the pandemic and that legal advice should be sought before intrusive measures are deployed.

As we previously reported, employers wishing to use temperature testing to secure their workplaces are strictly limited in how such testing can be applied; only non-contact testing, such as using infrared thermometers, can be used, and results cannot be recorded (in essence, the testing must remain outside the scope of data processing which engages GDPR protections). It appears that not all employers have taken on board the guidance from the French data protection authority, the CNIL; a French worker recently tweeted about systematic temperature recording by their employer, and the CNIL replied with a reminder that such recording is not authorised by law in France.

In response to Covid-19, the government of the municipality of Lisses installed thermal cameras at an entrance to the municipal offices and at the entrances to local schools. The practice was challenged unsuccessfully in the lower courts, before arriving at the Conseil d’État, where campaigners succeeded in having the school thermal cameras declared illegal.

The court held that the mandatory nature of the temperature testing, in the absence of demonstrable consent, was not permissible, as it constituted processing of health data, the consequence of which was the exclusion from school of any student testing above a certain temperature. The municipality attempted to seek parental consent for the testing, but could not demonstrate consent sufficient to satisfy the court. The court held that the cameras installed at the municipal offices could remain, as they were not mandatory and access to the building could easily be gained without passing by them.

French regulator’s position on ‘cookie walls’ annulled by French Administrative Court

In draft guidance published by the CNIL, the regulator expressed the view that the legal requirement to obtain user consent for the use of non-essential cookies by websites meant that so-called ‘cookie walls’, which require acceptance of such cookies in order to access a website, were not permitted.

The Conseil d’État, however, held that the CNIL could not effectively seek to create binding law by way of a ‘soft law’ instrument such as its draft guidance. The CNIL draft guidance accorded with the European Data Protection Board (EDPB) guidelines on cookie consent under the GDPR (see our earlier report here), which state that cookie walls are not permissible. However, EDPB guidelines do not have the force of law and are not binding upon national regulators; no judicial determination to date has specifically considered whether cookie walls may be legitimate in some circumstances.

Whether cookie walls are permissible has been a contentious question, in particular since the introduction of the GDPR; the view against cookie walls being that consent cannot be ‘freely given’ if acceptance of cookies is a pre-condition for accessing a service such as a website. The countervailing view is that many websites, including many online news publications, offer unpaid access to their content and rely upon ‘targeted’ advertising in order to support the free access they offer. If it is determined that cookie walls are in fact impermissible under the current laws, we may see fewer publications offering online access to those who do not have a paid subscription.

Gig economy in the spotlight as drivers’ union commences legal action against Uber

On July 20th, the UK-based App Drivers and Couriers Union (the Union) launched legal action against Uber in Amsterdam, seeking to uncover the algorithm used by Uber to facilitate its driver workflow.

In addition to the algorithm, the suit seeks the considerable data the company has collected on its drivers, and details of how this data is used to make management decisions. The Union claims that transparency is essential in order to determine whether Uber discriminates between drivers, and that the information sought will allow drivers to bargain collectively in a way that is not currently possible. The Union is also seeking to learn how automated decision-making is used by Uber in managing its drivers.

The number of people working for online driving and delivery apps, including Uber and Deliveroo, doubled between 2016 and 2019, and now accounts for roughly 10% of the workforce. The suit claims that Uber employs tags on particular drivers’ profiles, for example ‘late arrival’ or ‘inappropriate behaviour,’ and that such tags impact the quality of rides allocated to certain drivers. The Union has complained that drivers have not been provided with information regarding how Uber manages access to work, and they wish to know the specifics of how the tagging regime affects drivers.

Two Union members claim that Uber has failed to meet its GDPR obligations in response to the drivers’ information access requests. Specifically, the drivers have sought information regarding their profiles, comments made by Uber staff, and how dozens of categories of their data have been processed. The claim states that ‘Uber collects large amounts of data that provide a very penetrating picture of, among other things, the use of the Uber driver app, the location and driving behaviour of the driver, communication with customers and the Uber support department,’ and that in both the UK and the Netherlands, the GDPR grants app drivers the right of access to such ‘profiling’ data.

The claim comes at a critical time for Uber, as the company pursues a legal challenge in the UK Supreme Court against a ruling that its drivers should be treated as workers, as opposed to self-employed contractors, and are therefore entitled to legal protections including minimum wage and paid holiday entitlement.

UK Government admits that mandatory risk assessment has not been carried out for the NHS Test and Trace programme

On July 20th, the UK Government confirmed that a mandatory risk assessment of the handling of health data in relation to the NHS Test and Trace system, which went live on May 28th, has not been carried out.

The government confirmation was in response to a threat of legal action by Open Rights Group (ORG), a privacy and free speech organisation, who two weeks earlier published a letter seeking publication of a data protection impact assessment (DPIA) regarding Test and Trace.

The GDPR requires that a DPIA assessing the potential privacy risks associated with collecting and processing special category data (in this instance health data) be completed prior to collection. If the DPIA indicates that the processing would result in a high risk to the data subjects concerned in the absence of mitigating measures, Public Health England must consult the UK Information Commissioner’s Office (ICO) prior to initiating the processing. Risks can include everything from the threat of hacking to staff accessing data without proper authorisation.

The government statement sets out that a number of separate DPIAs were conducted for certain aspects of Test and Trace, and asserts that ‘the absence of a DPIA for every aspect of the programme cannot be and should not be equated with a failure to ensure that the protection of personal data has been an important part of the programme’s design and implementation.’ The statement appears to concede, though, that DPIAs should have been carried out to address data processing by external companies such as Amazon Web Services, upon which Test and Trace relies.

As we previously discussed here, Test and Trace asks users who have tested positive for Covid-19 to identify those with whom they have recently been in contact. Those contacts are then messaged and told to self-isolate, in case they have been exposed to the virus. Those who test positive are asked to volunteer their NHS number, email, date of birth, telephone number and symptoms, as well as information pertaining to their contacts, and roughly 35,000 have provided this information thus far. While this information is useful in containing the spread of the virus, a DPIA would help limit the risks involved in divulging it.

The government’s announcement did not provide a timetable for when a comprehensive DPIA would be completed. ORG warned that it expects the DPIA to cover ‘a full list of purposes, clarity of the involvement of third parties, justifications for the data processing and retention periods, clear mechanisms for individuals to assert their rights and mitigation steps for any risks.’

For more information please contact Partner, James Tumbridge at jtumbridge@vennershipley.co.uk.