See below for the latest Data Blast from our Legal team: Guidance on fines, consumer protection & data, UK Clearview fine, malicious apps are unwelcome in the UK, plus US police are interested in your cars…
Guidelines on the Calculation of Fines under the GDPR
The European Data Protection Board (‘EDPB’) has produced new guidelines on the calculation of administrative fines under the GDPR, which can be found here.
In paragraph 17 of the guidance, the EDPB provides an overview of the methodology as follows:
- Establish the infringement that is leading to a fine.
- Set a starting point based on the following:
  - the classification under Art. 83(4)-(6) GDPR;
  - the seriousness of the infringement pursuant to Art. 83(2)(a), (b) and (g) GDPR; and
  - the turnover of the undertaking, as a relevant element in imposing an effective, dissuasive and proportionate fine under Art. 83(1) GDPR.
- Evaluate the aggravating and mitigating circumstances, including the past and present behaviour of the controller/processor, and increase or reduce the fine accordingly.
- Ensure that the proposed fine does not exceed any legal maximum.
- Analyse whether the final amount of the calculated fine meets the requirements of effectiveness, dissuasiveness and proportionality under Art. 83(1) GDPR, and increase or decrease the fine accordingly.
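The arithmetic core of the methodology above can be sketched in a few lines of code. This is a simplified illustration only: the figures, parameter names and the single `adjustment_factor` are all hypothetical, and the EDPB's final effectiveness and proportionality check (step 5) is a qualitative judgment by the DPA that cannot be reduced to arithmetic.

```python
def calculate_fine(starting_amount, adjustment_factor, legal_maximum):
    """Sketch of the EDPB fine methodology (hypothetical figures).

    starting_amount   -- base figure derived from the classification,
                         seriousness and turnover of the undertaking
    adjustment_factor -- multiplier for aggravating (>1) or
                         mitigating (<1) circumstances
    legal_maximum     -- statutory cap under Art. 83(4)-(6) GDPR
    """
    # Step 3: adjust for aggravating/mitigating circumstances
    adjusted = starting_amount * adjustment_factor
    # Step 4: the proposed fine must not exceed the legal maximum
    capped = min(adjusted, legal_maximum)
    # Step 5 (effectiveness/proportionality) is a qualitative
    # assessment by the DPA and is not modelled here
    return capped

# Illustrative only: EUR 2m starting point, 1.5x uplift, EUR 20m cap
print(calculate_fine(2_000_000, 1.5, 20_000_000))  # → 3000000.0
```

The point of the sketch is simply that the cap in step 4 is applied after the aggravation/mitigation adjustment, so a heavily aggravated fine is still truncated at the statutory maximum.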
Andrea Jelinek, Chair of the EDPB, has explained that the purpose of the guidance is to ‘boost further harmonisation and transparency of the fining practice of DPAs,’ but added the caveat that ‘the individual circumstances of a case must always be a determining factor and DPAs have an important role in ensuring that each fine is effective, proportionate and dissuasive.’
The EDPB’s guidance is open to public consultation, which closes on June 27th 2022.
Consumer Protection Societies and Data
The CJEU has decided, under Art. 80(2) GDPR, that national legislation may allow consumer protection associations to bring legal proceedings for GDPR violations, even when they have not been specifically authorised for this purpose.
The Federal Union of Consumer Organisations and Associations, Germany (the ‘FUCO’) brought an action against Meta Platforms Ireland Limited (which runs Facebook for the entire EU). Facebook has an App Centre which allows users to access free third-party content (usually games). When a user accesses this content, their personal data, together with permission to publish on their behalf, is passed to the third party. The FUCO asked the Regional Court in Berlin (the ‘LG Berlin’) for an injunction, claiming that the information provided by the games in the App Centre is unfair, and that making the ability to post information on behalf of the Facebook user a general condition of playing the game operates to the user’s disadvantage. The LG Berlin ruled against Meta, which then appealed to the Federal Court of Justice (the ‘BGH’); the BGH referred a question to the CJEU.
The BGH asked the CJEU whether Arts. 80(1) and (2) and 84(1) GDPR preclude national rules that allow associations, entities and chambers entitled under national law to bring civil proceedings for breaches of the GDPR (independently of the infringement of specific rights of individual data subjects, and without being mandated to do so by a data subject) on the basis of the prohibition of unfair commercial practices or a breach of consumer protection law.
The CJEU found that interpretation of the Articles mentioned was not necessary to answer the question as asked, with the exception of Art. 80(2). The Court found that Art. 80(2) gives Member States discretion as to its implementation, stating:
‘In order for it to be possible to proceed with the representative action without a mandate provided for in that provision, Member States must make use of the option made available to them by that provision to provide in their national law for that mode of representation of data subjects.’
Germany did not need to exercise this discretion afresh, as national law already allowed consumer protection associations to bring legal proceedings against a person allegedly responsible for an infringement of the laws protecting personal data. That existing law was treated as transposing the EU law requirements.
The CJEU considered that the FUCO was a consumer protection organisation falling within the public interest objective of safeguarding the rights of consumers. The Court also found that the FUCO cannot be ‘required to carry out a prior individual identification of the person specifically concerned by data processing that is allegedly contrary to the provisions of the GDPR.’ This is because the GDPR defines personal data as relating not only to an identified person, but also to an identifiable person. Further, a representative action can be initiated when a representative organisation considers that an infringement has taken place – it does not need to prove harm to the actual data subject.
The CJEU concluded that it was beneficial for public policy to allow consumer protection associations such as the FUCO to bring such cases, as it strengthens the rights of data subjects, and found that a representative action might be more beneficial (and more efficient) than individuals attempting to exercise their rights on a one-by-one basis.
The ICO Takes a Dim View of Clearview
After proposing a fine of GBP 17m, the UK ICO has come to a final decision and fined Clearview AI GBP 7.5m. The fine is for using images of people in the UK and elsewhere that were collected from the internet (especially social media platforms) to create a global online database that could be used for facial recognition.
The background was that Clearview collected more than 20 billion images of people’s faces (along with other data) from publicly available information online for its database, but the data subjects were not informed that their images were being collected for this use. Clearview uses its image database to allow clients (including law enforcement) to upload images to the app, which are then checked against all the images in the database. The app provides a list of images with similar characteristics, and a link to the website from which the additional images were scraped (be that Facebook, Twitter etc.).
The UK Information Commissioner said:
‘People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement. Working with colleagues around the world helped us take this action and protect people from such intrusive activity.’
After a joint investigation with the Office of the Australian Information Commissioner, the UK ICO found the following breaches by Clearview:
- Failing to use the information of people in the UK in a way that is fair and transparent, given that individuals are not made aware or would not reasonably expect their personal data to be used in this way;
- Failing to have a lawful reason for collecting people’s information;
- Failing to have a process in place to stop the data being retained indefinitely;
- Failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR);
- Asking for additional personal information, including photos, when members of the public asked whether they were on its database. This may have acted as a disincentive to individuals wishing to object to their data being collected and used.
The decisions against Clearview have been coming for some time; the interesting part is the clear growth in cooperation between regulators. It is likely that we will see increased international cooperation between data protection authorities as more and more data is used on an international and even global scale by multinational companies.
Driverless Cars – A Cache of Personal Data
US police increasingly want to make use of data from driverless cars (also known as ‘autonomous vehicles’ or ‘AVs’). As we warned in a previous data blast, there are concerns about how the police and other law enforcement agencies might use data-driven technology and whether there is sufficient oversight.
Following a public records request from Motherboard (an online tech magazine owned by VICE), the San Francisco Police Department released a training document with regard to AVs. The document can be found here. The most relevant section when it comes to data privacy concerns is under the heading ‘Investigations’:
- Autonomous vehicles are recording their surroundings continuously and have the potential to help with investigative leads
- Information will be sent in how to access this potential evidence (Investigations has already done this several times)
The fact that the police are capable of using such data (and in some cases already are doing so) has led to concerns from a number of organisations. Adam Schwartz, senior staff attorney at the Electronic Frontier Foundation, said that cars in general are troves of personal consumer data, but AVs have even more of that data from capturing the details of the world around them: ‘So when we see any police department identify AVs as a new source of evidence, that’s very concerning.’
Chris Gilliard, Visiting Research Fellow at Harvard Kennedy School Shorenstein Center said that ‘As companies continue to make public roadways their testing grounds for these vehicles, everyone should understand them for what they are—rolling surveillance devices that expand existing widespread spying technologies. Law enforcement agencies already have access to automated license plate readers, geo-fence warrants, Ring Doorbell footage, as well as the ability to purchase location data. This practice will extend the reach of an already pervasive web of surveillance.’
San Francisco’s is not the first police department to use AVs as mobile surveillance cameras.
Vehicles, both AVs and regular vehicles (with parking-assist cameras etc.), are now part of the ‘Internet of Things.’ The question then becomes at what point law enforcement will be able to use the data produced by such vehicles, whether by sending their own vehicles around towns and cities or by confiscating footage gathered by private vehicles. At present it is somewhat telling that all the experiments involving AVs and the police are being conducted in the USA – in Europe there would be serious GDPR concerns with such activities. Time will tell if that position will change.
Similar concerns about lack of oversight with regard to law enforcement’s use of data have been applied to the facial recognition software designed by Clearview AI.
UK Government to Crack Down on Malicious Apps
The UK Department for Digital, Culture, Media and Sport (DCMS) has asked the tech industry for views on measures to make the app market safer and more secure. The DCMS’s proposed Code of Practice will set minimum privacy requirements for app store operators. This is intended to safeguard the privacy and safety of app users. The consultation will run until June 29th 2022.
The consultation was in response to a report by the National Cyber Security Centre (NCSC) showing that consumers’ data and money are at risk from fraudulent apps containing malware, or from poorly developed apps which can be easily compromised by hackers exploiting security weaknesses. One proposal for minimum standards is that apps would have to explain why they need access to users’ contacts or their location.
Speaking about the consultation, Cyber Security Minister Julia Lopez said:
‘[N]o app should put our money and data at risk. That’s why the Government is taking action to ensure app stores and developers raise their security standards and better protect UK consumers in the digital age.’
The NCSC report states that the Government’s proposed Code of Practice will reduce the chance of customers downloading malicious apps onto their devices. Ian Levy, Technical Director of the NCSC said:
‘Our threat report shows there is more for app stores to do, with cyber criminals currently using weaknesses in app stores on all types of connected devices to cause harm.
I support the proposed Code of Practice, which demonstrates the UK’s continued intent to fix systemic cybersecurity issues.’
A new product security law
As well as whatever comes from this consultation, there is a new product security law making its way through Parliament. This will place new requirements on manufacturers, importers and distributors of consumer tech, including banning easy-to-guess default passwords, requiring manufacturers to be transparent about the length of time that products will receive security updates, and requirements to disclose vulnerabilities.
Separately, in the Queen’s Speech in May 2022, the Government trailed a Data Reform Bill, promising to ‘take advantage of the benefits of Brexit to create a world class data rights regime.’ However, to date no text is available. This unseen bill promises to ‘modernise’ the ICO and contain legislation that is ‘pro-growth’ and ‘reduces burdens on businesses.’ To our understanding, this is not going to be a ‘world class data rights regime’ when it comes to individual data subjects.