Data Blast: UK Treasury missteps on biometric data, breach reports are up, fines for text messages, and Spain nullifies law

HMRC to delete biometric data

Her Majesty’s Revenue and Customs (HMRC) has announced its intention to delete five million voice recordings, after receiving an enforcement notice from the ICO.

After an investigation into HMRC’s Voice ID service, the ICO found that HMRC had failed to provide customers with sufficient information on how their biometric data (in this case, voice recordings) would be processed, and had failed to give them the opportunity to give or withhold consent, in violation of the General Data Protection Regulation (GDPR). The ICO issued an enforcement notice on April 4th 2019, requiring HMRC to delete all voice recordings under the Voice ID service for which HMRC had not received explicit consent. Under the GDPR, biometric data is considered special category data (along with data such as ethnic origin, political opinions and health information); these types of data are subject to stricter conditions for processing.

HMRC uses the Voice ID system to make it easier for callers to pass security barriers in order to discuss their accounts with HMRC advisors and to prevent others from accessing those accounts. However, individual Voice IDs were created prior to the implementation of the GDPR, and the required consents were not gathered. As a result, HMRC will delete roughly five million Voice IDs for which explicit consent was not received, and which have not been used since they were created, while retaining around 1.5 million which are currently in use.

Many financial institutions have begun using a ‘my voice is my password’ system of recognising and securing users’ accounts, and HMRC will continue to use this system in line with GDPR rules. In a letter, HMRC’s chief executive, Sir Jonathan Thompson, stated “I have informed the ICO that we have already started to delete all records where we do not hold explicit consent and will complete that work well before the ICO’s 5 June 2019 deadline.”

We expect that the use of biometric data will be an area of increased focus for regulators, as more and more businesses rely on this form of personal data as an effective security measure. You can read our analysis of the French data protection authority’s new regulations concerning biometric data use in the workplace here.

Insurance claims company fined for text messaging campaign without valid consent

Payment protection insurance claims company Hall & Hanley Ltd has been fined £96,000 by the ICO after sending more than 3 million direct marketing texts between January and June 2018.

The ICO’s investigation was opened following over 1,000 complaints concerning the text messaging campaign. It was found that while Hall & Hanley had used a third party to conduct the campaign, it did not have valid consent in order to do so.

Hall & Hanley claimed that valid consent had been obtained when recipients of the texts subscribed to one of four websites. However, Hall & Hanley was not named in two of these websites’ privacy policies, and subscribers were forced to provide consent to third party marketing as a condition of subscription, which runs afoul of the Privacy and Electronic Communications Regulations (PECR).

In a statement, the ICO’s Director of Investigations said: “Companies which are responsible for generating these types of marketing messages should make sure they are operating legally or face a potential fine. Hall and Hanley should have known better. The laws on these types of marketing messages are strict because they can be very intrusive.”

Legal action commenced over facial recognition 

Ed Bridges, a former Liberal Democrat councillor in Cardiff, has launched a first-of-its-kind legal action concerning the use of facial recognition technology by the South Wales police.

Bridges claims he was distressed by the apparent use of the technology to capture his image while he was walking in the street, and that its use breaches data protection and equality laws.

Automated facial recognition (AFR) technology can be used to map captured faces and to compare those maps against a variety of image watch lists, including missing persons and persons of interest to the police. AFR technology is capable of functioning in large crowds, such as in shopping centres and at football matches.

In a statement, Bridges’ counsel claimed that “AFR enables police to monitor people’s activity in public in a way they have never done before. The reason AFR represents such a step change is you are able to capture almost instantaneously the biometric data of thousands of people.”

Bridges argues that he could reasonably expect that his face would not be scanned in public, or his image processed, without his consent while he was not suspected of any wrongdoing, and that the relatively small number of people arrested through the use of AFR does not justify the processing of thousands of people’s personal information.

Public facial recognition has been in use by three UK police forces since June 2015: South Wales Police, Leicestershire Police and the Metropolitan Police.

South Wales Police have claimed that AFR usage does not infringe Bridges’ privacy or data protection rights, and compared its use to that of photographing a person’s activity in public. Furthermore, they stated that image data is not retained, unless the individual captured is on a ‘watch list’; the non-retention of facial images could be a key consideration as to whether the processing meets the GDPR standard.

The ICO recently published guidance concerning sensitive data processing by law enforcement, the conditions for which are outlined under Part 3 of the Data Protection Act 2018. In order for law enforcement to carry out sensitive data processing they must satisfy one of the Part 3 conditions, and demonstrate that the processing is strictly necessary, meaning it has to relate to a pressing social need and be unachievable through less intrusive means.

Increase in UK data breach reports under the GDPR

The ICO has reported that the number of data breach reports it has received in the year since the introduction of the GDPR has increased four-fold. The ICO stated that it received 14,072 breach notifications, an increase from 3,311 during the previous year.

The ICO has identified an increase in ‘over-reporting,’ as data controllers are so concerned about non-compliance with the notification requirements that they are notifying the ICO of breaches which do not meet the threshold, out of an abundance of caution. In accordance with Article 33 of the GDPR, the data controller must notify the ICO of a personal data breach without undue delay, and where feasible, not later than 72 hours after becoming aware of such breach.

The number of public complaints made to the ICO has also doubled to just over 41,000, indicating a higher level of public awareness of the importance of personal data since the GDPR was introduced.

The increases seen in the UK reflect similar increases in reporting across the EU countries which have implemented the GDPR. However, there have been relatively few GDPR fines issued to date, despite the considerable increase in complaints.

Spanish Constitutional Court nullifies new Data Protection Act

On May 22nd 2019 it was announced that the Spanish Constitutional Court had found select provisions of Spain’s new Data Protection Act, namely those concerning political parties’ ability to collect personal data, to be unconstitutional, less than six months after the new law came into force.

The ruling, which found indent 2 of final provision 3 of the Act to be unconstitutional, also nullified Article 58 bis of the General Election Act. These provisions allowed political parties to screen social networks and other sites in order to collect the data of potential voters, and to use that data to customise campaign messaging. Additionally, political party messaging was exempted from e-marketing laws concerning spam.

These provisions were last-minute additions to the Act, and received considerable criticism during debate, particularly in the Spanish Senate. However, it was argued that their removal could delay the law’s implementation by more than two years, which the Parliament viewed as unacceptable, and they were ultimately included.

The complaint filed by the Spanish Ombudsman with the Constitutional Court was dealt with very quickly, as the Act had been in effect for less than six months. The Constitutional Court’s full rationale is expected to be published in the coming weeks.

However, the final round of Spanish elections was completed on May 26th, so the impact of this decision will not be felt until the next Spanish elections, which are scheduled for 2023.

For more information please contact James Tumbridge at jtumbridge@vennershipley.co.uk