
Out-Law News

ICO: privacy-enhancing technologies can help harness power of personal data


The UK’s Information Commissioner’s Office (ICO) has published new guidance on the use of ‘privacy-enhancing technologies’ (PETs) for data protection officers and bodies that handle large personal data sets.

PETs are designed to protect personal data from unauthorised access or misuse, and the ICO guidance advises on how they can be used to share personal data safely in order to detect and prevent financial crimes and related harms such as fraud, money laundering and cybercrime. The guidance offers technical advice on the different types of PETs – including homomorphic encryption, secure multiparty computation, zero-knowledge proofs, federated learning, synthetic data, trusted execution environments and differential privacy – and the benefits that can be reaped from their use.
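As an illustration only – the guidance itself contains no code – one of the listed techniques, secure multiparty computation, is often built on additive secret sharing: each party splits its private value into random shares so that a joint total can be computed without anyone revealing their individual input. The sketch below, with made-up figures, shows the idea in Python.

```python
# Toy additive secret sharing, a common building block of secure
# multiparty computation. Three parties learn only the combined total,
# never each other's individual inputs. Figures are illustrative.
import secrets

MODULUS = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value, n_parties):
    """Split `value` into n_parties random shares that sum to it mod MODULUS."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Each organisation holds a private amount; any single share looks random.
amounts = [1200, 450, 310]
all_shares = [share(a, 3) for a in amounts]

# Party i sums the i-th share of every input and publishes only that sum.
partial_sums = [sum(col) % MODULUS for col in zip(*all_shares)]

# Combining the published partial sums reveals only the overall total.
print(sum(partial_sums) % MODULUS)  # 1960 == 1200 + 450 + 310
```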

PETs also have the potential to help organisations comply with the principles of data protection law by offering a secure environment, building data protection in from the beginning of a project, and minimising the amount of data that needs to be collected and retained.

Unveiling the guidance, Information Commissioner John Edwards urged organisations that handle large amounts of data – particularly special category data – to adopt PETs in the next five years. Edwards added that PETs can enable personal data to be shared “safely, securely and anonymously” and that such technologies offer “unprecedented opportunities” for organisations to harness the power of personal data.

Holly Lambert

Associate, Pinsent Masons

The development of new and innovative technologies is dependent on access to rich datasets, though the personal information contained in those datasets is highly sensitive and carries a lot of risk. The use of privacy-enhancing technologies like synthetic data helps to mitigate that risk.

The guidance offers the example of synthetic data as a PET that can be used to minimise privacy risk while enabling the creation of innovative products. Synthetic data is artificial data that is generated from real-world datasets and carries the same properties and characteristics as the original data. The ICO said this PET is useful when developing and training AI tools, as synthetic data should “produce very similar results to analysis carried out on the original real data”.
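The guidance does not prescribe a generation method, and production tools use far more sophisticated generative models, but a minimal sketch of the idea is to estimate the statistical properties of the real records and sample artificial rows from that estimate:

```python
# A minimal, illustrative approach to synthetic data: fit a multivariate
# Gaussian to the real records and sample artificial rows that preserve
# the mean and covariance structure. All figures are made up.
import numpy as np

rng = np.random.default_rng(seed=42)

# Stand-in for a real dataset; columns could be e.g. income and spending.
real = rng.multivariate_normal(mean=[30_000, 2_000],
                               cov=[[4e6, 5e5], [5e5, 9e4]],
                               size=1_000)

# Estimate the distribution of the real data...
mu = real.mean(axis=0)
sigma = np.cov(real, rowvar=False)

# ...and draw synthetic rows from it: no row belongs to a real person,
# but aggregate analysis gives similar results to the original data.
synthetic = rng.multivariate_normal(mean=mu, cov=sigma, size=1_000)

print(np.corrcoef(real[:, 0], real[:, 1])[0, 1])            # real correlation
print(np.corrcoef(synthetic[:, 0], synthetic[:, 1])[0, 1])  # similar value
```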

Holly Lambert of Pinsent Masons said: “Early-stage companies often face difficulties when trying to access and share datasets which contain personal information, although such datasets are a key part of the AI model training process. This is particularly relevant in the financial services sector, where the development of new and innovative technologies is dependent on access to rich datasets, though the personal information contained in those datasets is highly sensitive and carries a lot of risk. The use of PETs like synthetic data helps to mitigate that risk while facilitating access to, and innovative use of, data in this sector.”

Rosie Nance of Pinsent Masons said: “The ICO has been engaging with the Financial Conduct Authority (FCA) and the Alan Turing Institute on the use of synthetic data in the financial services industry. These collaborations have highlighted the opportunities but also the challenges to address and the need for further use cases in the financial services sector.”

The guidance also contains two new case studies that demonstrate how PETs can be used. The first examines how homomorphic encryption makes it possible to perform computations on encrypted information without first decrypting it. The ICO said homomorphic encryption could be used to share information relating to an individual in encrypted form – as part of a criminal investigation, for example, where different entities might hold information on the same individual.
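The ICO does not name a specific scheme, but the property the case study describes can be demonstrated with a toy version of the Paillier cryptosystem, a well-known additively homomorphic scheme. The primes below are far too small for real use and are for illustration only:

```python
# Toy Paillier cryptosystem demonstrating additive homomorphism:
# multiplying two ciphertexts yields a ciphertext of the SUM of the
# plaintexts. Demo-sized primes only; real systems use vetted libraries.
import math
import secrets

p, q = 2_147_483_647, 2_147_483_629   # small demo primes
n = p * q
n_sq = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n_sq)), -1, n)  # modular inverse (Python 3.8+)

def encrypt(m):
    r = secrets.randbelow(n - 1) + 1   # random blinding factor
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    return (L(pow(c, lam, n_sq)) * mu) % n

# Two entities encrypt figures relating to the same individual...
c1, c2 = encrypt(1_500), encrypt(2_750)

# ...and a third party combines them without decrypting either value.
c_sum = (c1 * c2) % n_sq
print(decrypt(c_sum))  # 4250
```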

A second case study highlights the potential of differential privacy for financial services companies. The ICO said that adding ‘noise’ to a dataset reduces the likelihood of being able to determine with confidence that information relating to a specific person is present in the data. In this way, differential privacy can be used by banks and other financial services companies to limit what can be inferred or learned from information before it is actually made available.
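One standard way of adding such noise is the Laplace mechanism, sketched below; the query, epsilon value and figures are illustrative and not drawn from the guidance:

```python
# The Laplace mechanism: release an aggregate statistic with noise
# calibrated to how much one person's record can change the answer.
# All parameters and figures here are illustrative.
import numpy as np

rng = np.random.default_rng()

def dp_count(records, epsilon):
    """Differentially private count of True flags.

    Adding or removing one person changes a count by at most 1, so the
    query's sensitivity is 1 and the noise scale is sensitivity/epsilon.
    """
    true_count = sum(records)
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# e.g. "how many customers at this branch flagged a suspicious payment?"
flags = [True] * 42 + [False] * 958
print(dp_count(flags, epsilon=0.5))  # true answer 42, plus calibrated noise
```

Smaller epsilon values add more noise, giving stronger privacy at the cost of accuracy.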

However, the ICO guidance flags that a PET is not a “silver bullet” that exempts organisations from needing to comply with data protection requirements. Lambert said: “Use of these technologies does not negate an organisation’s requirement to ensure that adequate data protection measures are in place for any personal data that is being processed. For example, if organisations decide to generate synthetic data from personal data, a lawful basis must be identified to collect and use the personal data for that purpose and it must be made clear to the data subject that their personal data is being used for that reason.”

The ICO also said that some PETs are technically immature and not yet ready to be utilised by organisations at scale. Nance said further guidance and case studies on how PETs can be used would be welcome. “There will always be some use cases in which data free from any anonymisation, synthesisation, or ‘noise’ will be needed. The more use cases we have of where PETs can be used without undermining the purpose of a project, the more scope controllers will have to build these technologies into their privacy by design processes.”

As for how organisations can look to adopt PETs over the next five years, the Centre for Data Ethics and Innovation (CDEI) has announced that it will work with the ICO to develop a PETs cost-benefit analysis tool. It said the tool would help organisations interested in adopting PETs to better understand the costs and benefits involved and point to resources that can address challenges to adoption. This tool will form part of the CDEI’s updated PETs adoption guide, which includes a use case repository.

The CDEI also flagged the role PETs could play in approaches to accessing demographic data for bias detection and mitigation, suggesting that differential privacy or homomorphic encryption could be used to enable intermediaries to supply access to demographic data, or analysis derived from it, without sharing the underlying data. It also reiterated the government’s intention to amend Schedule 1 to the Data Protection Act 2018 via secondary legislation. The change would enable the processing of sensitive personal data for the purpose of monitoring and correcting bias in AI systems.
