The security implications of the Data (Use and Access) Bill

The Data (Use and Access) Bill is the latest piece of data protection legislation making its way through parliament and will apply to any organization operating in the UK. The bill aims to update the UK's data policies, including UK GDPR, the Investigatory Powers Act 2016 and the Data Protection Act 2018, as well as to relax the permitted uses of data processing for scientific research and streamline data processing for public services.
There has been a growing expectation for some time that the UK's data protection policies will be updated. The Data Protection Act 2018 was the previous such update, and new technologies requiring legislative attention have emerged since then. A proposed update, the Data Protection and Digital Information (DPDI) Bill, was planned earlier this year, but it fell when parliament was dissolved ahead of the change in government.
“My view is that the changes which have been proposed, even when taken together, won’t threaten the UK’s adequacy status,” says Anthony Lee, a partner specialising in information technology and data protection at gunnercooke LLP. “Whereas I felt with the DPDI, there were quite a few changes that might well have resulted in the UK losing its adequacy status.”
Those hoping for a simple and condensed approach to data protection may be disappointed. The Data (Use and Access) Bill is massive, at over 260 pages. However, the reason for the bill’s size is that it is revising and amalgamating a diverse array of legislation.
“The Data Use and Access Bill weakens our rights and gives companies and organizations more powers to use automated decisions,” said legal and policy officer Mariano delli Santi, in a statement from the Open Rights Group. “This is of particular concern in areas of policing, welfare and immigration where life-changing decisions could be made without human review. The Government says that this Bill will generate billions for the economy, but at what cost to the privacy, security and dignity of the British public?”
One of the key elements is Part 1, where the Data (Use and Access) Bill makes clear distinctions between business data and customer data.
Business data is defined as anything regarding goods, services and digital content supplied or provided by the trader. Meanwhile, customer data is information relating to goods, services and digital content supplied or provided by the trader to the customer, or to another person at the customer’s request.
Much like the previous DPDI Bill, the Data (Use and Access) Bill attempts to establish the regulatory foundations for digital identities. Chapter 27, subsection 1 concerns the reliability of digital verification services: the Secretary of State would be expected to prepare and publish, in collaboration with the Information Commissioner, the "DVS trust framework" document, which would set out further rules and regulations regarding the provision of digital verification services.
The DVS trust framework would replace the UK’s current Electronic Identification, Authentication, and Trust Services (eIDAS) regulation. The bill would also abolish the Information Commissioner’s Office and replace it with the Information Commission, with the functions transferred to the new organization.
Part 5 of the bill updates UK data protection and privacy law. In particular, clause 67 expands the permitted use of data processing for statistical and research purposes.
The same section also revises Article 4 of UK GDPR, explaining that references to the processing of personal data for scientific research now include publicly and privately funded research, as well as research carried out as part of a commercial or non-commercial activity. However, the bill reinforces that consent still needs to be given for the processing of personal data for scientific research.
The bill also adds a new Article 22B to the UK GDPR, which restricts automated decision-making. The article declares that a significant decision based entirely or partly on the processing of special categories of personal data (such as faith, ethnicity, genetic data or biometric data) may not be taken based solely on automated processing unless one of a series of conditions is met.
These conditions include cases where the data subject has explicitly agreed to automated decision-making, where the processing is necessary for completing a contract agreed between the data subject and the data controller, or where the processing is required by law.
The bill also outlines the appropriate safeguards for data processing. It states that the safeguards have not been met if the processing “is likely to cause substantial damage or substantial distress to a data subject.” Furthermore, the requirement is only satisfied if the safeguards include technical and organizational measures for the purpose of ensuring respect for the principle of data minimization, such as pseudonymization.
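To illustrate the kind of technical measure the bill refers to (this is not something the bill itself prescribes), pseudonymization is commonly implemented by replacing direct identifiers with keyed hashes before data is shared for research, so records can still be linked without exposing the original values. The short Python sketch below is purely illustrative; the pseudonymize helper and the example record are assumptions, and in practice the secret key would be stored separately from the dataset.

```python
import hmac
import hashlib

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    The mapping is repeatable for the same key, so records can still be
    linked for research, but the original value cannot be recovered
    without the key, which should be held separately from the data.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: pseudonymize an email address before it enters a research dataset
key = b"replace-with-a-securely-stored-secret"
record = {"email": "jane.doe@example.com", "age_band": "30-39", "outcome": "A"}
record["email"] = pseudonymize(record["email"], key)
print(record)
```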
Overall, the bill as it currently stands is a lengthy document, but not all of it will be relevant to all businesses. It is also still making its way through parliament, and changes may well occur – digital verification is especially controversial. However, there is an expectation that the UK's data protection policies will evolve, and it is therefore incumbent on stakeholders to engage with legislators to ensure that they are prepared and fully aware of what is to come.
“The more the UK diverges from the GDPR and dilutes people’s rights and protections, the more likely it is that the UK will be regarded by the European Commission as not being a safe pair of hands from a privacy perspective,” concludes Lee.
“Whilst there are changes proposed which will result in some rights being diluted (such as making automated decision making more flexible than under the GDPR and having a list of legitimate interests where an assessment is not required, such as in emergencies), my sense is that the changes will not result in the UK being considered to have inadequate privacy laws.”
He adds that there are controls in the bill to keep human oversight for sensitive data, with some room left open over what does and doesn't constitute meaningful human involvement in a given decision.
“A decision will have been taken by solely automated means if there has been no meaningful involvement. It is going to be interesting to see the debates about what is meant by no meaningful human involvement.”