Our world has changed, has GDPR evolved with it?
07 June 2020
Cast your mind back to preparations for the new year 2020, the resolutions you made, and the plans formulated for a new decade. It was just six months ago, but who could imagine the world as we live it today?
The global coronavirus pandemic has changed our world as we know it. From the language we now use, such as ‘furlough’, ‘social distancing’ and ‘shielded’, to the behaviours we display in public, to our protective clothing, our new working environments, and our increasing reliance on technology to stay connected.
It was once unimaginable, but it is our new reality.
GDPR in a changing world
The global pandemic has had an immediate and significant impact on people and businesses alike, and the impact continues to develop. But how are the rules and regulations that guide us evolving with this new reality?
Let’s explore the GDPR as an example. Just two years old in May 2020, this ‘principles-based’ regulation strengthened previous data protection legislation in the UK and across the European Union (EU), placing greater emphasis on individual rights and introducing enforcement powers on a scale not seen before.
In these past two years, we have seen that individuals in the workplace have come to a greater understanding of ‘why’ they need to protect personal data, and there is a broad understanding that businesses are simply custodians of that data. In addition, senior executives and board members are more acutely aware of the requirement to protect data and the consequences of non-compliance.
This is all positive progress in such a short amount of time, but the world has rapidly changed in response to the global health crisis. How have supervisory authorities responded to our new reality?
The principles-based approach of the GDPR means the regulation works for organisations of all sizes, from small charities to large corporate companies. The regulation requires each organisation to evidence compliance with the principles in its own setting. This approach enables the GDPR to keep up with change and technological developments.
Some of the processes previously established to ensure compliance with GDPR, however, may no longer work effectively within our new working environments. For example, there is the right of the individual to access their personal data, commonly known as a subject access request, or SAR.
The GDPR requires all companies to comply with SARs free of charge and within one month (a tighter deadline than the 40 days allowed under the previous legislation), but some remote working arrangements can make it difficult to comply with these timescales.
The Information Commissioner’s Office (ICO), the supervisory authority in the UK, has acknowledged this challenge and issued revised guidance. In an official statement, it indicated that while individual rights must be upheld, the ICO does accept there may be a justifiable delay in response times. In doing this, the ICO has demonstrated its ability to be pragmatic and flexible when needed.
Test, track and trace: a data privacy minefield?
Key to the long-term management and containment of COVID-19 is the UK government’s ‘test, track and trace’ programme, which includes NHSX’s contact-tracing app. The ICO is taking a central role in the development of this app.
Elizabeth Denham, the UK Information Commissioner, has confirmed that the ICO sees itself as the appropriate ‘independent body’ to advise on NHSX’s data protection impact assessment (DPIA) and the privacy notice for use of the app.
The ICO also describes itself as a ‘pragmatic, proportionate and independent data protection regulator’. It states that its published guidance on the core principles and best practice for the development of the contact-tracing app is designed and written to protect the public.
The new guidance outlines some key design principles:
- Be transparent about the purpose, design choices and benefits of the app.
The ICO flags the potential risk of the app’s purpose and functionality evolving beyond the minimum required for contact-tracing.
- Protect users and do not weaken their privacy.
Actions include data minimisation and pseudonymous identifiers, rather than registration details.
- Ensure users have control over their data.
This applies both at onboarding and during use, when controls should be accessible in the app’s settings. The app should ensure that users can opt-in or opt-out without any negative consequences.
- Securely process data.
For example, use cryptographic/security techniques both when data is at rest and in transit.
- Store data for as short a time as possible.
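To make the ‘pseudonymous identifiers’ principle above more concrete, here is a minimal illustrative sketch in Python. This is an assumption-laden simplification, not the NHSX app’s actual protocol: a device holds a random daily key that is never tied to registration details, and broadcasts only short-lived identifiers derived from it, so proximity events can later be matched without revealing who the user is.

```python
# Illustrative sketch only: NOT the NHSX contact-tracing protocol.
import hashlib
import hmac
import secrets


def make_daily_key() -> bytes:
    """Generate a fresh random key; nothing links it to the user's identity."""
    return secrets.token_bytes(32)


def pseudonymous_id(daily_key: bytes, interval: int) -> str:
    """Derive a short-lived broadcast identifier from the daily key.

    Devices broadcast only this value. Without the daily key, an observer
    cannot tie it back to a person or track the device across intervals.
    """
    mac = hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256)
    return mac.hexdigest()[:16]  # truncated for compact broadcast


# A device rotates its identifier every interval (e.g. every 15 minutes).
key = make_daily_key()
ids = [pseudonymous_id(key, i) for i in range(3)]

# The key holder can re-derive an observed identifier to confirm a match;
# the broadcasts themselves carry no registration or location data.
assert pseudonymous_id(key, 1) == ids[1]
assert len(set(ids)) == 3  # identifiers rotate, limiting tracking
```

The sketch also reflects the data-minimisation point: only an identifier showing that two devices were near each other is exchanged, never names or location data.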
The guidance also details best practice recommendations and lawful bases for processing user data. While ‘public interest’ is an acceptable lawful basis, the processing must also be ‘necessary’.
The ICO acknowledges that while consent may not be required for strict contact-tracing functionality, where data is ‘stored’ consent must be obtained. The ICO also notes that location data is not ‘necessary’, as only identifier data which shows that two people were in close proximity is required.
Contact-tracing: questions and criticisms
The NHSX contact-tracing app is receiving broad-based criticism from privacy and human rights groups alike, with a call for more transparency and more detailed explanations on key issues. For example, critics ask why NHSX has chosen to develop a ‘centralised’ framework for the app, where data is stored on a central cloud server, rather than the more privacy-friendly ‘decentralised’ framework adopted by other countries, where data is stored locally on the device.
Privacy groups are also seeking further clarification on how user data will be used in the future, when personal data will be deleted and how the government will restrict scope creep.
While the contact-tracing app remains a political minefield, the attention it has garnered demonstrates how data protection has become a central issue when holding public and private bodies to account.
AI, GDPR, RTB, ICO – acronyms ahead!
In addition to the examples explored above, there have been further developments in the last two years. The ICO, in partnership with the Alan Turing Institute, has published detailed guidance for companies using artificial intelligence (AI) to make or support decisions about individuals.
It provides companies with a framework and system to explain decisions made using AI. The guidance clearly outlines that organisations must still comply with the GDPR’s processing principles of fairness, transparency, and accountability.
There is also the ongoing battle between the ICO and advertisers over real-time bidding (RTB), a process which allows companies to place adverts on a webpage or app using personal data for targeted advertising. RTB raises significant privacy issues due to the amount of personal data shared with large numbers of organisations without the individual being aware of it. The ICO is keen to ensure legal compliance while organisations look to innovate.
Then there is Brexit and beyond. The EU and the UK are committed to continued alignment on data protection, and the UK government is seeking an ‘adequacy’ decision from the European Commission. This would demonstrate that the UK Data Protection Act 2018 meets EU requirements, thus ensuring a continued free flow of personal data between the UK and the EU.
Only time will tell how these negotiations will play out.
One thing is for sure, in the two years since the introduction of the GDPR much has improved, and we have seen a massive evolution in the guidance available. The ICO has proved itself to be pragmatic and responsive to our new reality, but what will the next two years hold?