The ICO has released the first two reports from its regulatory sandbox for innovation in data protection. Launched in September last year, the scheme trialled a number of innovations that sought to use personal data to deliver new services while still meeting data protection requirements.
Many of the projects have been delayed because of COVID-19, but these first two cover a groundbreaking study of biometrics at Heathrow and a project from JISC, a not-for-profit serving further and higher education. Both shed light on the potential of data and the challenges involved in maintaining privacy.
Heathrow’s biometric passports
Heathrow’s experiment used biometrics such as facial recognition to automate the passenger journey, so that people could move through check-in and baggage drop and onto the aircraft without repeatedly stopping to show their passports.
The study had to confront two important data issues. The first was who controlled the data. Heathrow would be considered a joint data controller for the activities, and so would have to ensure complete security and transparency about what data it held.
Under GDPR rules it would also struggle to rely on legal obligation as a lawful basis, and so would have to seek explicit passenger consent for using the data throughout the passenger journey. This can be difficult to achieve in a system which is intended to minimise interruptions to the passenger’s journey.
Both Heathrow and the ICO agreed that an affirmative action completed by the passenger would not be a compliant substitute for an express statement of explicit consent.
In light of the trial, Heathrow has decided to postpone its plans until it can devise a GDPR-compliant process for automating passenger journeys.
JISC’s student data analytics
The JISC project, meanwhile, aimed to protect student well-being while showing how data about students’ activities could be used to improve the services on offer.
It faced issues around data protection as well as purpose compatibility: JISC would have to assess whether its intended use of the data was compatible with the original purposes for which it was collected.
Thanks to COVID-19-related delays, one aspect of the project could not be completed: a report into mental health analytics, which both sides agreed would take place outside the sandbox process.
Universities using the data would have to demonstrate compliance with the GDPR’s accountability principle, which would include identifying the lawful basis for using the data and providing adequate privacy notices to all students, including those under the age of 18.
According to the report, both sides agreed that universities would rely on Article 6 of the GDPR, which covers public tasks and legitimate interests as lawful bases for processing certain categories of personal data.
Both projects demonstrate the opportunities and risks associated with data. Much has been written about the use of facial recognition and data, in general, to streamline the process of fighting your way through the airport. As anyone who has endured a tough route through check-in would agree, anything that can make this easier will be welcome.
But such systems have always run into the issue of data identification and consent, and this appears to be something Heathrow has yet to crack. GDPR sets the bar extremely high for achieving explicit consent, as well as for maintaining the necessary transparency and reassurance about how data will be used, and how and when it should be deleted.
By ironing out these issues through collaboration between regulators and innovators, sandbox initiatives such as this will be crucial in reconciling the potential of data with its regulatory obligations.