
Working for a regulator like the ICO


If you work in digital technology, as a researcher, software developer, or designer, you have probably considered the impact of what you are building on rights, freedoms, and the public interest. Navigating this complicated terrain requires a mixture of people with different skills and experience, including legal, policy, and technology expertise. But as increasingly complex technologies and data flows are integrated into consequential decisions in society, there is a growing need for technologists who can explain these technologies and highlight where engineering and design choices will have significant impacts on individuals.

During my time as a Postdoctoral Research Fellow in AI at the Information Commissioner's Office (ICO), I had the opportunity to engage in this kind of public interest technology work on a day-to-day basis. Working for a regulator like the ICO, whose role is to uphold information rights, including data protection, in the public interest, puts you in a unique position. I was amazed by the sheer variety and complexity of data protection and technology issues that emerge on the ground in particular contexts, which challenged many of my preconceptions about both data protection and technology. Having to translate the jargon and demystify the hype around technologies like AI for my colleagues ultimately led me to understand those technologies better, in both theory and practice.

A typical day in the ICO's Technology department might have included:

  • Reading a technical report on an AI system being developed by a data controller and referenced in their data protection impact assessment (DPIA), then preparing a briefing for the DPIA team on the risks it raises and the adequacy of the measures proposed to mitigate them.

This kind of work offered fascinating insights into how these technologies are being deployed in practice and an opportunity to influence how they are designed to achieve data protection compliance.

  • Running a workshop with data protection officers and software engineers to explore ways to better implement data protection by design in the context of automated decision-making systems.

This frequently involved getting to grips with the differing mindsets and approaches of lawyers and technologists and trying to align them around feasible solutions. Consulting with a wide range of stakeholders across roles and sectors was enlightening, and proved crucial when it later came to crafting guidance that communicated the important points while remaining applicable to the wide variety of contexts organisations find themselves in.

  • Responding to questions about the data protection implications of particularly unusual and esoteric technologies.

This sometimes required me to revisit obscure concepts from computer science - the kind of thing one learns about in theory but rarely sees implemented in practice, let alone has to weigh against fundamental concepts in data protection like data controllers or personal data. These cases sparked debates between my colleagues and me, which were not only fascinating but also important, because our conclusions would be directly relevant to our colleagues in casework and ultimately to the data subjects and controllers involved.

The main project I worked on was developing a framework for how the ICO approaches auditing AI. This included working with the Assurance and Investigations teams to equip them for AI-related work, building auditing tools and procedures for use in audits and investigations, as well as producing detailed Guidance on AI and Data Protection. This was challenging but highly rewarding work, which would not have been possible without the expertise and skill of many colleagues within the Technology team and beyond.

Of course, when the Covid-19 pandemic struck, our work had to pivot to address the new challenges it raised for data protection. This required us to work quickly to get to grips with various proposed technologies for addressing the pandemic that involved personal data, from contact tracing apps to AI tools for predicting the next hotspots. On the basis of this analysis, we produced guidance and opinions (for example, on contact tracing app development and the Google/Apple API) to help ensure that personal data was appropriately protected while playing its vital role in the pandemic response.

Throughout my time at the ICO, I had the opportunity to work with a range of incredibly smart, conscientious, and fun people, as well as to engage with a wide array of external stakeholders. As a result, I came away with a deeper understanding of the important challenges facing data protection today and in the future, and a newfound appreciation for the work of regulating data protection.

The potential societal consequences of newly powerful digital technologies are too significant to leave to technology providers, firms, and governments, even those with good intentions. Regulators like the ICO have the responsibility and democratic mandate to shape these technologies, ensuring appropriate safeguards are built in from the beginning, or even preventing their deployment where they fail to meet legal requirements. To do this effectively, regulators need parity of arms in technology expertise with those they regulate, so that they can engage with them on an equal footing. That requires people with technology expertise who can unpack the complex ways personal data is processed and the consequences for affected individuals. Not only is this work important, it is also enormously rewarding.
