AI, machine learning and personal data

By Jo Pedder, Interim Head of Policy and Engagement.

Today sees the publication of the ICO’s updated paper on big data and data protection.

But why now? What’s changed in the two and a half years since we first visited this topic? Well, quite a lot actually:

  • big data is becoming the norm for many organisations, which use it to profile people and inform their decision-making processes, whether that’s to determine your car insurance premium or to accept or reject your job application;
  • artificial intelligence (AI) is stepping out of the world of science-fiction and into real life, providing the ‘thinking’ power behind virtual personal assistants and smart cars; and
  • machine learning algorithms are discovering patterns in data that traditional data analysis couldn’t hope to find, helping to detect fraud and diagnose diseases.

The complexity and opacity of these types of processing operations mean that it’s often hard to know what’s going on behind the scenes. This can be problematic when personal data is involved, especially when decisions are made that have significant effects on people’s lives. The combination of these factors has led some to call for new regulation of big data, AI and machine learning, to increase transparency and ensure accountability.

In our view though, whilst the means by which personal data is processed are changing, the underlying issues remain the same. Are people being treated fairly? Are decisions accurate and free from bias? Is there a legal basis for the processing? These are issues that the ICO has been addressing for many years, through oversight of existing European data protection legislation.

When the General Data Protection Regulation (GDPR) comes into force in 2018, the regulatory toolkit will be further sharpened. Some of the key changes will be:

  • more powerful rights for individuals, including rights in relation to automated decisions and profiling;
  • new accountability provisions, including the implementation of codes of conduct and certification mechanisms that will help to improve standards and hold organisations to account in areas such as automated decision making; and
  • increased enforcement powers for the ICO, including the ability to issue fines of up to €20,000,000 or 4% of annual worldwide turnover for infringements of the regulation.
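For a sense of scale, the GDPR applies the higher of those two figures for the most serious infringements, so the 4% cap only bites for organisations turning over more than €500m a year. A minimal sketch in Python (illustrative only; the function name and the flat-rate figure of €20m are assumptions drawn from the text above):

```python
def max_gdpr_fine(annual_worldwide_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine for the most serious infringements:
    the greater of EUR 20 million or 4% of annual worldwide turnover."""
    return max(20_000_000, 0.04 * annual_worldwide_turnover_eur)

# A firm with EUR 1bn turnover: 4% (EUR 40m) exceeds the EUR 20m floor.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0

# A smaller firm with EUR 100m turnover: the EUR 20m floor applies.
print(max_gdpr_fine(100_000_000))  # 20000000.0
```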

These changes, and more, will contribute towards a relevant and effective regime for the regulation of personal data in the world of big data, AI and machine learning.

The paper we are publishing today takes these changes into account and gives our views on the implications both now and moving forward. For those involved in big data, the paper also offers some practical advice on tools and approaches that can help to meet and go beyond compliance with data protection legislation. There’s also some specific guidance on undertaking privacy impact assessments – a valuable tool in a big data context and one which the ICO has championed for many years now.

As a final point, this paper does not mark the end of our work on big data. Far from it. We have a number of key work-streams which are related to and will continue our work in this area, a brief taster of which is summarised below. Watch this space.

  • New Information Rights Strategy. We are currently working hard on our new Information Rights Strategy in preparation for its launch at the end of this financial year, with big data and AI set to feature.
  • Policy and guidance. We are planning to publish some of our thinking on the GDPR ‘profiling’ provisions which are relevant in a big data context, to gain feedback to inform our policy in this area. We will then use this as we contribute to Article 29 Working Party guidance planned for publication later in 2017.
  • Grants and contributions programme. In the near future, we plan to set up a ‘grants and contributions’ fund for research. This programme will be separate from the research that the ICO will continue to conduct and commission; its intention is to assist and enable independent research on a range of information rights issues, including matters relating to big data, AI and machine learning.
  • Mergers and acquisitions. Often, organisations merge or are acquired by other companies for the sole purpose of obtaining and combining datasets. The data can then be exploited in intrusive ways that individuals didn’t expect. We recognise the implications this has in terms of data protection and we are currently working on a report, to be published later this year, looking at the key issues and possible solutions.
  • Social scoring. We will soon be launching a tender for a piece of independent research on social scoring. This will be looking into the practice of using personal data from social networks to form part of assessments on people in relation to areas such as employment, housing and finance.
  • Privacy Bridges. The Privacy Bridges Project aims to create a framework with effective mechanisms to bridge the gap between the EU and the US with regards to privacy protection. The ICO has an ongoing role in this, a project that is becoming increasingly important in the world of big data.
  • Higher education. Embedding information rights into higher education is something that we have been aspiring to for a number of years now and we have recently been making good progress on assessing the feasibility of achieving this aim. Ultimately, we hope this will help to address the generally recognised privacy and security skills shortage, particularly in the information technology sector.
Jo Pedder is Interim Head of Policy and Engagement. She has lead responsibility for the ICO’s guidance on the Data Protection Act and the Freedom of Information Act.