Facial recognition technology and law enforcement

By Elizabeth Denham, Information Commissioner.

Technological advances in the last 20 years have rapidly increased the ability of online systems to identify individuals. These advances can make many transactions straightforward, such as passing through passport control or unlocking a mobile phone, but they can also increase the risk of intrusion into our privacy.

Technology represents both a risk and an opportunity, which is why I have recently published our first Technology Strategy. The strategy addresses these new technological developments and ensures the ICO can deliver the outcomes the public expect of us.

Raising the bar – consent under the GDPR

By Steve Wood, Deputy Information Commissioner.

We’ve already tackled some myths around consent when it comes to the General Data Protection Regulation (GDPR), and you’ll be pleased to hear we’ve now published our final detailed guidance on consent to help you on your GDPR journey. This follows the guidance issued by the Article 29 Working Party, the group of European data protection authorities.

From marketing agencies to clubs and associations to local authorities, consent has been a hotly debated topic.

Some of the myths we’ve heard include “GDPR means I won’t be able to send my newsletter out anymore” and “GDPR says I’ll need to get fresh consent for everything I do.”

I can say categorically that these are wrong. But since misinformation is still being packaged as the truth, I need to bust another myth.

New regulatory powers for the digital age

James Dipple-Johnstone explains our new regulatory powers and how we’ll exercise them. Organisations and data protection professionals will find this information useful when responding to our consultation on our Regulatory Action Policy.

Regulatory powers

Our updated Regulatory Action Policy is out for consultation. As well as setting out the objectives that will guide the ICO as we take regulatory action, it presents our new powers and explains how we aim to use them.

The power to levy penalties of up to 4% of global turnover or £17 million, whichever is greater, has come through the General Data Protection Regulation (GDPR), but other powers will be introduced by the new Data Protection Bill currently before Parliament.

ICO seeks comments on draft Data Protection Impact Assessment guidance

Update (14 May 2018): We have now published our detailed guidance on DPIAs, as well as a podcast on this topic.

By Ian Deasha, Information Rights Regulatory Development Group Manager.

The ICO has for many years championed the benefits of voluntary Privacy Impact Assessments (PIAs) for new and innovative but potentially high-risk types of processing under the Data Protection Act 1998.

The new General Data Protection Regulation (GDPR), which will apply from 25 May, formalises this approach by making Data Protection Impact Assessments (DPIAs) a legal requirement in certain circumstances.

But make no mistake: this is not just more red tape or an unnecessary burden on data controllers. DPIAs are an extremely useful tool, allowing organisations to demonstrate compliance with their data protection obligations, meet people’s expectations of privacy and help prevent reputational damage.

Even where a DPIA is not a legal requirement, it is a very beneficial process through which responsible controllers can build the privacy by design and by default principles into their projects. These concepts are at the heart of GDPR compliance.

So what is a DPIA?

Essentially, a DPIA is a documented process that allows an organisation to systematically describe and analyse its intended processing of personal information, helping it to identify and minimise data protection risks at an early stage.

As well as being a key element of a controller’s accountability obligations under GDPR, an effective DPIA could have real benefits down the line in ensuring compliance, building external trust and avoiding the possible reputational and financial implications of enforcement action following a breach.

Under the GDPR, controllers will be required to complete a DPIA where their processing is ‘likely to result in a high risk to the rights and freedoms of natural persons’.

‘Likely’ does not mean the risk is certain, but it will be the responsibility of the controller to assess the level of risk of their intended processing by making a reasoned judgement on the likelihood and potential severity of harm.

Our guidance includes the examples highlighted in the GDPR, as well as a further list – which the ICO is legally required to develop – of the types of processing likely to be high risk. We are seeking comment on this list too.

Our draft DPIA guidance builds on our previous PIA code of practice, with further detail on specific GDPR requirements. It includes a DPIA template, although controllers who anticipate carrying out many DPIAs may wish to develop their own.

It also gives detail on the circumstances in which controllers will be required to consult the ICO before processing begins: where they cannot identify measures to reduce the potential risks identified in their DPIA to an acceptable level.

Where prior consultation is engaged, the ICO is required to provide written advice within eight weeks. This period can be extended where the processing of personal data is especially complex.

As well as offering advice, the ICO could in some circumstances issue a formal warning to an organisation, or even take formal action to ban the processing altogether.

We are seeking comments on the draft guidance published last week, particularly on whether it is clear when a DPIA will be necessary.

We would also like controllers to tell us whether they consider they may need to submit a DPIA to us for written advice in the 12 months following 25 May 2018.

The consultation runs from 22 March until 13 April 2018. We are also planning an ICO podcast on our DPIA guidance in the next few weeks.

We have also published detailed guidance on legitimate interests as a lawful basis for processing under the GDPR.

Ian Deasha is the ICO’s Information Rights Regulatory Development Group Manager, working on reform of data protection law in the UK.

A win for the data protection of UK consumers – WhatsApp signs public commitment not to share personal data with Facebook until data protection concerns are addressed

By Information Commissioner Elizabeth Denham.

People have a right to have their personal data kept safe, used only in ways that are properly explained to them and, for certain uses, only with their express consent. This is a requirement of the Data Protection Act.

My office has just completed an investigation, which commenced in August 2016, into whether WhatsApp could legally share users’ data with Facebook in the manner they were considering. In 2014 Facebook acquired WhatsApp, which offers an instant messaging service for smartphones.

My investigation found:

  1. WhatsApp has not identified a lawful basis for any such sharing of personal data;
  2. WhatsApp has failed to provide adequate fair processing information to users in relation to any such sharing of personal data;
  3. in relation to existing users, such sharing would involve processing personal data for a purpose incompatible with the purpose for which the data was obtained;
  4. if WhatsApp had shared the data, it would have been in contravention of the first and second data protection principles of the Data Protection Act.

I am pleased to state that WhatsApp has now signed an ‘undertaking’ wherein they have given a public commitment not to share personal data with Facebook until they can do so in compliance with the upcoming General Data Protection Regulation (GDPR), which comes into force in May this year. I reached the conclusion that an undertaking was the most effective regulatory tool for me to use, given the circumstances of the case. As WhatsApp has assured us that no UK user data has ever been shared with Facebook (other than as a ‘data processor’, as explained below), I would not be able to meet the criteria for issuing a civil monetary penalty under the Data Protection Act.

For those of you who wish to read this undertaking, I have enclosed a copy. As outlined in the undertaking, WhatsApp has assured us that it shall not, from the date of the undertaking, share personal data with companies in the Facebook family, for Facebook’s own purposes, until it can satisfy the requirements of the GDPR.

It is also important to state that UK consumers do not need to take any action as a result of this update.

My investigation has not been concerned with WhatsApp sharing personal data with Facebook where Facebook only provides a support service to WhatsApp. In technical terms, WhatsApp can use Facebook as a ‘data processor’. This is common practice and, if done in compliance with the law and under contract, does not generally raise data protection concerns.

Data protection law does not prevent a company from sharing personal data – they just have to follow the legal requirements.

I therefore compliment WhatsApp on signing this undertaking, which I believe will build trust amongst their many UK users. I would also like to stress that signing an undertaking is not the end of the story, and I will closely monitor WhatsApp’s adherence to it.

There are two other interesting elements to this investigation that merit mention.

The first is the breadth of concern, both in the community and among regulators, raised by the possibility of WhatsApp and Facebook sharing data. Concerns about possible inappropriate sharing were prompted by media reports, civil society groups and data protection authorities globally after WhatsApp updated its terms and conditions and privacy policy. At the heart of these concerns lies a desire for improved transparency, control and accountability, at a time when personal data is ever more central to the business models of key players in the digital economy.

The issue was also taken up by the Article 29 Working Party of European data protection authorities, of which I am a member. As chair of the Working Party’s task force on WhatsApp-Facebook data sharing, I worked with our European colleagues to bring a common focus and information base to our investigations. The Working Party wrote collectively to WhatsApp to set out our concerns in October 2017.

The Hamburg Commissioner of Data Protection and Freedom of Information issued a press release on 2 March 2018, indicating that the Higher Administrative Court (OVG) Hamburg had confirmed his administrative order banning Facebook from using WhatsApp user data for its own purposes.

The French data protection authority (CNIL) is in the process of bringing enforcement action against WhatsApp.

Other EU Data Protection Authorities also have ongoing investigations.

The second element of interest is the path ahead. The GDPR strengthens the rules on what constitutes ‘consent’. It also provides a stronger emphasis on effective transparency and accessible information for the public. This will be good news for UK users of social media services. We will be monitoring changes to WhatsApp’s privacy and terms and conditions under the new legislation.

Finally, in the interest of transparency I am enclosing a copy of my letter to WhatsApp dated 16 February 2018, which outlines the history and results of the investigation.

Elizabeth Denham was appointed Information Commissioner in July 2016. Her key goal is to increase the UK public’s trust and confidence in what happens to their personal data.

Making or selling Internet of Things (IoT) devices? Six reasons you need to be thinking about data protection

By Peter Brown, Technology Group Manager.

With the demand for connected toys, smart watches and smart home accessories growing rapidly, it’s safe to say the IoT market is booming.

At the same time, barely a week goes by without news of a connected device with serious yet basic security flaws that leave personal data potentially exposed to malicious third parties.

Most manufacturers and retailers pride themselves on their health and safety compliance when developing and selling products. But as internet-enabled devices process increasing amounts of personal data, how much do you, as a manufacturer or retailer, really know about the rules around IoT and the way your products use personal information?

There are six points that manufacturers and retailers of IoT devices should consider as a starting point.

Meltdown and Spectre – what should organisations be doing to protect people’s personal data?

By Nigel Houlden, Head of Technology Policy.

This week Google’s Project Zero team published details of serious security flaws, Meltdown and Spectre, which affect almost every modern computer and could allow hackers to steal sensitive personal data. The three connected vulnerabilities have been found in processors designed by Intel, AMD and ARM. The full technical details are set out in the Project Zero blog post and in the accompanying papers, published under the names Meltdown and Spectre.

In essence, the vulnerabilities provide ways that an attacker could extract information from privileged memory locations that should be inaccessible and secure. The potential attacks are only limited by what is being stored in the privileged memory locations – depending on the specific circumstances an attacker could gain access to encryption keys, passwords for any service being run on the machine, or session cookies for active sessions within a browser. One variant of the attacks could allow for an administrative user in a guest virtual machine to read the host server’s kernel memory. This could include the memory assigned to other guest virtual machines.
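
To give a flavour of the kind of code pattern at the heart of these attacks, below is a minimal C sketch of the Spectre variant 1 ‘bounds check bypass’ gadget, adapted from the pattern described in the published Spectre paper. The array names and sizes are illustrative only, not taken from any real product.

```c
/* Illustrative Spectre variant 1 ("bounds check bypass") gadget,
   adapted from the published Spectre paper. Names and sizes are
   hypothetical. */
#include <stdint.h>
#include <stddef.h>

uint8_t array1[16];          /* the attacker controls the index into this */
size_t  array1_size = 16;
uint8_t array2[256 * 4096];  /* probe array used as a cache side channel */

void victim_function(size_t x) {
    /* The branch below is a correct bounds check. If the CPU
       mispredicts it and speculatively executes the body with an
       out-of-bounds x, array1[x] reads a secret byte from memory it
       should never touch, and the dependent load from array2 leaves
       a cache footprint indexed by that secret byte. */
    if (x < array1_size) {
        volatile uint8_t tmp = array2[array1[x] * 4096];
        (void)tmp;  /* the value is never used architecturally */
    }
}
```

An attacker first ‘trains’ the branch predictor with in-bounds values of x, then supplies an out-of-bounds x aimed at a secret. The secret is never returned directly; instead, a cache-timing side channel such as Flush+Reload on array2 reveals which cache line was speculatively loaded, and hence the secret byte. Software mitigations work by masking the index or inserting barriers that close this speculative window.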

GDPR is not Y2K

Listen to our March 2018 podcast answering your questions on GDPR myths.

By Information Commissioner Elizabeth Denham.

I’ve been pleased to hear from many of you that the eight GDPR myth busting blogs we’ve run this year have been helpful in your preparations for the new legislation.

There are still some myths out there though and, as we approach Christmas and New Year, there’s one in particular I wanted to bust:

Myth #9: GDPR compliance is focused on a fixed point in time – it’s like the Y2K Millennium Bug

I’m still picking up a lot of concern from organisations about preparing for the GDPR by May.

Much of that is understandable – there’s work required to get ready for the new legislation, and change often creates uncertainty.

However, some of the fear is rooted in scaremongering, whether born of misconceptions or of a bid to sell ‘off the shelf’ GDPR solutions.

ICO seeks comment on draft Children and GDPR guidance

By Elizabeth Denham, Information Commissioner.

Children today are truly digital natives. With that in mind, we need to ensure that they have the tools to be contributing digital citizens. This means that the protection of children’s personal data is fundamentally important.

That is why the General Data Protection Regulation (GDPR) will introduce new, specific legal responsibilities for organisations processing children’s data from 25 May 2018.

I am pleased that the special case of children’s privacy rights is part of the wider conversation about the UK’s digital future. Protecting children online is the shared responsibility of lawmakers, companies, platforms, parents and regulators and we need to get this right.

Update on ICO investigation into data analytics for political purposes

By Elizabeth Denham, Information Commissioner.

In May I announced a formal investigation into the use of data analytics for political purposes. We’re looking at how personal information was analysed to target people as part of political campaigning and have been particularly focused on the EU Referendum campaign.

We are concerned about invisible processing – the ‘behind the scenes’ algorithms, analysis, data matching and profiling that involve people’s personal information. When the purpose for using these techniques is related to the democratic process, the case for a high standard of transparency is very strong.
