
ICO Publishes Age Appropriate Design Code of Practice for Online Products and Services accessed by Children

On 21 January 2020, the ICO published the Age Appropriate Design Code of Practice. The Code is available on the ICO website.

Who does the Code apply to?

  • The Code applies to information society services (ISS) that are likely to be accessed by under-18s; the service does not have to be deliberately aimed at children.
  • This includes any online products or services (e.g. apps, programs, websites, games). This also includes Internet of Things (IoT) connected toys and devices – whether with or without a screen.
  • The Code applies to ISS with an establishment in the UK, or to ISS outside the UK that target goods and services to, or monitor, children in the UK.

What does the Code say?

The Code sets out 15 headline “standards of age appropriate design”:

  • Best Interests: The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.
  • Data Protection Impact Assessments: You should undertake a DPIA before launching the product or service to assess and mitigate risks to the rights and freedoms of children.
  • Age Appropriate Application: You should take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing OR apply the standards in this code to all your users instead.
  • Transparency: The privacy information you provide to users must be concise, prominent, and in clear language suited to the age of the child.
  • Detrimental Use of Data: You should not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions, or Government advice.
  • Policies and Community Standards: Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).
  • Default Settings: Settings must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child). See the illustrative sketch after this list.
  • Data Minimisation: Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.
  • Data Sharing: You should not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
  • Geolocation: You should switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation to be switched on by default, taking account of the best interests of the child), and provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others should default back to ‘off’ at the end of each session.
  • Parental Controls: If you provide parental controls, give the child age appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.
  • Profiling: You should switch options which use profiling ‘off’ by default (unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).
  • Nudge Techniques: You should not use nudge techniques to lead or encourage children to provide unnecessary personal data or turn off privacy protections.
  • Connected Toys and Devices (IoT): If you provide a connected toy or device, ensure you include effective tools to enable conformance to this code.
  • Online Tools: Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.
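
To make the ‘high privacy by default’ standards more concrete, here is a minimal sketch (in Python, purely illustrative and not taken from the Code itself) of a settings model in which geolocation, profiling and data sharing are off unless the child deliberately turns them on, and location visibility resets to ‘off’ at the end of each session. All names are hypothetical.

```python
# Illustrative only: a hypothetical settings model reflecting the Default Settings,
# Geolocation, Profiling and Data Sharing standards for a child user.
from dataclasses import dataclass

@dataclass
class ChildPrivacySettings:
    # High-privacy defaults: optional processing is off until the user turns it on.
    geolocation_enabled: bool = False          # Geolocation: off by default
    profiling_enabled: bool = False            # Profiling: off by default
    data_sharing_enabled: bool = False         # Data Sharing: no disclosure by default
    location_visible_to_others: bool = False   # Visibility of the child's location to others

    def end_session(self) -> None:
        # Geolocation standard: options making a child's location visible to others
        # should default back to 'off' at the end of each session.
        self.location_visible_to_others = False

settings = ChildPrivacySettings()
settings.location_visible_to_others = True     # turned on by the child during a session
settings.end_session()
assert settings.location_visible_to_others is False
```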

What should businesses do?

There are five steps that businesses should take now to prepare themselves (as set out in the Code):

  • Step 1: Implement an accountability programme
  • Step 2: Have policies to support and demonstrate compliance
  • Step 3: Train staff
  • Step 4: Keep proper records
  • Step 5: Be prepared to demonstrate compliance with the Code 

What happens now?

  • The Code needs to be notified to the European Commission and laid before Parliament (in case there are any objections). This process will likely be concluded in July / August 2020.
  • Businesses will then have 12 months from the date the Code takes effect to implement the changes. Based on the timescales above, we anticipate that businesses will need to comply with the Code by around August/September 2021.
  • The ICO will enforce the Code in line with their Regulatory Action Policy and may impose fines under the Privacy and Electronic Communications Regulations (PECR) and/or GDPR, depending on the nature of the breach.

Brexit: New UK Guidance if there’s “No Deal”

Yesterday, the ICO published new guidance on the data protection implications of a “no deal” Brexit. This includes a “Six Steps to Take” guide, a blog with embedded guidance, and FAQs. In addition, the UK government published its plans for a “no deal” Brexit.

Here are the key points:

  • Substantive changes to GDPR rules: GDPR continues to apply under the EU Withdrawal Act, but the UK Government will amend it to remove references to “EU institutions and procedures” and to “Union or Member State law”.
  • ICO role: The ICO will remain the UK’s independent privacy regulator. It will no longer be a member of the European Data Protection Board, but the UK and EU have agreed to implement rules on co-operation between the ICO and the Board.
  • Data Transfers to EEA countries and Gibraltar: the UK will transitionally recognise all EEA states and Gibraltar as providing adequate protection for personal data.  Personal data continues to flow freely from the UK to these countries.  But this may be kept under review.
  • Data Transfers from the EEA to the UK: you need a transfer solution in place.  This may require re-papering with SCCs (making clear that the UK entity is the data importer) or putting another transfer solution in place.
  • Data Transfers under EU adequacy decisions: The UK will preserve the effect of the EU adequacy decisions on a transitional basis.  Data Transfers to these jurisdictions can continue uninterrupted.  This covers: Andorra, Argentina, Canada (commercial organisations), Faroe Islands, Guernsey, Israel, Isle of Man, Jersey, New Zealand, Switzerland, Uruguay and USA (under Privacy Shield framework). As Privacy Shield is an EU/US agreement, it is less clear how the UK can recognise it post-Brexit.  The ICO have actually said that Privacy Shield would be excluded from this arrangement but that the UK government’s intention is to make arrangements for it to continue to apply.  This will need a “watching brief”.  It may require an alternative solution to be in place for transfers from UK to US if these arrangements are not in place in time.
  • Data Transfers from countries with an existing EU adequacy decision to the UK: these transfers were based on an adequacy decision in place with the EU. It will be for each individual country to determine whether it will respect that decision in relation to transfers to the UK, so transfer solutions may be necessary.
  • Data Transfers from UK under EU Standard Contractual Clauses (SCCs): you are probably using SCCs to export data to countries like the US.  No action is required on these at this time provided you have SCCs in place.  The UK government plans to recognise EU SCCs.  The ICO will be given the power to issue new SCCs (presumably customised for UK terminology) post-Brexit.
  • BCRs: Existing authorisations of BCRs made by the ICO continue to be recognised in UK law post-Brexit.  The UK will also recognise BCRs approved by other EU supervisory authorities pre-Brexit.  The DCMS paper suggests that post-Brexit, the ICO will continue to be able to authorise new BCRs but only under domestic law.  It is not clear why BCRs approved post-Brexit by the EU would not be potentially valid for transfers from the UK (as UK BCRs are for transfers from adequate jurisdictions).  BCRs (both approved and in-flight applications) will presumably need to transition to a new Lead Supervisory Authority.  Existing BCRs will also need to be updated to reflect the UK as a third country.
  • One Stop Shop:  If you’re only established in the UK post-Brexit (not the rest of the EU), you’ll lose the benefit of the “One Stop Shop”.  You will also lose the benefit of the “One Stop Shop” where you no longer undertake any cross-border processing in the EU due to Brexit (e.g. you previously processed only in two EU countries, one of which was the UK).  This may mean that in the event of a breach you would need to deal with both the ICO and the supervisory authorities in each of the relevant EU countries in which individuals are affected.  This raises the possibility of multiple enforcement actions (including fines).

There are a number of other significant implications:

  • Consider updating GDPR documentation (e.g. Article 30 records) and privacy notices (e.g. references to the UK as part of the EU and in relation to data transfers).
  • If you end up not established in the EU post-Brexit but are caught by the EU extra-territorial scope, you’ll probably need to appoint a Representative (one Representative in the jurisdiction in which you have the majority of your customers). Conversely, if you target products into or monitor data subjects in the UK but are not established here, you probably need to appoint a UK Representative.
  • Consider reviewing DPIAs (if they involve data transfers).

The DCMS plans to issue draft regulations soon to implement the above proposals.


What have the ICO said about data breaches?

The ICO have been discussing data breach reporting under GDPR in a new webinar.

Here are the key points:

  • GDPR introduces mandatory breach reporting.  This applies to accidental breaches and internal breaches – not just those that are deliberate or are about losing personal data externally.  Don’t forget about integrity and availability breaches (e.g. damage to records due to fire or flood as well as ransomware).  Temporary loss of data can, according to EDPB Guidance, be a personal data breach.
  • This does not mean that you have to report all general breaches of GDPR (e.g. failure to present a suitable privacy notice).  Breach reporting only applies to breaches of confidentiality, integrity or availability of data: the so-called “CIA Triad”.  Similarly, breach notification does not apply to records relating to deceased persons (who are not covered by GDPR).
  • The 72 hour timeline kicks in from “awareness” of the breach.  This equates to having a “reasonable degree of certainty” that the breach has occurred.  The ICO gave an example of a customer who complains that he/she has received someone else’s information.  This would constitute “awareness”.  It may be less clear, at the initial stage, whether an IT issue has resulted in a personal data breach as that may require more forensic/detailed investigation.
  • In addition to deciding whether or not to notify a breach, you should always undertake a risk assessment to identify the scope and extent of the breach, contain it and stop it repeating or harming individuals.  This risk assessment will also impact the shape of the overall response.
  • If a personal data breach has occurred and you are aware of it, you then need to assess the level of risk to decide whether or not to notify the ICO.  Notification is required where there is more than a remote chance of harm: that makes a risk to the rights and freedoms of individuals “likely”, triggering Article 33.  Mere inconvenience is not enough.
  • Article 33 sets out a number of pieces of information that should be provided with a notification.  It’s no excuse not to be able to provide this, even within the 72-hour timeline.  So basic information will be required even if further information is provided later, as permitted by GDPR.
  • The 72-hour deadline is “72 real hours” – so this includes evenings and weekends.  If a breach comes to your attention on Friday morning, it will need to be reported by Monday morning (see the short sketch after this list).  Extra resources are likely to be required to respond promptly.
  • The ICO response will be quick (same day/next day) for serious breaches.  For less serious breaches, the ICO may get back to you in a couple of weeks.
  • You can report a breach by phone (available during working hours), or web form (available 24/7).  You don’t have to use the official ICO web form, but the ICO prefers it if you do as it contains all the relevant information.
  • You always have to record breaches in your data breach log – the ICO can come and inspect this later if they wish.
  • The ICO acknowledge the risk of “notification fatigue” and say that this is why notification to data subjects under Article 34 is only required where a high risk to the rights and freedoms of the individuals concerned is likely.
  • The sectors that have typically notified data breaches since 25 May are health, education, general business, local government and some law firms.
  • The ICO repeat their general advice that “not every breach needs to be reported”.  It’s also the controller’s decision as to whether or not to report.  They also mention practical points such as an example where someone reported a loss of payslips and rang back a couple of hours later to say they had found them!  Better not to do this.
  • The webinar also covered a number of live questions: One question was whether to report the situation where access rights to particular data have been inappropriately broad, but there is no evidence of actual unauthorised access.  The ICO think that this could be reportable if the situation had been allowed to last for a long time so there is, therefore, a significant risk of unauthorised access.  Presumably, if this happened for a short time, you could argue that the likelihood of unauthorised access was very limited.
  • Someone else asked about data sent to an old address, where it turned out that the data subject had moved without telling the controller.  This is not a breach of security, although you could separately ask yourself whether sending sensitive information by post is an acceptable security risk in the first place.
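
To illustrate the “72 real hours” point mentioned above, here is a minimal sketch (purely illustrative; the dates and times are hypothetical) of how the Article 33 clock runs from the moment of awareness, with no allowance for evenings or weekends.

```python
# A minimal sketch of the Article 33 deadline: 72 elapsed hours from awareness,
# counting evenings and weekends. Dates and times below are illustrative only.
from datetime import datetime, timedelta

def article_33_deadline(awareness: datetime) -> datetime:
    return awareness + timedelta(hours=72)

# Awareness on a Friday morning means the deadline falls on Monday morning.
awareness = datetime(2018, 6, 1, 9, 0)   # Friday 09:00 (illustrative date)
print(article_33_deadline(awareness))    # 2018-06-04 09:00:00 -> Monday 09:00
```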

ICO Release Annual Report

The Information Commissioner’s Office have released their Annual Report for 2018.  This blog summarises the key messages.

Information Commissioner’s Thoughts

Elizabeth Denham highlights the following in her foreword to the Report.

  • The ICO has been involved in producing significant GDPR guidance in the last 12 months and has also run an internal change management process to ensure it is up to the demands placed upon it by GDPR (think: extra staff, new breach reporting functions and helplines).
  • The ICO’s pay levels have fallen out of step with the rest of the public sector.  UK Government has given the ICO 3-year pay flexibility and some salaries have increased.
  • The ICO has taken decisive action on nuisance calls and misuse of personal data.
  • The ICO began an investigation into over 30 organisations in relation to the use of personal data and analytics for political campaigns.
  • The ICO launched a “Why Your Data Matters” campaign – designed to work as a series of adaptable messages that organisations can tailor to inform their own customers of their data rights.

The Laws that the ICO Regulates

The Report refers to the Data Protection Act 1998 and the new Data Protection Act 2018 as well as the Freedom of Information Act 2000.

But don’t forget about the Privacy and Electronic Communications Regulations and the Investigatory Powers Act 2016. The ICO is also an authority to which organisations can report cyber incidents under the new Network and Information Systems Regulations 2018 (NIS).

Key Guides

The ICO has produced a Guide to GDPR – definitely worth a read.

The ICO has also produced an introduction to the Data Protection Bill and a Guide to the Law Enforcement Directive as well as significant other guidance.

The ICO have also supported other bodies in producing their own GDPR guidance:

  • Direct Marketing Association;
  • The National Health Service (NHS);
  • The Health Research Authority; and
  • The Department for Education.

There is also new guidance on international transfers to reflect the Privacy Shield, and guidance on the new case law on the concept of “disproportionate effort” in the Subject Access Code of Practice.

Data Sharing Codes of Practice

The ICO engaged with UK Government on data sharing codes arising from the Digital Economy Act 2017. This includes the publicly available register of information sharing agreements.

ANPR

Automatic Number Plate Recognition data used to be retained for 2 years. The ICO and the Surveillance Camera Commissioner raised concerns and the UK police have agreed to reduce the retention period to one year.

Participation in Global Networks

The ICO led the 2017 Global Privacy Enforcement Network Sweep, with 24 regulators around the world looking at the control users have over their personal information. The privacy notices of 455 websites were assessed and were often found to be inadequate.

Civil Monetary Penalties – Fines

The ICO issued 11 fines for serious security failures. The joint highest fine ever (£400k) was served on Carphone Warehouse.  There were significant fines for nuisance callers and spammers.

Criminal Investigations

The ICO launched 19 prosecutions and gained 18 convictions for data theft under the old Section 55 Data Protection Act 1998.

It also ran two investigations into the acquisition of data in the automotive repair industry and into alleged breaches of Section 55 DPA 1998 by clients tasking private investigators to unlawfully obtain personal data. The casework involving the prosecution of private investigators and their clients continues.

Self-Reported Data Breaches

The number of self-reported breaches has increased by 29%. Under GDPR it is mandatory to report certain data breaches to the ICO.  There has been a significant spike in GDPR breach notifications since 25 May 2018.

The sector that reported the largest number of breaches was health (37% of all cases).

Telephone Preference Service (TPS)

This is the central UK opt out register where individuals can object to telemarketing calls. In January 2017, the ICO took over responsibility for running TPS.  This enables quicker receipt and assessment of intelligence for ICO enforcement teams.

Funding/Notification Fees

Registration/notification fees collected in the last year totalled £21 million. This regime has, with effect from 25 May 2018, been replaced by a new fee regime which will be used to fund the ICO going forward.

Helpline calls

For obvious reasons, there has also been a spike in calls to the ICO helpline. Call numbers have increased by 24.1%.  Live chat has increased by 61.5%.  Written advice has increased by 40%.  Needless to say, the ICO is expanding its operations and recruiting more staff.

Brexit

We think the ICO probably has enough on its plate with GDPR, e-privacy and all the new guidance. Then there’s Brexit!  There’s actually little comment on Brexit in the Annual Report, other than to flag that it is one of the issues facing the ICO.  Then again, much of the detail on this has yet to be worked out.

The Commissioner concludes in her “foreword” that “the ICO is the proactive digital regulator the UK needs for ongoing challenges of upholding information rights in the digital world”.

Much more work to be done!


ICO request feedback on profiling and automated decision-making

The ICO has published a request for feedback on the GDPR rules on profiling and automated decision-making. They say it is not guidance, just initial thoughts, but we think it is a good steer on what the ICO thinks are the key issues.  You can respond with feedback to the ICO by 28 April, or just use it to “issue spot”.  Both would be a pretty good use of time.

Key points:

  • Don’t be fooled by the “legal / similar effects” threshold in Art 22. The general GDPR rules will affect lots of business operations which involve profiling. This is not just about profiling having “legal effects” like e-recruitment.
  • Consider the risk of unfair discrimination.  How do you ensure your profiling is fair?  How does that algorithm actually work?  Check out “Weapons of Math Destruction” by Cathy O’Neil.  What is an acceptable error rate for inferences?  (A simple illustration follows this list.)
  • Think about raw input and output data and how to apply GDPR rights and obligations to each tranche.
  • How do you validate compliance where some/all of the process is carried out by a third party / vendor?  All the fairness, transparency and data hygiene rules apply.
  • Consent is mentioned as a legal basis but won’t work unless there is a genuine free choice as per the recent ICO consultation.
  • Beware of inadvertently generating special category data. This usually requires explicit consent.
  • Consider practical steps like explaining the “logic” involved in decisions with legal/similar effects in your privacy policies and in responses to DSARs.
  • Get ready to justify profiling if someone exercises their right to object. The other rights also apply of course.
  • Consider algorithmic auditing, seals, codes of conduct and ethical review boards to underpin profiling safeguards.
  • There will be a wide range of profiling requiring a DPIA: this includes location tracking, loyalty programmes and online behavioural advertising (OBA), as well as more obvious examples like credit scoring. DPIAs also apply to partly automated profiling with legal/similar effects, so this goes wider than Art 22, which only applies to decisions made solely by automated means.
  • Do not profile children where this has legal/similar effects and is solely automated. This is a prohibition.
  • ICO to publish guidance on children’s data later this year (to cover gateway conditions / age verification / parental authorisation).
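
On the question of acceptable error rates and unfair discrimination mentioned above, here is a very simple sketch of the kind of check an algorithmic audit of profiling might start from: comparing a model’s error rate across groups. The data, group labels and numbers below are entirely hypothetical.

```python
# Illustrative only: a basic audit comparing error rates across groups, as a starting
# point for algorithmic auditing of profiling. All data below is hypothetical.
from collections import defaultdict

def error_rates_by_group(records):
    # records: iterable of (group, predicted, actual)
    errors, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

sample = [
    ("group_a", 1, 1), ("group_a", 0, 1), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 1),
]
rates = error_rates_by_group(sample)
print(rates)  # {'group_a': 0.25, 'group_b': 0.75} -- a large gap would warrant review
```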