
ICO Publishes Age Appropriate Design Code of Practice for Online Products and Services accessed by Children

On 21 January 2020, the ICO published the Age Appropriate Design Code of Practice (the "Code").

Who does the Code apply to?

  • The Code applies to information society services (ISS) that are likely to be accessed by under-18s. The ISS does not have to be deliberately directed at children.
  • This includes any online products or services (e.g. apps, programs, websites, games). This also includes Internet of Things (IoT) connected toys and devices – whether with or without a screen.
  • The Code applies to ISS with an establishment in the UK, as well as to ISS established outside the UK that target goods and services to, or monitor, children in the UK.

What does the Code say?

The Code sets out 15 headline “standards of age appropriate design”:

  • Best Interests: The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.
  • Data Protection Impact Assessments: You should undertake a DPIA before launching the product or service to assess and mitigate risks to the rights and freedoms of children.
  • Age Appropriate Application: You should take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing OR apply the standards in this code to all your users instead.
  • Transparency: The privacy information you provide to users must be concise, prominent, and in clear language suited to the age of the child.
  • Detrimental Use of Data: You should not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions, or Government advice.
  • Policies and Community Standards: Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).
  • Default Settings: Settings must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child).
  • Data Minimisation: Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.
  • Data Sharing: You should not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
  • Geolocation: You should switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation to be switched on by default, taking account of the best interests of the child), and provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others should default back to ‘off’ at the end of each session.
  • Parental Controls: If you provide parental controls, give the child age appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.
  • Profiling: You should switch options which use profiling ‘off’ by default (unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).
  • Nudge techniques: You should not use nudge techniques to lead or encourage children to provide unnecessary personal data or turn off privacy protections.
  • Connected Toys and Devices (IoT): If you provide a connected toy or device, ensure you include effective tools to enable conformance to this code.
  • Online Tools: Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.

What should businesses do?

There are five steps that businesses should take now to prepare themselves (as set out in the Code):

  • Step 1: Implement an accountability programme
  • Step 2: Have policies to support and demonstrate compliance
  • Step 3: Train staff
  • Step 4: Keep proper records
  • Step 5: Be prepared to demonstrate compliance with the Code 

What happens now?

  • The Code needs to be notified to the European Commission and laid before Parliament (in case there are any objections). This process will likely be concluded in July / August 2020.
  • Businesses will then have 12 months to implement the changes from the date the Code takes effect. Based on the timescales above, we anticipate the Code will take effect around August/September 2021.
  • The ICO will enforce the Code in line with their Regulatory Action Policy and may impose fines under the Privacy and Electronic Communications Regulations (PECR) and/or GDPR, depending on the nature of the breach.

Global Privacy Sweep Finds Privacy Issues in Children’s Apps

Last week, the Global Privacy Enforcement Network (GPEN) released the results from their third annual Privacy Sweep. Twenty-nine privacy enforcement authorities spread across 21 countries reviewed 1,494 websites and mobile applications (apps) either targeted to or popular among children – the theme of this year’s sweep.

Canadian regulators participating in the international sweep included the Office of the Privacy Commissioner of Canada (OPC), the Office of the Information and Privacy Commissioner of Alberta and the Office of the Information and Privacy Commissioner of British Columbia, who focussed their review on websites and apps based in Canada.

Among the overall findings by GPEN, 67% of the websites and apps examined collected personal information from children, such as names, photos, videos, audio, addresses and phone numbers.

“Too many developers are collecting particularly sensitive personal information such as photos, videos and the location of children, and often allowing it to be posted publicly, when there are clearly ways to avoid it,” said Privacy Commissioner Daniel Therrien in a statement. The OPC has repeatedly recommended in its publications and report of investigations that the best practice is to never collect personal information from children.

The OPC noted that many companies are developing innovative, creative and dynamic technological tools that balance the purpose of the website or app while respecting privacy protection.

The Privacy Sweep also found that 51% of websites and apps reviewed indicated they may disclose children’s personal information to third parties. It further found that 58% of websites and apps reviewed, while purporting not to collect personal information, redirected children to sites and apps that did collect it, typically via an advertisement or a contest that sometimes appeared to be part of the original website or app.

In terms of parental or other adult supervision or control, only 31% of websites and apps reviewed had any protective control in place that would limit the collection of personal information, and even fewer (24%) had some form of parental involvement.

The focus of privacy protection of vulnerable groups, such as youth and children, is one of Commissioner Therrien’s current privacy priorities.

The OPC also provided recommendations for companies to consider when collecting, using or disclosing personal information that may involve children, including:

  1. Avoid collecting any personal information from children.
  2. Instead of requiring children to disclose their name, photo or other personal information – for example, to register with a website or app – companies should use protective controls, such as pre-programmed avatars and usernames that children can select instead.

The goals of the GPEN Privacy Sweep include creating awareness and encouraging compliance with privacy legislation; however, GPEN and the OPC note that the results of the Privacy Sweep could lead to follow-up action being taken, including outreach and investigations.



Children’s Online Privacy Protection: U.S. Developments Compared to Canada

There were two important developments in the U.S. regarding children and mobile technologies.

FTC Staff Report

On December 10, 2012, the U.S. Federal Trade Commission (FTC) released a Staff Report entitled “Mobile Apps for Kids: Disclosures Still Not Making the Grade”. The Staff Report examines the privacy disclosures and practices of mobile apps. The survey was conducted during the summer of 2012. FTC Staff tested 400 apps. Among the interesting survey results:

  • 80% of the apps (319) apparently did not disclose any information about the apps’ privacy practices prior to download. Many of those that contained privacy disclosures “consisted of a link to a long, dense, and technical privacy policy” according to the FTC Staff Report.
  • 60% of the apps (235) transmitted the device ID to the developer, an advertising network, an analytics company, or other third party. The most common transmission was to advertising networks (by a large margin). Only 20% (44) of the 223 apps that transmitted device ID, geolocation or phone number to third parties provided any privacy disclosures.
  • 58% of the apps (230) contained in-app advertising, but only 15% of the apps (59) disclosed information about the presence of advertising.
  • 17% of the apps (66) contained in-app purchase functionality.

The FTC Staff Report states that FTC Staff have commenced a number of investigations where they have identified gaps between company practices and disclosures, which could constitute violations of the U.S. Children’s Online Privacy Protection Act (COPPA) or the Federal Trade Commission Act’s prohibition on deceptive practices.

In Canada, app developers should be aware of provincial consumer protection legislation and the federal Competition Act, which contain prohibitions on deceptive practices, as well as federal and provincial privacy legislation, such as the Personal Information Protection and Electronic Documents Act (PIPEDA), which requires transparency with respect to an organization’s practices regarding the collection, use, retention and disclosure of personal information. In addition, app developers marketing apps with in-app advertising should be aware of Quebec’s Consumer Protection Act, which prohibits advertising to children under 13 years of age.

Amendments to the COPPA Rule

On December 19, 2012, the FTC adopted the final amendments to the Children’s Online Privacy Protection Rule (COPPA Rule). Highlights from the amendments include:

  • Expanded Definition of Personal Information. The new definition includes geolocation information, photos, videos and audio files that contain a child’s image or voice. Persistent identifiers such as a unique device ID or MAC address may also be personal information.
  • Extension of Rule to Third Party Applications. The FTC perceived a gap in the existing COPPA Rule that permitted advertising networks, third-party plug-ins and other applications to collect personal information from children without parental consent. The amended COPPA Rule provides that an organization will be considered an “operator” of a website directed to children if it benefits from the collection of information by a third party, even where the third party is not acting as its agent. This places an obligation on the operator to obtain consent to the collection of the personal information collected by the third party. FTC Commissioner Ohlhausen dissented from the new COPPA Rule on the basis that this extension went beyond what the statute permitted.
  • New Rules for Verifiable Parental Consent. The new COPPA Rule permits obtaining consent by way of electronically scanned parental consent, video conferencing, government-issued identification or payment systems that provide notice to the primary account holder of each discrete transaction.

Canada has no equivalent to COPPA; however, the Office of the Privacy Commissioner of Canada (OPC) has focused on children’s online privacy as a priority. In the OPC’s guidance regarding online behavioural advertising, the OPC stated:

“The most obvious type of information that should not be tracked involves children’s information. Operators of web sites that are targeted at children should not permit the placement of any kind of tracking technologies on the site. It is hard to argue that young children could meaningfully consent to such practices, and the profiling of youngsters to serve them online behaviourally targeted ads seems inappropriate in such circumstances. The Canadian advertising industry has indicated that it will require its members to not knowingly target children; this is a position that the OPC endorses and encourages.”

Given the increasing focus on meaningful consent to the collection of personal information, it may be only a matter of time before Canadian privacy commissioners issue a decision regarding the collection and use of personal information about children. In the meantime, app developers hoping to offer their apps in the U.S. should take note of the new COPPA Rule.



ICO requests feedback on profiling and automated decision-making

The ICO has published a request for feedback on the GDPR rules on profiling and automated decision-making. The ICO describes the paper as initial thoughts rather than guidance, but we think it is a good steer on what the ICO considers to be the key issues. You can respond with feedback to the ICO by 28 April, or simply use the paper to “issue spot”. Either would be a pretty good use of time.

Key points:

  • Don’t be fooled by the “legal / similar effects” threshold in Art 22. The general GDPR rules will affect lots of business operations which involve profiling. This is not just about profiling with “legal effects”, such as e-recruitment.
  • Consider the risk of unfair discrimination. How do you ensure your profiling is fair? How does that algorithm actually work? Check out “Weapons of Math Destruction” by Cathy O’Neil. What is an acceptable error rate for inferences?
  • Think about raw input and output data and how to apply GDPR rights and obligations to each tranche.
  • How do you validate compliance where some/all of the process is carried out by a third party / vendor?  All the fairness, transparency and data hygiene rules apply.
  • Consent is mentioned as a legal basis but won’t work unless there is a genuine free choice as per the recent ICO consultation.
  • Beware of inadvertently generating special category data. This usually requires explicit consent.
  • Consider practical steps, such as explaining the “logic” of legal-effects decision-making in privacy policies and in responses to DSARs.
  • Get ready to justify profiling if someone exercises their right to object. The other rights also apply of course.
  • Consider algorithmic auditing, seals, codes of conduct and ethical review boards to underpin profiling safeguards.
  • There will be a wide range of profiling requiring a DPIA: this includes location tracking, loyalty programmes and OBA, as well as more obvious examples like credit scoring. DPIAs also apply to partly automated profiling with legal/similar effects, so this goes wider than Art 22, which applies only to decisions made solely by automated means.
  • Do not profile children where this has legal/similar effects and is solely automated. This is a prohibition.
  • ICO to publish guidance on children’s data later this year (to cover gateway conditions / age verification / parental authorisation).

Canadian Privacy Compliance: Time for your Online Checkup

In a previous post on online behavioural advertising (OBA), we wrote about the Office of the Privacy Commissioner’s “call to action” to stakeholders in the advertising industry on OBA, and we discussed the industry’s response to that call: self-regulation.

2012 – Call to Action: the Privacy Commissioner’s Expectations 

In its 2012 Policy Position on Online Behavioural Advertising, the Office of the Privacy Commissioner (OPC) stated that it “may” be acceptable to rely on implied or opt-out consent when tracking and targeting individuals for OBA purposes, “provided that”:

  • Individuals are made aware of the purposes for the practice in a manner that is clear and understandable – the purposes must be made obvious and cannot be buried in a privacy policy. Organizations should be transparent about their practices and consider how to effectively inform individuals of their OBA practices, by using a variety of communication methods, such as online banners, layered approaches, and interactive tools;
  • Individuals are informed of these purposes at or before the time of collection and provided with information about the various parties involved in OBA;
  • Individuals are able to easily opt-out of the practice – ideally at or before the time the information is collected;
  • The opt-out takes effect immediately and is persistent;
  • The information collected and used is limited, to the extent practicable, to non-sensitive information (avoiding sensitive information such as medical or health information); and
  • Information collected and used is destroyed as soon as possible or effectively de-identified.

2013 – Industry Response: Self-Regulation

In response, the industry developed and launched the Canadian Self-Regulatory Program for Online Behavioural Advertising (the “Ad Choices program”), an initiative tailored to meet the requirements of Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA), as well as the OPC guidelines.  The initiative is led by the Digital Advertising Alliance of Canada (DAAC), and is monitored and administered by the non-profit industry body Advertising Standards Canada (ASC). A growing number of brands and media companies have registered for the program.

We noted in our previous post that the OPC would no doubt be watching to see whether and how industry self-regulation meets its expectations under PIPEDA and its OBA guidelines.  We also noted, however, that the self-regulatory solution was not designed to cover all OBA activities.  For example, certain types of activities are expressly excluded from the Ad Choices program, such as “online advertising of entities within a web site they own or control” and “contextual advertising”, including ads based on the content of a web page being visited, a consumer’s current visit to a web page, and a search query.

Ongoing OPC Guidelines, Investigations and “Sweeps”

The OPC is not staying on the sidelines – it continues to take a keen interest in OBA and online consent more broadly.  For example, in January 2014, the OPC found that Google ads triggered by web surfing on health sites violated privacy rights.  As a result, Google committed to several measures, including closer monitoring of potential violations by advertisers.  In May 2014, the federal, British Columbia and Alberta Privacy Commissioners issued new guidelines for online consent, calling for transparent and dynamic privacy notices, and greater protections for personal information belonging to children and youth.

In 2015, the OPC is investigating websites visited by Canadians for compliance with OBA requirements.

The OPC has in past years conducted investigative and enforcement “sweeps”.  In 2013, the OPC led and participated in the first annual Global Privacy Enforcement Network (GPEN) Internet Privacy Sweep.  The sweep targeted privacy policies, and the OPC published the initial results of its investigations under the heading “The Good, the Bad, and the Ugly”. In 2014, the OPC again participated in the GPEN Sweep, investigating the transparency of privacy practices for 151 mobile apps that were made in Canada or frequently downloaded by Canadians.  The results of the 2014 Global Privacy Enforcement Network Sweep take the form of an overall, anonymous mobile app “report card”, ranking transparency to users, ease of access and readability on the small screen, and whether privacy information is available before download.

An OPC “report card” on OBA is expected to be released sometime in the Spring.


In the news:  see the recent Globe & Mail article “Watchdog to study ‘privacy compliance’ among Canadian advertisers” 

