
California Passes the Age Appropriate Design Code

On September 15th, 2022, the Governor issued a press release announcing the signing of bill AB 2273. The bill, modeled after the United Kingdom’s Age-Appropriate Design Code, establishes goals for protecting California’s children online. Enforcement is slated to begin July 1st, 2024.

What follows is a high-level overview of the bill, but I am not a lawyer, and this is not legal advice. Further, rulemaking will continue over the course of 2023 before the final requirements businesses will be subject to are fully settled.

Under the title, the California Attorney General has the power to solicit broad public participation and adopt further regulations to clarify the requirements listed below. Further, the California Legislature has created the California Children’s Data Protection Working Group to assist in developing best practices for the implementation of this law. From 2024 to 2030, the working group will submit reports to the Legislature regarding the recommendations outlined under Section 1798.99.32(d).

Important Definitions

(4) “Likely to be accessed by children” means it is reasonable to expect, based on the following indicators, that the online service, product, or feature would be accessed by children:

(A) The online service, product, or feature is directed to children as defined by the Children’s Online Privacy Protection Act (15 U.S.C. Sec. 6501 et seq.).

(B) The online service, product, or feature is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children.

(C) An online service, product, or feature with advertisements marketed to children.

(D) An online service, product, or feature that is substantially similar or the same as an online service, product, or feature subject to subparagraph (B).

(E) An online service, product, or feature that has design elements that are known to be of interest to children, including, but not limited to, games, cartoons, music, and celebrities who appeal to children.

(F) A significant amount of the audience of the online service, product, or feature is determined, based on internal company research, to be children.

https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202120220AB2273

This definition is exceptionally broad. A service that is not designed for children may still be subject to this law if a significant number of children decide to use it. This may pose extensive risk to online services and businesses that do not have direct control over their audience. Clauses (D), (E), and (F) mean that a company which does not target children may still be subject to the law, and thus face enforcement action for failure to comply.

Who the bill applies to

Unless otherwise specified in the bill’s text, the bill’s provisions are designed to protect consumers under 18 years of age.

Business Requirements

A business that provides an online service, product or feature likely to be accessed by children must take the following actions:

Before any new service, feature, or product is offered to the public, complete a Data Protection Impact Assessment and maintain documentation of this assessment for as long as the service/feature/product is likely to be accessed by children. The business is required to review and update this assessment every two years.

The business is required to document any material detriment to children that arises from data management practices identified in the Data Protection Impact Assessment and create a timed plan to mitigate or eliminate the risk before the service/product/feature is accessible by children.

The business must turn over a list of all Data Protection Impact Assessments to the Attorney General within 3 days of written notice. The business must make a specific Data Protection Impact Assessment available for review within 5 days of written notice from the Attorney General.

The business is required to estimate, to a reasonable level of certainty, the age of the child user and apply privacy and data protection standards to those users, or, when that is not possible, apply the privacy and data protections afforded to children to all users.

The business must configure all default privacy settings provided to children to settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children.

Provide any privacy information, terms of service, policies, or community standards concisely, prominently, and in clear language suited to the age of the children likely to access the service/feature/product.

If the service/feature/product allows parents to monitor the child, the child must be provided an obvious signal when they are being monitored or tracked.

The business must enforce published terms, policies and standards, including those concerning children.

A business that provides features likely to be accessed by children is barred from taking the following actions:

  • Use the personal information of any child in a way the business knows, or has reason to know, is materially detrimental to the health, mental health or well-being of a child.
  • Profile a child by default, unless the business can demonstrate it has appropriate safeguards in place to protect children and profiling is necessary to provide the service/product/feature, and then only with respect to the aspects of the service/product/feature with which the child is actively and knowingly engaged, OR the business can demonstrate a compelling reason that profiling the child is in the best interests of the child.
  • Collect, sell, share, or retain any personal information that is not necessary to provide the online service/product/feature with which the child is actively and knowingly engaged, unless the business can show that doing so is in the best interests of children.
  • If the end user is a child, use personal information (including precise geolocation information) for any reason other than the reason it was collected, unless the business can show that doing so is in the best interests of children.
  • Use dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected, or to take any action that the business knows, or has reason to know, is detrimental to a child’s health, mental health, or well-being.
  • Use personal information collected to estimate a child’s age or age range for any other purpose, or retain that personal information longer than necessary to estimate age.

Data Protection Impact Assessments

Data Protection Impact Assessments are becoming increasingly common in privacy law and regulation, and they are central to the Age Appropriate Design Code. Under the law, the DPIA must document the purpose of the product/feature/service, how it uses children’s personal information, and any material risks that arise from the data management practices of the business. The DPIA shall address the following, to the extent applicable.

(i) Whether the design of the online product, service, or feature could harm children, including by exposing children to harmful, or potentially harmful, content on the online product, service, or feature.

(ii) Whether the design of the online product, service, or feature could lead to children experiencing or being targeted by harmful, or potentially harmful, contacts on the online product, service, or feature.

(iii) Whether the design of the online product, service, or feature could permit children to witness, participate in, or be subject to harmful, or potentially harmful, conduct on the online product, service, or feature.

(iv) Whether the design of the online product, service, or feature could allow children to be party to or exploited by a harmful, or potentially harmful, contact on the online product, service, or feature.

(v) Whether algorithms used by the online product, service, or feature could harm children.

(vi) Whether targeted advertising systems used by the online product, service, or feature could harm children.

(vii) Whether and how the online product, service, or feature uses system design features to increase, sustain, or extend use of the online product, service, or feature by children, including the automatic playing of media, rewards for time spent, and notifications.

(viii) Whether, how, and for what purpose the online product, service, or feature collects or processes sensitive personal information of children.

https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202120220AB2273

The business is required to complete a DPIA on or before July 1st, 2024, for any online service, product, or feature likely to be accessed by children that is offered to the public.

Enforcement

Following the fine structure of the California Consumer Privacy Act (CCPA), businesses that violate the title are subject to up to $2,500 per affected child for each negligent violation, or up to $7,500 per affected child for each intentional violation.
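Because the caps apply per affected child, per violation, the exposure scales with audience size. A rough back-of-the-envelope sketch (the per-child caps are from the bill; the affected-user count is entirely hypothetical):

```python
# Hypothetical exposure estimate under AB 2273's fine structure.
# Per-child caps come from the bill; the audience size is invented.
NEGLIGENT_CAP = 2_500    # max fine per affected child, negligent violation
INTENTIONAL_CAP = 7_500  # max fine per affected child, intentional violation

affected_children = 100_000  # hypothetical number of affected child users

max_negligent = affected_children * NEGLIGENT_CAP
max_intentional = affected_children * INTENTIONAL_CAP

print(f"Max negligent exposure:   ${max_negligent:,}")    # $250,000,000
print(f"Max intentional exposure: ${max_intentional:,}")  # $750,000,000
```

Even a single violation affecting a modest child audience produces a theoretical ceiling far beyond typical CCPA settlements, which is why the 90-day cure provision discussed below matters.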

Notably, if a business is in substantial compliance with the requirements of the law, the Attorney General shall provide written notice prior to initiating an enforcement action. The business would then have 90 days to cure the violation. This is in contrast to the CCPA, which will repeal its cure provisions on January 1st, 2023.

Personal Thoughts

The bill is exceptionally broad, and if not amended prior to enforcement it could pose extreme risk to companies that do not target children but are still subject to the law due to the actions of children. Effectively, this brings the entire world of COPPA-like requirements to general-audience companies, something those companies and services have largely been able to avoid to date.

Further, the law, while notable in its goals, presents a difficult choice for businesses that may now have to apply several different default configurations depending on the age of the user. For less risk, it seems to me you’d just automatically assume the strictest privacy controls, but if you wanted to have multiple configurations, how would that be possible? It’s not like you can easily run age verification across the board, especially without collecting even more personal data, and it’s not like children are always truthful regarding their age (or parents, for that matter).
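One way to think about the "just assume the strictest controls" approach is to fail closed: treat any user without a reliable age estimate as a child. A minimal sketch of that idea (all setting names and thresholds here are hypothetical illustrations, not anything prescribed by the bill):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical privacy settings; the field names are illustrative only.
@dataclass(frozen=True)
class PrivacySettings:
    profiling_enabled: bool
    precise_geolocation: bool
    data_sharing: bool

# High-privacy defaults, applied to anyone who is (or might be) a child.
CHILD_DEFAULTS = PrivacySettings(
    profiling_enabled=False,
    precise_geolocation=False,
    data_sharing=False,
)

# Looser defaults reserved for users affirmatively estimated to be adults.
ADULT_DEFAULTS = PrivacySettings(
    profiling_enabled=True,
    precise_geolocation=True,
    data_sharing=True,
)

def default_settings(estimated_age: Optional[int]) -> PrivacySettings:
    """Fail closed: with no reliable age estimate, assume the user is a child."""
    if estimated_age is None or estimated_age < 18:
        return CHILD_DEFAULTS
    return ADULT_DEFAULTS
```

The design choice worth noting is the `None` branch: an unknown age maps to the child defaults, so the business never has to prove a user was an adult after the fact. The cost, of course, is degraded functionality for adults who never go through age estimation.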

Further, I feel it would be difficult, if not impossible, to craft age-appropriate language regarding privacy, and even if that were possible, who would honestly expect a child to sit there and read it before clicking past to get to whatever it was they wanted to do in the first place?

The bill has also faced criticism from privacy and industry groups over concerns that widespread age verification will be put into use. It remains to be seen whether the bill will be challenged in court over its exceptionally broad terms. For now, however, companies would be well served to consider how this will impact them and factor the estimated work into their roadmaps for 2023.
