A Closer Look at the California Age-Appropriate Design Code Act
In California we are in the first legislative session since the passage of Proposition 24 (the California Privacy Rights Act, or CPRA), and not surprisingly there are a number of proposed privacy bills that enhance and/or build upon the CPRA. [IAPP does a masterful job of tracking the proposed bills via its CCPA-/CPRA-Related Legislation Tracker.] In this blog post I am going to drill down on Assembly Bill 2273 (AB 2273), Assemblymembers Buffy Wicks’ and Jordan Cunningham’s proposed California Age-Appropriate Design Code Act (ADCA), which is modeled after the United Kingdom’s Age-Appropriate Design Code (aka the Children’s Code).
To me the ADCA is one of three major California privacy bills in this legislative session that build upon the CPRA, and I am proud to say that I play a part in two of the three (but not the ADCA!). The first is Senate Bill 1059 (SB 1059), which enhances California’s Data Broker law and moves the registration and regulation of data brokers to the Proposition 24-created California Privacy Protection Agency (PPA). I proposed this bill back in December of 2020. The second is the “Truth in Elections” bill (SB 746), which lets consumers request that a business that collects personal information about them disclose whether or not it is using that information for a political purpose. My research into Google’s algorithmic bias during the 2020 California election was cited as part of the motivation behind the bill (see page 6 of the Senate Judiciary Committee analysis here). The third is the ADCA (AB 2273), which I will analyze in this blog post; it was written and driven by the children’s advocacy group 5Rights Foundation, the driving force behind the UK law upon which AB 2273 is based.
But before we dig into the ADCA, let’s first do an executive summary and provide some background context.
Executive Summary of the California Age-Appropriate Design Code Act (ADCA aka AB 2273)
Commencing on July 1, 2024, the ADCA would require businesses whose online products are likely to be accessed by children to comply with specified standards, including considering the best interests of children. Businesses that own these online products would be required to protect children’s data and limit children’s online exposure. As noted by the WSJ and Politico, examples of how the bill would protect kids online include limiting the collection of children’s data, curtailing addictive interfaces, defaulting to high privacy settings and banning nudges that push kids to weaken those settings, turning off precise geolocation tracking by default, and providing privacy information and/or terms of service in clear language suited to the age of the children likely to access that product.
Furthermore, the bill would require the PPA to establish and convene the California Children’s Data Protection Taskforce (CDPT) to evaluate best practices for the implementation of these provisions. The PPA’s board would be required to appoint the members of the taskforce by April 1, 2023, and those members would need expertise in areas such as privacy and children’s rights. The CDPT would then make recommendations on best practices. In addition, by April 1, 2024, the PPA must adopt regulations and publish guidelines in consultation with the CDPT.
Background on ADCA & AB 2273
This bill in California and a newly proposed federal bill (the Kids Online Safety Act) both come at a time when there is heightened awareness of children’s online safety, especially in light of the extensive amount of time kids have spent online during the Covid era. As noted by the Washington Post, the California and federal bills are
the result of months of hearings and a congressional investigation into tech companies’ handling of children’s safety, after documents were disclosed last year by Facebook whistleblower Frances Haugen. Although the documents touched on a variety of topics, internal research examining how Instagram may affect the mental health of teen girls, first reported by the Wall Street Journal, sparked new political will to update safeguards for minors online.
Not surprisingly, the New York Times wrote that “child safety is the new tech battleground.” And Assemblymember Wicks summarizes the problem succinctly:
it’s “long past due” to recognize “that we have an environment where kids are nudged to take part in risky behaviors, they’re exposed to predators, they’re exposed to harmful material, they’re encouraged into compulsive behavior.”
Unfortunately, existing laws in the US are outdated given the recent massive growth in social media usage, especially among kids. As noted by the Washington Post, the US’s existing children’s online privacy law, the Children’s Online Privacy Protection Act (COPPA), is “more than 20 years old and applies only to children younger than 13,” and policymakers think it “does not do enough to protect children in the age of social media.”
Much as major privacy regulation was first passed in Europe with the GDPR, various initiatives in Europe are leading the charge to further protect children online. The most significant is the UK’s Age-Appropriate Design Code, which was passed in 2020 and requires online products likely to be used by kids to modify their design to improve kids’ safety and privacy. The backer of this UK law is the children’s digital rights advocacy group 5Rights Foundation, which claims that the introduction of the law has resulted in “radical changes” for online products that are likely to be accessed by children, including:
TikTok and Instagram have disabled direct messages between children and adults they do not follow
The Google Play Store now prevents under-18s from viewing and downloading apps rated as adult-only
YouTube has turned off autoplay for under-18s, and break and bedtime reminders are turned on by default
Google has made SafeSearch the default browsing mode for all under-18s
Given the success of the UK law, California’s influence in setting the standard for the US in terms of regulations and consumer rights (aka “the California Effect”), and 5Rights’ passion for protecting kids online, it is not surprising that the group is deeply involved in supporting and driving the proposed California law.
It should be noted that Wicks (a Democrat) introduced a similar bill last year, but it failed to pass. AB 2273 appears to have a better chance of success: recent headlines have kept the issue in the spotlight, the new bill leverages an existing UK law that appears to be working, it builds upon the passage of CPRA / Proposition 24 (including leveraging the newly created Privacy Protection Agency), and it already has bipartisan support in that it is co-sponsored by Republican Jordan Cunningham.
More Details of ADCA & AB 2273
The bill requires businesses that provide “a good, service, or product feature likely to be accessed by a child” to comply with the following 8 requirements:
Consider the “best interests” of children when designing and developing the product in a manner that prioritizes “the privacy, safety and well-being of children.”
Undertake a Data Protection Impact Assessment for its products, i.e., a “systematic survey” of the risks associated with children’s access to those products.
Establish the age of users of the product so that the appropriate privacy and data protections can be applied to kids.
Maintain the highest level of online privacy for kids, including disabling profiling.
Provide privacy information and terms of services in clear language that kids can understand.
Provide an obvious “signal” if the product is being used to track location or online activity.
Uphold the privacy notices and policies it defines for kids.
Provide tools as part of the product offering to help kids exercise their privacy rights and report concerns.
That’s what businesses must do. Below are the actions that a business shall NOT take:
Use a child’s personal information in a way that may harm the child’s physical health, mental health, or well-being.
Collect any personal information that is not necessary to provide the product.
Collect or retain personal information if the business does not know the age of the user of the product.
Use a child’s personal information for any reason other than what was specified in the privacy notice.
Disclose a child’s personal information.
Collect any precise geolocation information from children by default, unless there is a compelling reason.
Collect any precise geolocation information without an obvious signal that this information is being collected.
Collect any sensitive personal information of a child by default.
Use dark patterns to trick kids into providing additional personal information or into forgoing privacy protections, or use dark patterns that could demonstrably harm kids’ physical health, mental health, or well-being; to me, this last prohibition is really about cracking down on the addictive nature of some online products.
In terms of regulation and enforcement, the bill requires the PPA to establish and convene the California Children’s Data Protection Taskforce (CDPT) to evaluate best practices for the implementation of these provisions. The PPA’s board must appoint the taskforce members by April 1, 2023, and those members must have expertise in areas such as privacy and children’s rights. The CDPT would make recommendations on best practices, and by April 1, 2024, the PPA must adopt regulations and publish guidelines in consultation with the CDPT.
Summary
The goal of AB 2273 is consistent with the intent of Proposition 24 (CPRA); in fact, one of the major points of emphasis in the campaign for Proposition 24 was that it further enhanced the California Consumer Privacy Act (CCPA) by adding capabilities to safeguard children. CPRA did this by tripling fines for the collection and sale of children’s private information. Proposition 24 also required opt-in consent for the sale of personal information from consumers under 16.
I appreciate that AB 2273 takes CPRA’s online safeguards for children even further by imposing additional obligations on businesses to protect kids online. I also think that having a focused task force dedicated to children’s data protection, established under the umbrella of the PPA, is an excellent idea. Furthermore, the ability to adopt regulations in this area will allow the PPA to stay current with rapid changes in technology. Hence, I am supportive of AB 2273.
With that being said, I do see some areas of improvement, which I will detail in my next blog post.