How SB 362 — the California Delete Act — Helps Protect Kids’ Online Safety
[Update October 2023: the California Delete Act aka SB 362 was signed into law on October 10, 2023 by Governor Newsom. My analysis of the law can be found here.]
California Senate Bill 362 (SB 362) — aka the California Delete Act — would create an online portal for consumers to request that data brokers delete any data they have on the consumer and no longer track them. I have documented in the past how SB 362 addresses the growing concerns over the weaponization of data collected by data brokers that even John Oliver has called attention to. I have also done deep dives on SB 362’s positive impact on immigrant rights, domestic violence survivors, and reproductive rights. In this blog post, I will discuss how SB 362 can help protect kids’ online safety.
Note that as of the end of June 2023, the bill has passed the California State Senate, passed the State Assembly's Privacy and Consumer Protection Committee (gaining additional legislative sponsors in doing so), and has now moved on to the State Assembly Appropriations Committee. [Full disclosure: I proposed this bill to Senator Josh Becker and co-drafted it.]
As I discuss in my book Containing Big Tech (and you can check out an overview of the book here and/or click here for updates and annotations), there is significant online spying on children, not only by Big Tech firms but also by data brokers. As I noted:
To give you a deeper feel for the pervasiveness of online spying on children, in 2022, the Washington Post reported on a study that showed that more than two-thirds of the one thousand most popular Apple iPhone apps that are likely to be used by kids would collect and send behavioral data to third parties. On Google Android, that number was 79 percent. Other studies have shown that 90 percent of educational apps sent data back to ad tech firms, and two-thirds of the apps played by a sample group of 124 preschool-aged kids also collected and shared behavioral data. A 1998 US federal law—the Children’s Online Privacy Protection Act (COPPA)—bans collecting data from children under thirteen without parental consent. But the law calls for “actual knowledge” that users are children. The Washington Post noted that Big Tech and app vendors use this as a loophole despite marketing features targeting children.
Furthermore, Common Sense Media recently published a report that analyzed the privacy practices of over 200 apps and found that “nearly three-quarters (73%) are still monetizing kids’ and families’ personal information in some way, such as tracking behavior and sharing that data with advertisers.”
This is of course very concerning, because kids have fewer developmental tools to resist marketing messages and are likely to accept ads as truthful, unbiased, and accurate. As I further note in my book:
For example, research estimates that 75 percent of eight- to eleven-year-olds cannot tell the difference between ads and other online content. Only 25 percent of eight- to fifteen-year-olds can identify the top results from a Google search as paid advertisements, despite the ads being clearly labeled as such. And given that kids have always been vulnerable to the power of advertising and don’t fully understand the purposes behind it, this “super-charged” form of advertising that targets them personally increases the risks associated with this vulnerability, including making them more susceptible to manipulation.
Data brokers, through the use of third-party cookies and mobile SDKs, do in fact cast a wide net and collect data about children. This raises the fundamental question of whether personal and behavioral data, which can be traced back to an individual by correlating multiple data points, should be mined and extracted from children. I believe, as do many others, that it should not be: not only by businesses that we have a direct relationship with, but also by entities that we have no direct relationship with and whose existence is based on collecting and selling our personal information (which, by the way, is the definition of a data broker in SB 362).
For further analysis of the risks associated with data brokers collecting data on kids, check out Privacy Rights Clearinghouse’s blog post entitled “The California Delete Act and Children's Privacy.”
How SB 362 Helps Protect Kids’ Online Safety
First and foremost, SB 362 requires that data brokers disclose, upon registration with the California Privacy Protection Agency (CPPA), whether they collect the personal information of minors. This brings California in line with Vermont’s data broker registry, which has the same requirement, and it should not be a burden for most data brokers, since they have been reporting this information for years to meet Vermont’s requirements.
Second, SB 362 directs the CPPA to create a simple way for Californians to direct all data brokers to delete their personal information, free of charge. The bill also supports the ability of a consumer’s authorized agents (which could include a parent) to aid in the deletion request. The cool thing is that the deletion request also covers any new data on the consumer (or child) that the data broker collects after the consumer’s request.
Finally, the bill provides that after a consumer (including a parent acting as an agent on behalf of their child) has submitted a deletion request and a data broker has deleted the consumer’s data, the data broker shall not sell or share new personal information about the consumer unless the consumer requests otherwise. Basically, this means that going forward, the child is protected from having their personal data sold or shared on an ongoing basis.
Taken all together, this means that SB 362, the California Delete Act, dramatically raises the bar on protecting kids’ online safety vis-à-vis data brokers who collect and sell our personal data. It also represents an excellent complement to the California Age-Appropriate Design Code Act that passed last year, as well as the privacy enhancements for children that were added by the CPRA (Prop 24), which amended the CCPA.