California Enacts New Privacy Laws

October 19, 2025 6:30 pm


Recently, California Governor Gavin Newsom signed into law several privacy and related bills, including new laws governing browser opt-out preference signals, social media account deletion, data brokers, reproductive and health services, age signals for app stores, social media “black box warning” labels for minors, and companion chatbots. This blog summarizes the statutes’ key takeaways.

  • Opt-Out Preference Signals: The California Opt Me Out Act (AB 566) will require businesses that develop or maintain browsers to include functionality configurable by a consumer that enables the browser to send an opt-out preference signal. Additionally, a business that develops or maintains a browser must make clear to a consumer in public disclosures how the opt-out preference signal works and the intended effect of the opt-out preference signal. The law states that a business that maintains or develops a browser that includes the opt-out preference signal shall not be liable for a violation of the title by a business that receives the opt-out preference signal. AB 566 will take effect January 1, 2027, and provides the California Privacy Protection Agency (“CPPA”) rulemaking authority.
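AB 566 does not prescribe a wire format for the opt-out preference signal; existing opt-out signals, such as the Global Privacy Control convention recognized under current California regulations, travel as an HTTP request header (`Sec-GPC: 1`). As a purely illustrative sketch, assuming the AB 566 signal resembles that convention (CPPA rulemaking may specify something different), a receiving business might detect it like this:

```python
# Hypothetical sketch: server-side detection of a browser opt-out
# preference signal, assuming it follows the Global Privacy Control
# convention of a "Sec-GPC: 1" request header. AB 566 itself does not
# mandate this format; CPPA rulemaking will supply the details.

def opt_out_requested(headers: dict) -> bool:
    """Return True if the request carries an opt-out preference signal."""
    # HTTP header names are case-insensitive; normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

A business honoring the signal would treat `opt_out_requested(...) == True` as a consumer request to opt out of sale or sharing for that browser.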
  • Social Media Account Deletion: AB 656 will require social media platforms that generate more than $100M per year in gross revenues to provide a “clear and conspicuous” button to complete an account deletion request. “Social media platform” is defined by reference to Section 22675 of the California code as a “public or semipublic internet-based service or application that has users in California” where (1) a “substantial function” of the service or application is to connect users and enable them to interact socially with each other and (2) the service or application allows users to construct a public or semipublic profile, populate a list of users with whom the individual shares a social connection, and create or post content viewable by other users. If verification is needed for the account deletion request, it must be provided in a cost-effective and easy-to-use manner through preestablished two-factor authentication, email, text, telephone, or messaging means.
  • Data Brokers: SB 361 amends the California data broker registration law (the “Delete Act”) to require additional disclosures from brokers when they register with the CPPA. Specifically, data brokers registering with the CPPA will be required to provide certain new information, such as whether the data broker collects names, addresses, phone numbers, mobile advertising identifiers, precise geolocation, or status related to union membership, sexual orientation, and gender identity, among other topics. Additionally, data brokers will be required to disclose whether they sold or shared consumers’ data with a foreign actor, a federal or state government, law enforcement, or a developer of a generative AI (“GenAI”) system or model in the past year. A developer of a GenAI system is defined as a business, person, corporation, or similar entity that designs, codes, produces, or substantially modifies a GenAI system. A GenAI system is defined as an AI system that can generate derived synthetic content, including text, images, video, and audio, that emulates the structure and characteristics of the system’s training data.
  • Reproductive and Health Services: AB 45 will amend existing law to provide additional privacy protections for persons seeking or providing reproductive and health services at a family planning center. AB 45 will prohibit the collection, use, disclosure, sharing, sale, or retention of personal information of any person physically located at or within 1,850 feet of a family planning center, except to perform a requested service or provide requested goods. Additionally, AB 45 will prohibit geofencing an entity that provides in-person health services in order to, among other things, identify or track a person seeking, receiving, or providing health care services or to send advertisements related to these sensitive locations or health services. In addition to civil penalties, the statute also provides a private right of action for certain violations.
  • Age Signals for App Stores: The Digital Age Assurance Act (AB 1043) will require an operating system provider to collect birth date or age from account holders at account setup. Operating system providers must use this age information to provide an age signal to application developers, who will be required to request that information when a user downloads and launches an application. The law applies broadly to operating systems on a computer, mobile device, or any other general purpose computing device. The law will take effect on January 1, 2027.
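AB 1043 contemplates the operating system deriving an age signal from the birth date or age collected at account setup and passing it to developers on request. The statute and later guidance control the exact signal format; as an illustration only, assuming the signal is a simple age bracket (the boundaries below are an assumption, not the statute’s text), the derivation might look like:

```python
from datetime import date

# Hypothetical sketch: deriving an age-bracket signal from a birth
# date collected at OS account setup. The bracket boundaries used here
# (under 13, 13-15, 16-17, 18+) are an illustrative assumption; AB 1043
# and any implementing guidance govern the actual signal format.

def age_bracket(birth_date: date, today: date) -> str:
    # Age in whole years, subtracting one if the birthday has not
    # yet occurred in the current year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    if age < 13:
        return "under_13"
    if age < 16:
        return "13_15"
    if age < 18:
        return "16_17"
    return "18_plus"
```

An application developer would request this bracket, rather than the raw birth date, when a user downloads and launches the app.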
  • Social Media “Black Box” Warning Labels: The Social Media Warning Law (AB 56) will require covered platforms to display a mental health “black box warning” label to users under the age of 18 each calendar day that a user uses the covered platform, including when the user first accesses the platform, after three hours of cumulative use, and thereafter, once per hour of continued cumulative use. The platform will not be required to display the label if the platform has reasonably determined that the user is over 17. The law will take effect January 1, 2027.
  • Companion Chatbots: SB 243 will impose new requirements on operators of companion chatbot platforms. The law defines a “companion chatbot” as an “artificial intelligence (‘AI’) system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user’s social needs,” including sustaining a relationship across sessions or exhibiting anthropomorphic features. If a reasonable person would believe they are interacting with a human, the operator must provide a clear and conspicuous notice indicating the chatbot is artificially generated. Additionally, operators must implement measures to prevent the companion chatbot from providing content relating to suicidal ideation, suicide, or self-harm to the user. Beginning July 1, 2027, operators must annually report to the Office of Suicide Prevention their protocols to detect, remove, and respond to certain content, such as suicidal ideation and self-harm. There are additional requirements if the operator knows the user is a minor. The operator must (1) disclose to the minor user that they are interacting with artificial intelligence, (2) provide by default a notification every three hours reminding the minor user to take a break and that the chatbot is AI, and (3) institute “reasonable measures” to prevent the chatbot from producing sexually explicit content or encouraging the minor user to engage in sexually explicit conduct.

© Copyright 2025 Credit and Collection News