CCPA

CalPrivacy Director Discusses New Audits Division and Other 2026 Actions to Come


California Privacy Protection Agency (CalPrivacy or CPPA) Executive Director Tom Kemp has helped steer the agency toward a busy 2026. Since January, the agency has hired its first chief privacy auditor, established an Audits Division to review companies’ compliance, launched the Delete Request and Opt-Out Platform (DROP) for consumers and data brokers, and brought enforcement actions against brokers S&P Global and Datamasters. Signaling ongoing active enforcement, on March 3, CalPrivacy issued a decision requiring PlayOn Sports to pay a $1.1 million fine for privacy violations involving students. The agency also sponsored two pieces of new legislation and persuaded the governor to promote privacy. It highlighted most of these efforts in its 2025 Annual Report, published on February 27, 2026.

One of the new bills, AB 2021, would incentivize privacy whistleblowing inside companies, a novel measure that would give CalPrivacy’s enforcers a potent tool to expose and punish privacy violations buried deeper in the data economy.

Looking ahead, to encourage business compliance, CalPrivacy will issue guidance on its new regulations concerning automated decision-making, privacy risk assessments and cybersecurity audits. Four proposed regulations will open for public comment, while the Enforcement Division’s sweeps – with other enforcers around the world – and joint efforts with other states surely will deliver more settlements and actions in 2026.

Kemp spoke to the Cybersecurity Law Report about the latest CalPrivacy developments and what lies ahead in 2026 for the agency. He also discussed what CalPrivacy enforcers are seeing, considerations for company leaders and balancing innovation and regulation.

See “Outgoing CPPA Board Member Discusses Rulemaking and Looming Privacy Issues” (Sep. 25, 2024).

Launch of Audits Division

CSLR:  What should businesses know about CalPrivacy’s new Audits Division?

Kemp:  The new Audits Division will develop and apply privacy‑compliance audit procedures and conduct complex regulatory examinations of businesses and their practices. The statute calls for a chief privacy auditor to work alongside the executive director, and we are thrilled to have hired Sabrina Ross for the role. Under the leadership of the chief privacy auditor, the division will obtain and analyze privacy and technology records to determine compliance with the CCPA. It complements the Enforcement Division [(Enforcement)]. The Audits Division evaluates compliance and identifies gaps, while Enforcement investigates potential violations.

CSLR:  What models are informing how you shape the division?

Kemp:  We’re fortunate that board member Alastair Mactaggart authored the initiative and its provision for the auditor. Sabrina will sit down with him to get his vision. She also has reached out to comparable agencies in the U.S. and internationally; to data protection authorities in Europe, Canada and Australia; and to the AG and the Department of Financial Protection and Innovation in California. She’s analyzing what the state of the art is while staying forward‑looking about technologies like AI and automated decision‑making tools [(ADMT)]. We’re very excited to have her build a team, and she brings expertise in technology and privacy regulatory compliance. She’s held leadership roles and knows the types of companies operating in the information industry and how they operate.

CSLR:  What will trigger audits? Will companies be on a regular cycle?

Kemp:  We have flexibility and could assess overall compliance or focus on a specific statutory requirement. We could go deep in one industry or practice or focus on a particular privacy harm, such as the collection of children’s data. Sometimes we may pre-announce a thematic audit in an area. Other times we may work behind the scenes without prior notice. For now, we’re not publicly discussing the initial focus for audits.

CSLR:  What should companies expect concerning audit submissions?

Kemp:  As provided for in the regulations, the Audits Division will collect the risk assessment attestations, due by April 1, 2028, for any assessments that must happen in the next two years. The division will collect attestations annually thereafter, but, if a program or service doesn’t change, the business does not have to resubmit its attestation. It also will collect the cybersecurity audit certifications. Businesses with over $100 million in revenue must submit their certification by April 1, 2028; those with $50 million to $100 million must submit by April 2029; and those with less than $50 million must submit by April 2030. The division plans to follow up on the submissions.

CSLR:  Will there be further audit guidance?

Kemp:  The Audits Division will provide additional information about both the risk assessment attestations and the cybersecurity audit certifications. We have until 2028 before the first cybersecurity certifications are due, so expect more detail next year. For now, the division’s mandate is to evaluate for compliance gaps and refer potential violations to Enforcement.

[See “Steps to Address the New California Audit Rule That Seeks to Reset Reasonable Security” (Nov. 5, 2025).]

New Whistleblower and Data Deletion Bills

CSLR:  CalPrivacy proposed a privacy whistleblower law, AB 2021. Why is it needed, and how would it work?

Kemp:  Most data processing happens in a black box. It’s difficult for outside entities like us to see what’s happening. Even when we have an inkling, investigations can take years. We face companies with enormous resources, with armies of lawyers and engineers. Insiders often know when things aren’t done correctly, but in Silicon Valley they have “golden handcuffs,” slugs of stock options and high salaries. They also fear they won’t get another job if they raise a flag.

The bill provides an award of between 15 and 33 percent of what we collect and strong anti‑retaliation protections to incentivize coming forward. The whistleblowers can’t come forward with just any old thing. They must bring something that we vet and designate as a case.

The bill also explicitly covers contractors, not just employees, because a lot of Silicon Valley companies leverage contractors. This gives us another tool to get better visibility into risky practices with personal information while protecting those who report on those practices.

CSLR:  What models did you draw from for the proposed whistleblower law?

Kemp:  Our deputy director of enforcement, Michael Macko, came from the SEC and saw how whistleblowers help. Recent California legislation included some protections, but not at this level, particularly around contractors and the award. We’re taking the best features of existing laws, but this bill is focused only on the CCPA and privacy. If it passes, the approach could be a model for bills in other tech areas like AI.

[See “What to Know (and Do) About DOJ’s Efforts to Identify and Prosecute Cybersecurity Fraud Under the False Claims Act” (Oct. 30, 2024).]

CSLR:  If the CalPrivacy-sponsored Expanding Privacy Rights Act introduced in January passes, what would it change?

Kemp:  Senator Josh Becker’s bill (SB 923) has two main components. First, it aligns deletion rights with other state laws by focusing on information “about” the consumer rather than only what was “collected from” the consumer. Second, it would require businesses to offer a web form, not just an email address, because email volleyball creates friction.

CSLR:  Did consumer complaints drive this bill?

Kemp:  Yes. We’ve had about 10,000 complaints since the inception of CalPrivacy, roughly 150 per week. A good chunk of those involve difficulties communicating with businesses and exercising rights. That sometimes occurs because of missing infrastructure and workflows, ignored requests or demands for too much information. The bill is designed to make privacy rights easier to exercise.

Time for Businesses to Adjust for the Omnibus Regulations and Growing DROP Registrations

CSLR:  CalPrivacy finalized its regulations for risk assessments, ADMT and cybersecurity audits. What are the expectations and timelines?

Kemp:  We created a big package of regulations because the statute explicitly required them. It took three to four years and thousands of public comments. In the end, we landed the airplane on the carrier, and you can debate whether it was stormy. We are proud California now sets a new standard, particularly on cybersecurity audits. New York has something comparable, but only for financial services. To educate businesses on the obligations, we’re publishing bulletins. We have already issued one on the CCPA updates, and businesses should expect more that provide analysis. We’ll host webinars and speak at events, like IAPP and the California Lawyers Association conferences.

On timing, the obligation to conduct risk assessments for covered activities under the new regulation took effect January 1, 2026, with the first set of attestations due April 1, 2028. Businesses should conduct assessments for any new systems coming online. The ADMT regulation takes effect January 1, 2027, and applies when an ADM system completely or substantially replaces a human to make significant decisions about a person using their personal data. Consumers will have a set of rights when the regulation’s criteria for an “automated decision” are met, including pre‑use notice as well as opt‑out and access rights for qualifying systems. Cybersecurity audit certifications are due starting in April 2028. The Audits Division will handle risk‑assessment attestations and cybersecurity submissions. Businesses should be aware that the general CCPA updates and risk assessment requirements are already in effect.

CSLR:  Is more guidance coming this year?

Kemp:  Yes. We’ve started with the CCPA bulletin, and businesses will see a rolling series of additional materials. Quick guides and compliance checklists for cybersecurity audits and risk assessments will be out in spring 2026, for example, and an ADM guide after that.

[See “Updating Compliance Programs to Address the CPPA’s Regulations on ADMT and Risk Assessments” (Sep. 17, 2025).]

CSLR:  How is DROP progressing and when does the sandbox for companies open?

Kemp:  The rollout is going well. As of February 27, 242,000 residents have registered. The sandbox for data brokers to test processes opens at the end of March. Beginning August 1, they must start accessing consumer deletion requests and have 45 days to process them. Consumers have clearly jumped on the DROP train even though the deletions won’t occur till the fall, which shows the pent‑up demand.

Key Dates for Compliance

Risk Assessments

January 1, 2026 (deadline to conduct assessments for covered activities)

April 1, 2028 (initial attestations due, and then annually on April 1)

ADMT Consumer Rights (Including Pre-Use Notice, Opt-Out, Access)

January 1, 2027

Cybersecurity Audit Certifications

April 1, 2028 (for companies with >$100M revenue)

April 1, 2029 (for companies with $50M–$100M revenue)

April 1, 2030 (for companies with <$50M revenue)

DROP (Delete Act)

January 1, 2026 (consumer registration opened)

March 2026 (sandbox for brokers opens)

August 1, 2026 (brokers must begin accessing deletions and process them every 45 days thereafter)

Enforcement Focus

CSLR:  What are CalPrivacy’s enforcement priorities or themes?

Kemp:  Look at all CCPA enforcement actions by us and the AG, the advisories and the sweeps to understand our priorities. We won’t reveal current activities, but we have over 100 active investigations. The AG’s office also enforces the CCPA. The recent Disney matter shows they have their own priorities.

We try to be transparent in several ways. First, enforcement advisories signal what we care about. We have issued three so far. Second, we partner on sweeps with other enforcers. We’re doing an investigative sweep with the California, Connecticut and Colorado AGs on support for the Global Privacy Control, and with about 30 privacy agencies worldwide on children’s data. Third, our settlement agreements detail where companies [like Honda, Healthline, Todd Snyder and Tractor Supply] fell short, so others can learn.

A clear pattern we’ve seen is businesses creating too much friction for consumers trying to exercise their privacy rights. One entity demanded a driver’s license photo next to the consumer’s face to do a simple opt-out of sale or sharing. That excessiveness is something we care about and could be the basis for future enforcement.

[See “Healthline’s Record-Setting CCPA Settlement Offers Lessons on Transparency and Opt-Outs” (Aug. 6, 2025).]

CSLR:  What emerging privacy harms has CalPrivacy seen that companies should be watching for?

Kemp:  One focus is, and will remain, protecting the vulnerable members of our community. Seniors experience lots of identity theft and fraud. We are addressing that through public affairs, including senior Scam Stoppers events with sister agencies, as well as through enforcement. What happens with kids’ data, and how it is used, is another clear and significant harm.

CSLR:  What can the business community do to help raise awareness of these problems?

Kemp:  Clearly businesses have a mutual interest in reducing financial fraud and identity theft, along with civil society groups like AARP, which are doing great work. We would welcome focus from the business community on reducing the likelihood that their older customers are defrauded. It would be beneficial for companies to do more to teach people how to protect themselves from both a cybersecurity and a privacy perspective. It’s also in the best interest of customers for businesses to do cybersecurity audits.

[See “CPPA’s Tractor Supply Decision Offers Lessons As Enforcement Focus Moves From Education to Deterrence” (Oct. 22, 2025).]

CSLR:  Are new technologies changing your expectations for businesses?

Kemp:  Our message is that whatever the technology, if it uses personal information, the same laws apply. In the past, personal data was maybe bought and sold for direct mail, then for websites and mobile apps, and now PI is used by AI and in ADMT to make predictions and decisions. Consumers keep the same rights to know, access, delete, etc., no matter the technology. They don’t lose that because we have a data-driven economy running on personal information. California voters created our agency alongside the AG to ensure those rights are protected.

Four New Regulations Planned

CSLR:  The board has approved new rulemaking. What’s included and how can businesses engage?

Kemp:  There are four areas: (1) employee data; (2) streamlining privacy policies and disclosures; (3) opt‑out preference signals/Global Privacy Control; and (4) reducing friction in exercising privacy rights. At our February 26 board meeting, we laid out the timeline to solicit public input and will issue requests for information and requests for comments. Processing those comments will take a few months before a draft is prepared for board consideration and formal notice‑and‑comment under the Administrative Procedure Act. We want feedback from civil society, everyday Californians and the business community.

CSLR: What specific business input would be most helpful in each area?

Kemp: We will provide guidance in our announcements regarding the requests for public comment.

Anticipated Timeline for Public Comment on Proposed Regulations

Reducing Friction in Exercising Privacy Rights & Opt-out Preference Signals

March 2026

Employee Data & Notices and Disclosures

April 2026

Data Broker Audits

Summer 2026

Source: CPPA Board Meeting, February 27, 2026

CCPA’s Two Enforcers

CSLR:  How should companies think about CalPrivacy and the AG sharing enforcement authority?

Kemp:  It’s common. At the federal level, for example, the SEC and DOJ can both be involved. Our relationship with the AG is excellent. We partner on joint actions, regularly compare notes and both participate in a multistate consortium of privacy regulators. We are the only non‑AG member.

[See “State Privacy Regulators Share Enforcement Agenda and How to Ensure a Smoother Investigation” (May 14, 2025).]

CSLR:  Any closing thoughts for leadership teams?

Kemp:  We’ve tried hard with our regulations to allow businesses to operationalize the obligations they have. For example, companies can leverage existing cybersecurity audits they have done. We’re publishing bulletins, and we have done and will continue to do our best to educate businesses about the regulations. The pace of our enforcement is increasing, as is the AG’s, so companies should respond to that. California has a long history of leading on consumer protection, from food and auto safety to data breach notification, the CCPA and the Delete Act.

We try to be innovative with our policy, to not only provide guardrails that protect consumers but also ensure that business innovation can continue to thrive. Since CCPA enforcement began, California’s economy has moved from the world’s fifth largest to the fourth. It is very possible to balance innovation and guardrails, which is what we’re trying to do here.

Corporate Governance

A Practical Cross-Functional Framework for Efficiently Driving Risk and Compliance Decisions


In fast-moving environments where privacy, compliance, security, product priorities and broader business goals intersect, organizations are required to make decisions quickly while aligning multiple stakeholders. Decisions are rarely made in isolation. They require coordination across teams, each of which brings distinct priorities and risk perspectives.

The pressure to make quick decisions intensifies when delivery cycles accelerate, but speed without clarity creates risk. The result can be delayed launches, increased regulatory exposure, internal frustration and erosion of trust. Organizations operating at scale need a structured approach that balances velocity with accountability. This article outlines a practical decision-making framework, grounded in objective criteria and real-world application, for cross-functional teams to navigate complex risk and compliance decisions across organizations of any size.

See “Eight Tips for Building a Cross-Company Compliance Network” (Sep. 17, 2025).

The Reality of Cross-Functional Decision-Making Today

Cross-functional decision-making is more complex than ever. Product cycles are shorter. Market pressure is constant. Regulatory scrutiny continues to expand. With that backdrop, decisions involving data, security and compliance carry greater consequences than in the past.

Involvement Across Functions

Issues that were once handled within a single function now require collaboration across multiple teams. Privacy, legal, security, product, engineering and business stakeholders must weigh in, often under tight timelines. Each additional voice adds necessary perspective, but also increases complexity.

See “EY Global Data Analytics Survey Finds Lack of GDPR Preparedness and Need for Cross-Functional Collaboration” (Mar. 28, 2018).

Differing Priorities

These perspectives are shaped by different priorities. Legal focuses on defensibility and compliance. Privacy prioritizes regulatory obligations and user trust. Security seeks to reduce exposure and strengthen controls. Product emphasizes innovation, speed and growth. Leadership looks for clarity, accountability and alignment with strategic objectives.

These goals are not inherently in conflict, but they rarely align automatically. What feels urgent to one team may not be critical to another. Without a shared structure for evaluating trade-offs, discussions can stall. Meetings repeat. Escalations increase. Frustration builds.

Downsides of Informal Approaches

Many organizations rely on informal methods such as last-minute reviews, consensus-driven debates, or leadership escalation to resolve tensions among stakeholders. These approaches may work temporarily, but they break down as organizations scale and create friction rather than clarity. Escalation shifts accountability without resolving underlying disagreement. Efforts to find consensus often lead to delayed or diluted decisions.

In high-velocity environments, a more disciplined approach is required, one that clarifies ownership, documents trade-offs and enables informed decisions without sacrificing speed. Speed without structure creates noise, not progress.

See “How Can a Company Mitigate Cyber Risk With Cross-Departmental Decision-Making?” (Apr. 8, 2015).

A Practical Decision-Making Framework

The framework outlined below is one that has been applied and refined across multiple organizations. It is not theoretical. It is built from real cross-functional work where speed, risk and accountability must coexist.

The objective is to drive discussions with extreme clarity, particularly when stakes are high and timelines are tight. Extreme clarity reduces ambiguity, surfaces assumptions early and allows cross-functional teams to focus on decisions rather than debate. It allows stakeholders to align around facts and informed trade-offs rather than opinions. The process also should be documented, and a sample template is provided further below.

Step 1: Defining the Problem and Setting Context

Effective decisions begin with precision. Before discussing solutions, teams must align on the problem.

This section documents what happened, when it occurred, how it was identified and why it matters. It should capture relevant background facts, regulatory triggers, product changes or operational constraints. Timeline pressures and business commitments should be clearly stated.

Equally important is identifying who is involved and who is impacted. Establishing stakeholder scope early prevents misalignment later. When teams begin with shared context, they reduce assumptions and focus on solving the actual issue rather than debating incomplete narratives, keeping the discussion grounded in facts rather than interpretations.

See “Cybersecurity and Privacy Teams Join to Create Data Governance Councils” (May 4, 2022).

Step 2: Evaluating Options and Driving Alignment

Once the problem is defined, teams should identify realistic solution paths and evaluate trade-offs openly.

Each option should be documented with clear benefits and drawbacks. Considerations may include level of effort, implementation timeline, cost, business impact, user expectations, compliance exposure and security implications. Viewing these factors side by side enables informed comparison.

Options should reflect multiple perspectives. A compliance-oriented approach may reduce regulatory uncertainty. A legal approach may strengthen defensibility. A security-focused solution may prioritize control robustness. A product-driven option may accelerate growth and user adoption. Capturing these lenses together prevents decisions from being dominated by a single function.

Alignment does not require unanimous agreement. It requires transparent evaluation of trade-offs so stakeholders understand what each path entails. Alignment does not always mean compromise. More often, it means reframing the conversation around shared goals such as protecting users, enabling growth and reducing avoidable risk. When discussions shift from positions to shared outcomes, progress accelerates.

See “Unifying Risk Assessments: Breaking Silos to Enhance Efficiency and Manage Risk” (Jan. 29, 2025).

Step 3: Selecting the Path Forward and Formalizing Commitment

Analysis must lead to action. Once trade-offs are evaluated, a clear decision should be made and documented.

The selected option and its rationale should be recorded, along with any acknowledged residual risks or dependencies. Where necessary, formal risk acceptance should be captured.

Decision authority must be explicit. The document should identify the decision owner, responsible stakeholders and date of approval. A structured decision log should record who signed off, from which team and when. This ensures accountability and preserves institutional memory.

Ownership conflicts rarely stem from unwillingness. More often, they arise from unclear boundaries. When decision rights and accountability are defined early, friction decreases and execution improves.

Clear timelines and ownership for implementation should also be documented. Without defined responsibility and deadlines, decisions lose momentum.

Formal documentation transforms discussion into commitment and strengthens defensibility in the face of future scrutiny.

See “Five Strategies a Privacy Attorney Uses to Bridge the Gap With Tech Teams” (Jan. 31, 2018).

Step 4: Executing, Monitoring and Following Through

Commitment must translate into execution. The selected option should be supported by a clear implementation plan.

The complexity of the plan will vary, but it should, at minimum, outline the agreed objective, defined tasks, accountable owners and target dates. Checkpoints also should be established to monitor progress.

Transparency during execution is essential. Delays, blockers or emerging risks should be communicated promptly to stakeholders and leadership.

After implementation, teams should conduct a review to confirm that objectives were met. Metrics, compliance indicators and user impact should be evaluated. If adjustments are required, they should be documented and incorporated into the decision log.

Decision-making is not static. It requires reassessment when facts change. Connecting decisions to execution and measurable outcomes ensures that alignment produces tangible results.

Documentation Template: A Structured Decision

To operationalize the above framework, organizations should adopt a consistent decision document format. The structure is flexible and can be adapted across privacy, compliance, security, product or broader business contexts.

The document should be written in language that the intended audience understands. In many organizations, this means framing issues in terms of product impact, engineering effort, timelines, user outcomes and measurable results rather than relying solely on regulatory terminology. A structured framework is most effective when it translates risk and compliance considerations into practical terms that resonate across teams.

Moreover, someone must take responsibility for initiating documentation, drafting the initial context, gathering stakeholder input and driving the discussion forward. Without a clearly identified process owner, even well-designed frameworks can lose momentum before decisions are reached.

The template below is illustrative rather than exhaustive. It can be simplified or expanded depending on organizational size, issue complexity and the level of risk involved. The structure is intended to support disciplined thinking, whether decisions are being made by a small leadership group or across multiple cross-functional teams.

1) Problem Statement and Context

  • Issue definition
  • Trigger and background
  • Timeline constraints
  • Stakeholders involved

2) Options and Trade-Off Analysis

  • Description of each option
  • Benefits and risks
  • Compliance and security implications
  • Business and user impact
  • Level of effort and timeline

3) Decision, Rationale and Sign-Off

  • Selected option
  • Rationale
  • Residual risks
  • Decision owner
  • Approving stakeholders and teams
  • Date of sign-off

4) Execution and Post-Implementation Review

  • Implementation tasks and owners
  • Target dates and checkpoints
  • Monitoring metrics
  • Follow-up review date

When used consistently, this document becomes a governance tool that supports clarity, alignment and accountability.
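For teams that track decisions in tooling rather than in standalone documents, the template can also be mirrored as a simple record structure. The sketch below is illustrative only, not a prescribed schema; the field names are hypothetical stand-ins for the four template sections above, and the sign-off check reflects one reasonable interpretation of "decision authority must be explicit."

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DecisionRecord:
    """Illustrative record mirroring the four-part decision template.
    Field names are hypothetical, not a mandated format."""
    # 1) Problem statement and context
    issue: str
    trigger: str
    timeline_constraints: str
    stakeholders: List[str]
    # 2) Options and trade-off analysis (free-form notes per option)
    options: List[str] = field(default_factory=list)
    # 3) Decision, rationale and sign-off
    selected_option: str = ""
    rationale: str = ""
    residual_risks: List[str] = field(default_factory=list)
    decision_owner: str = ""
    approvers: List[str] = field(default_factory=list)
    signoff_date: str = ""
    # 4) Execution and post-implementation review
    implementation_owners: List[str] = field(default_factory=list)
    review_date: str = ""

    def is_signed_off(self) -> bool:
        # A record counts as signed off once an owner, at least one
        # approver and a sign-off date are all recorded.
        return bool(self.decision_owner and self.approvers and self.signoff_date)

# Usage: a phased-launch decision like the example in this article
record = DecisionRecord(
    issue="Personalization feature expands data collection",
    trigger="Privacy/security review found regulatory exposure",
    timeline_constraints="Public roadmap commitment; marketing scheduled",
    stakeholders=["privacy", "security", "legal", "product"],
    options=["launch with risk acceptance", "delay launch", "phase by jurisdiction"],
    selected_option="phase by jurisdiction",
    rationale="Lower-risk markets first while safeguards are strengthened",
    decision_owner="product lead",
    approvers=["privacy", "legal", "security"],
    signoff_date="2026-03-01",
)
print(record.is_signed_off())  # → True
```

Keeping the sign-off check as an explicit method makes the accountability rule from Step 3 enforceable in code: an incomplete record is visibly not a decision yet.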

Example of Applying the Framework in Practice: High-Risk Feature Launch Under Time Pressure

A product team planned to launch a new personalization feature requiring expanded data collection. The release was tied to competitive positioning and a public roadmap commitment. Marketing campaigns were scheduled, and leadership visibility was high.

During final review, privacy and security teams identified regulatory exposure in certain jurisdictions and gaps in existing controls. Legal raised concerns regarding documentation and defensibility. Product emphasized growth impact and launch deadlines.

Using the framework, the teams first documented the problem clearly, outlining regulatory risks, affected jurisdictions, business impact of delay and timeline constraints. Stakeholders were formally identified.

Multiple options were then evaluated: launching as designed with risk acceptance, delaying for additional safeguards or phasing rollout by jurisdiction. Trade-offs were documented across compliance exposure, engineering effort, business impact and risk mitigation.

The final decision involved a phased launch. Lower-risk markets proceeded first, while safeguards were strengthened in higher-risk regions. The rationale, residual risks, sign-offs and timelines were recorded in the decision log.

Execution followed a defined plan with assigned owners and checkpoints. Post-launch monitoring confirmed compliance alignment and acceptable risk levels. The structured approach enabled the organization to move forward without unnecessary delay while maintaining accountability.

See this two-part series on insights from Uber: “An Inside Look at Its Privacy Team Structure and How Legal and Tech Collaborated on Its Differential Privacy Tool” (Nov. 28, 2018), and “Building Bridges Between Legal and Engineering” (Dec. 5, 2018).

What This Means for Risk and Compliance Leaders

In fast-moving organizations and evolving regulatory environments, structured decision-making is no longer optional. As regulatory expectations expand and product cycles accelerate, leaders must embed clarity and accountability into how decisions are made.

Governance is not about slowing innovation. It is about enabling it responsibly. A practical framework does not delay teams. It enables them to move forward with confidence. When cross-functional stakeholders share a common structure for defining problems, evaluating trade-offs and documenting commitments, organizations strengthen both velocity and defensibility at scale. It is a competitive advantage.

 

Pari Sarnot[1] is a privacy, risk and governance leader with experience guiding global technology organizations through complex regulatory and AI governance challenges. In her roles as a privacy, risk and compliance manager at Meta and Grant Thornton, she has led cross-functional initiatives spanning privacy-by-design implementation, enterprise risk assessments, third-party risk management and data lifecycle governance. Her work focuses on building structured, scalable decision-making frameworks that balance innovation with regulatory accountability. She is also an IAPP faculty member.

 

[1] The views Sarnot expresses in this article are her own and do not represent those of current or previous employers.

State Attorneys General

Assistant AG Highlights Colorado’s Next Phase of Privacy Regulation


Colorado was one of the first states to adopt a comprehensive consumer privacy law. Since the Colorado Privacy Act (CPA) was adopted in 2021 and took effect in July 2023, the state has continued to fine-tune the law and the associated regulations.

At the Bridge 2026 privacy summit, Andrea Lowe, Assistant AG in the Colorado AG’s Office, and Daniel Rosenzweig, founder of DBR Tech Law, discussed 2025 amendments to the CPA and regulations thereunder, including enhanced protections for biometric, minors’ and sensitive information. They also reviewed the Colorado AG office’s (CO AG) approach to enforcing the CPA, including cooperation with other states, the multistate investigatory sweep on global privacy controls (GPCs), and the CO AG’s focus on sensitive data, mobile app data and data protection assessments (DPAs). This article distills their comments.

See “Colorado Privacy Law Finishes Third, but Could Become the New Standard” (Jun. 23, 2021).

CPA Requirements That Took Effect in 2025

Enhanced Protection for Biometric Information

With organizations collecting increasing amounts of biometric data, in May 2024, Colorado amended the CPA to set forth requirements around those practices, said Lowe, who noted that her comments reflected her personal views, not those of the CO AG or its staff. The CO AG has found that some organizations are not prepared for the new requirements, which took effect in July 2025.

Under the expanded regime, absent consent, controllers cannot sell or disclose biometric identifiers except in limited circumstances. Most biometric information will be either a “biometric identifier” or “biometric data,” continued Lowe. Notably, biometric identifiers include biometric information even if it is not used to identify a consumer.

The CPA sets forth specific requirements for consent and requires controllers to have clear policies for handling biometric information and to provide clear notice to consumers that biometric information is being collected.

See our two-part series on legal and ethical issues in the use of biometrics: “Modality Selection, Implementation and State Laws” (Feb. 21, 2024), and “FIDO, Identity-Proofing and Other Options” (Feb. 28, 2024).

Enhanced Protection for Minors’ Data

Under an amendment to the CPA that took effect October 1, 2025, controllers that offer online products, services or features to a person the controller knows, or willfully disregards, is a minor, must use reasonable care to avoid heightened risk of harm and conduct data protection assessments related to those products, services or features, said Lowe. If a controller complies with the other provisions of Section 6‑1‑1308.5 of the amendment, there is a rebuttable presumption that the controller has used reasonable care. Those other provisions, she explained, include:

  • not processing minors’ data without consent; and
  • not “using system design features to significantly increase, sustain, or extend a minor’s use of the service, product, or feature” without consent.

In each case, consent means parental consent for minors under 13 years of age and the minor’s own consent for minors 13 and older.

Additionally, in October 2025, the CO AG announced final revisions to its proposed amendments to the CPA rules. The final revisions, which became effective December 1, 2025, set forth the factors the CO AG will consider when determining whether a controller willfully disregarded that a person is a minor and when a system is designed to increase, sustain or extend use by minors.

Companies are underestimating the risks associated with handling minors’ data, cautioned Lowe. Notably, some platforms claim they do not have information suggesting minors use their products or services “despite clear evidence to the contrary,” she observed. As indicated in the final revisions, the CO AG will consider, for example, a controller’s receipt of credible reports from a parent or consumer that a minor is using the service, or information provided by the consumers that suggests they are minors.

See “How Companies Can Meet Growing Regulatory Scrutiny Around Sharing Children’s Data” (Feb. 11, 2026).

Expanded Definition of Sensitive Data

The CPA requires opt-in consent to process consumers’ sensitive data, explained Lowe. In October 2025, that category was expanded to include precise geolocation data.

See “State Privacy Enforcers Reveal Strategies, Priorities and Advice on Engagement” (Nov. 12, 2025).

Enforcement Outlook

The CO AG’s approach to enforcement in 2026 will be similar to 2025, said Lowe. It will continue to focus on enforcement involving high-risk data collection, as well as the new protections for biometric, minors’ and sensitive information. The CO AG expects organizations to think critically about privacy protection.

Reliance on Resources and Consumer Complaints

Enforcement will depend both on available resources and on what the CO AG hears from consumers, continued Lowe. For example, consumers continue to complain about being unable to exercise their data rights, which they have held since the CPA took effect in 2023. The CO AG is particularly concerned when those complaints involve sensitive data like health information.

The CO AG continually refines its enforcement approach based on what it hears. It considers not only the broad issues that complaints present, but also particular companies that emerge as repeat offenders in those complaints, added Lowe.

CO AG personnel meet once or twice monthly to review complaints, explained Lowe. There are often 20 to 40 complaints each month, some of which are just general expressions of concern about a particular company – rather than a specific allegation of a violation. The CO AG has been sending about half of the complaints it receives to its consumer mediation program, most of which have been resolved successfully. “Companies do tend to pay attention,” she said. However, “most of the consumers only come to us because they did not resolve [their complaints] successfully with the company,” she added.

Lowe also noted that in 2025, the CO AG hired a technologist, increasing its capacity to take on more technical matters.

Regulators across the U.S. are increasingly scrutinizing organizations whose technology does not actually perform in the way the organization claims it performs, observed Rosenzweig.

See “State Privacy Regulators Share Enforcement Agenda and How to Ensure a Smoother Investigation” (May 14, 2025).

Cooperation With Other States

Colorado has joined a 10‑state consortium of privacy regulators. Cooperation among state AGs is common in consumer protection matters, noted Lowe. It enables them to share resources and enhance consumer protection. Although some consumer protection issues are “hyper-local” in nature, technology and privacy issues tend to be similar across state lines. The consortium enhances efficiency by facilitating information sharing and developing enforcement priorities. Although there are variations among state laws, there are not significant differences in how states interpret their laws. Generally, their enforcement approaches are also similar.

Colorado’s regulations include detailed interpretations of provisions that also exist in other states’ laws, noted Lowe. Similarly, other states have drawn on Colorado’s regulations on consent, dark patterns and other key provisions.

See “Practical Insights Direct From U.S. State Privacy Enforcers” (Apr. 10, 2024).

GPC Concerns

Multistate Sweep Reveals Lack of Implementation

In 2025, Colorado participated in a multistate GPC investigation sweep, noted Rosenzweig. Lowe could not discuss the specifics of the sweep, but said the regulators used a technical testing process. They checked, for example, whether:

  • any cookie banners interacted with GPC signals in a confusing way or created dark patterns; and
  • sites operated in a manner consistent with the GPC-related disclosures in their privacy notices.

Concerningly, the sweep found that some organizations had not implemented systems to recognize GPC signals at all – even though some laws have been in place for several years, observed Lowe. Other organizations had tried to establish infrastructure to address the requirements but did so incorrectly. For example, some failed to honor the signal when the page first loaded. Certain technical or configuration issues were identified quickly. Others reflected a lack of robust oversight and controls.

Many organizations use third-party integrations to help manage GPC and other privacy compliance requirements, continued Lowe. However, some do so without any oversight, testing or auditing. Others have inadequate contracts and/or fail to enforce contractual requirements.
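One failure mode from the sweep, not honoring the signal when the page first loads, can be guarded against by evaluating the opt-out before any data leaves. The GPC proposal exposes the signal as a `Sec-GPC: 1` request header (and, client-side, as `navigator.globalPrivacyControl`). The sketch below is purely illustrative; the function names and request shape are assumptions for demonstration, not anything the CPA or the panelists prescribe:

```typescript
// Illustrative sketch: detect a Global Privacy Control opt-out from the
// incoming request headers. Per the GPC proposal, a participating browser
// sends "Sec-GPC: 1" with each request.
function hasGpcOptOut(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] === "1";
}

// Hypothetical request-handling step: resolve the opt-out status *before*
// any tracking tags fire, so the signal is honored on page startup rather
// than after data has already been transmitted.
function resolveOptOut(
  headers: Record<string, string | undefined>,
  storedSiteOptOut: boolean, // a site-specific choice the user made earlier
): boolean {
  // An opt-out from either source (GPC signal or site-specific choice) wins.
  return hasGpcOptOut(headers) || storedSiteOptOut;
}
```

A check like this also supports the oversight Lowe describes: because the logic is a pure function of the request, it can be exercised in automated tests rather than left to untested third-party integrations.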

See “Navigating Global Privacy Control’s Not-So-Simple Implementation” (Mar. 26, 2025).

Ensuring GPC Signals Are Properly Handled

Sites that receive an opt-out signal – either through GPC or a site-specific process – handle the signal in one of two primary ways, explained Rosenzweig. Some use a signal-based method that passes the signal downstream. Others use a suppression method that completely stops transmission of data.

If suppression works correctly, it could be seen as a best practice, according to Lowe. Organizations using suppression must ensure that the outgoing data is actually being suppressed. Depending on the facts and circumstances, a signal-based method could also be compliant, provided the organization exercises appropriate oversight and monitoring. Regardless of the approach, the relevant contract must comply with the CPA, and the parties must adhere to the contract and conduct the requisite monitoring.

“It’s not just a matter of executing contracts and disclosures, it’s making sure that your opt-outs and technologies work correctly,” emphasized Rosenzweig. Moreover, contracts and opt-outs go hand in hand. Even if an organization has a functioning signal- or suppression-based system, it still must have the requisite contracts in place. They are equally important.
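The signal-based and suppression methods Rosenzweig contrasts can be sketched in a few lines. In this illustrative TypeScript sketch (the event shape and names are assumptions for demonstration, not drawn from the CPA or the panel), suppression stops downstream transmission entirely, while the signal-based method forwards the data tagged with the opt-out so downstream recipients can honor it:

```typescript
// Hypothetical shape of a record destined for a downstream data recipient.
type SaleEvent = { userId: string; payload: string; optedOut?: boolean };

type Method = "signal" | "suppression";

// Returns the events that would actually be transmitted downstream.
function applyOptOut(
  events: SaleEvent[],
  optedOut: boolean,
  method: Method,
): SaleEvent[] {
  if (!optedOut) return events; // no opt-out: transmit unchanged
  if (method === "suppression") {
    return []; // suppression: stop transmission of the data entirely
  }
  // Signal-based: pass the data along with the opt-out flagged, relying on
  // downstream recipients (and the governing contract) to honor it.
  return events.map((e) => ({ ...e, optedOut: true }));
}
```

Either branch illustrates why the speakers stress monitoring: an organization using suppression must verify the outgoing list is actually empty, and one using the signal-based method must verify the flag is set and contractually enforced downstream.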

See “Colorado Controllers: The Final (Rules’) Frontier” (May 31, 2023).

Sensitive Data

Sales of sensitive data, including sales through data brokers, continue to be an issue, according to Lowe. There have also been failures to follow opt-in consent requirements. Many brokers continue to sell sensitive health information while claiming to carve out consumers in Colorado and other states. The CO AG will look into those claims.

See “Lessons From the Trenches on How Data Brokers Can Manage Consumer Rights Requests” (Jan. 7, 2026).

Mobile App Data

Consumers often have limited or no control over collection of mobile app data, noted Lowe. The CO AG looks holistically at data collection by apps and third-party integrations and assesses whether there is any undisclosed collection of data that might be surprising or confusing to consumers. That might include, for example, collecting precise geolocation data or information about all other apps installed on the user’s phone. Connected TVs raise similar concerns.

See “Navigating Evolving Mobile App Privacy Issues” (Mar. 5, 2025).

DPAs

The CPA and other data privacy laws require organizations to conduct DPAs when their processing activities present a heightened risk of harm, observed Lowe. The CO AG requests and reviews DPAs thoroughly. The DPAs provide valuable insight into how organizations view compliance. They show whether entities are “actually thoughtfully looking at their data collection and processing, weighing the risks [to consumers] and potentially reviewing alternatives,” she said.

Companies’ DPAs are often inadequate, Lowe continued. The CO AG expects DPAs to describe the types of data collection and processing being conducted. Additionally, it would be helpful for them to include “narrative or careful balancing” of the actual risks to consumers from collecting the information and whether there are alternatives. The exact format of a DPA is not important. What is important is actual, careful analysis and discussion of the data collection and processing. A DPA in which an organization answered “N/A” on most questions will raise “a ton of red flags” about whether it is complying with the CPA, she cautioned.

See “A Streamlined Approach for Integrating New Requirements Into PIAs” (Dec. 17, 2025).