Google will pay $1.375 billion to resolve two lawsuits brought by Texas AG Ken Paxton centered on the tech giant’s use of individuals’ biometric, geolocation and incognito-mode search data, according to a May 9, 2025, announcement. The AG said it was the largest penalty Google has ever paid over privacy issues.
The agreement (Settlement) is the second giant settlement under Texas’ Capture or Use of Biometric Identifier Act (CUBI), a 2009 law. To settle the first action, Meta Platforms agreed in July 2024 to pay $1.4 billion over its tagging of faces in Facebook posts.
The Settlement is sure to raise awareness of the Texas law. Although two trillion-dollar companies paying CUBI’s billion-dollar fines may seem irrelevant to most companies, the Google and Meta cases show CUBI’s power. “CUBI opens up a lot of potential damages because the AG can claim $25,000 per violation. Multiply that by millions of users, and your damages model gets pretty steep pretty quickly,” Haynes Boone partner Tim Newman told the Cybersecurity Law Report.
The Settlement also displays the AG’s broad interpretation of CUBI provisions, raising a set of questions that companies should consider. This article examines the circumstances surrounding the Settlement, offers four biometric law compliance considerations, and discusses regulatory and litigation risk factors highlighted by the case, with commentary from experts at Baker Botts, Blank Rome, Carter Ledyard & Milburn and Haynes Boone.
See “What Texas’ Record $1.4‑Billion Deal With Meta Portends for Biometric Data Capture and Use” (Aug. 21, 2024).
Significant Settlement Despite Loss on Appeal
Texas sued Google twice in 2022. The AG brought the first case against Google under CUBI for the allegedly improper use of facial recognition in Google Photos and its Nest camera. In the other lawsuit, the AG invoked the state’s consumer protection law and alleged that Google deceived users about the collection of their location data and wrongly required users to access multiple settings to protect their privacy.
The lawsuits followed a 40‑state settlement with Google in 2022 for $391 million over how it handled location data. The AG retained outside firm Norton Rose Fulbright as lead counsel on contingency. The firm is set to receive at least a $137‑million share of the penalty.
Texas reached this Settlement despite suffering two harsh losses in court. A Texas appellate court in February 2025 dismissed one of its cases against Google, accepting the company’s argument that Texas lacked jurisdiction over the Big Tech company despite millions of Texans engaging with its products over web and mobile platforms (plus its in‑state data center). The court concluded that Google did not conduct any of the allegedly violative activities in Texas. A second court then adopted Google’s jurisdictional argument in April 2025 to throw out the AG’s lawsuit against Allstate and Arity, the first case alleging a violation of the Texas Data Privacy and Security Act (TDPSA).
The AG appealed the appellate court’s decision in the Google case to the Supreme Court of Texas. “Google’s decision to settle was perhaps an indication that it was unwilling to risk a Supreme Court decision against it,” Carter Ledyard & Milburn partner Matt Dunn told the Cybersecurity Law Report. “There likely was concern that the Texas appellate court probably got it wrong to deny specific jurisdiction because Google has a strong presence in Texas and the tracking of the data of Texans is a meaningful connection to the state,” he said.
Google’s monetary commitment to Texas is huge, but the company need not make any product changes; all required disclosures and policy changes were either already announced or already in place, a company spokesperson said in a statement.
The AG’s appeal of the Google jurisdiction decision will now end. The AG’s office did not respond to the Cybersecurity Law Report’s inquiries about whether it will appeal the Allstate decision.
See our two-part series on location data, “FTC and $391‑Million State AG Case Put Location Data Enforcement on the Map” (Jan. 4, 2023), and “A Sensitive Time for Location Data: Tips to Address New Rules and Vendor Standards” (Jan. 18, 2023).
AG Warns of More Enforcement to Come
Undeterred by the appellate court losses, the AG delivered a strong message. “This $1.375 billion settlement is a major win for Texans’ privacy and tells companies that they will pay for abusing our trust,” he said, adding, “I will always protect Texans by stopping Big Tech’s attempts to make a profit by selling away our rights and freedoms.”
The Google and Meta settlements suggest rising compliance risks in Texas around biometric privacy. A motivated AG holds powerful legal tools like CUBI’s statutory damages, which incentivize the use of outside counsel to battle with deep-pocketed companies. “We know that the AG will target major companies that are using pretty sensitive information,” observed Baker Botts partner Matthew Baker. “But that doesn’t mean once those investigations start slowing down that they won’t look elsewhere for companies with more routine violations,” he told the Cybersecurity Law Report.
See “Connecticut AG’s Report Reveals Privacy Enforcers Reaching Deeper Into Their State Laws” (Apr. 30, 2025).
Four Biometric Law Compliance Considerations
Companies have been keenly aware of Illinois’ Biometric Information Privacy Act (BIPA). Thousands of lawsuits filed under BIPA’s private right of action have generated total payouts approaching Texas’ $2.8‑billion haul. Washington also has a biometrics law that parallels CUBI. “Step one for companies is an awareness that these laws are out there and that their company may be subject to them, even though it is not immediately apparent based on their business model,” said Blank Rome partner Jeffrey Rosenthal.
Companies should consider the following four topics to ensure they follow applicable biometrics laws.
See “Aftermath of the Ninth Circuit BIPA Liability Shake‑Up in Zellmer v. Meta” (Oct. 23, 2024).
What Constitutes Biometric Data
What counts as biometric data remains a fact-specific puzzle for each company because biometric technology is nuanced, Rosenthal said. For example, many consumer-facing virtual features, such as try-on technologies, recognize a face on a screen but do not identify the person by linking it to other data.
The legal line remains unclear when “the face becomes biometric data versus mere recognition of a face on a screen” that cannot be reengineered for identification, Rosenthal cautioned.
CUBI defines “biometric identifiers” in a list, restricting them to “a retina or iris scan, fingerprint, voiceprint, or record of hand or face geometry.” But courts in Texas have not clarified what data constitutes an identifiable “record” of physical or voice data under CUBI.
“A Settlement like this raises the temperature and raises the awareness, but it doesn’t provide the clarity so that tomorrow we can all say, ‘wow, let’s pivot in this direction,’” Rosenthal said. If presented with the chance to interpret CUBI’s definition of biometric identifiers, Texas’s oft-conservative courts might not rule in the AG’s favor. “A Texas state judge could say that [Google’s use of facial recognition] is not crossing that line. Then Texas would be looking at a goose egg instead of $1.4 billion,” he offered.
Companies would love to have “bright line guidance” differentiating between scenarios that count as biometric data under the statute and those that do not, Rosenthal suggested.
See our two-part series on legal and ethical issues in the use of biometrics: “Modality Selection, Implementation and State Laws” (Feb. 21, 2024), and “FIDO, Identity-Proofing and Other Options” (Feb. 28, 2024).
How Soon to Delete the Data
“The deletion timing is different state to state. Illinois [provides for] three years to delete. Texas is one year,” Rosenthal explained. Many companies took steps to comply with BIPA but have not revised their deletion practices for Texas.
In addition to determining when to delete data, companies must decide when they start the clock for deletion. Texas and Illinois specify that the clock runs only “from the termination of the reason for which it was collected.” Many companies start after the last date the data will be used for the reasons specified in their privacy notice, Rosenthal said.
For a typical biometric use, such as to identify employees, the timing of deletion is somewhat straightforward – it is based on the employee’s departure. It is less clear and more work for companies to determine when to delete biometric data for other uses, Rosenthal stressed.
For many consumer uses, like uploaded (and app-loaded) photos, when the company must discard the data remains fuzzy. “That photo just lives up there,” Rosenthal said. The clock might start only when the user deletes the photo “because the purpose for which the photo was uploaded has ended. But I’ve never seen it play out” in a lawsuit, he reported.
See “Examining Maryland’s Game-Changing Data Minimization Requirements” (Apr. 24, 2024).
Whether the Data Use Is for a Commercial Purpose
CUBI’s requirements apply only if the biometric use is for a “commercial purpose,” making its application narrower than BIPA’s. The AG’s complaint against Google took an expansive view that counted secondary benefits that a company gains. “Each time Google’s algorithms process photos and videos to detect certain faces and objects or process voiceprints to better understand voice data, Google’s underlying AI becomes stronger, better-informed, more efficient, and more dominant,” the complaint states.
The AG’s position indicates that “‘commercial purposes’ are going to be interpreted as broadly as possible when it comes to enforcement,” Rosenthal said.
The AG highlighted enhancement as a factor for assessing commercial purposes. Google’s analysis of voice and speech data “provide[s] a powerful means for Google to enhance its product offerings and capabilities, which ultimately translates into market dominance and increased profits,” the AG alleged in its complaint.
Which types of product and service enhancements the AG will have an appetite to challenge as misuse of biometric data remains a question to watch, Baker posited. “How safe can innovators feel in rolling out new products in jurisdictions like Texas?” If a company uses AI training on biometric data “to do something bigger and better with an existing product, and makes money off of that as a result, that to me would fit in with the AG’s vision of a commercial use,” he opined.
To judge risk, a company processing biometric data could consider the volume of its use, how purely the improvement aids business operations versus product development, and how central the improvement is to the company’s business model, Baker suggested.
A clearer guidepost would help the market, and Texans, Baker noted. “A lot of my clients simply will not operate a biometric-related service in Illinois and turn off that functionality in New York,” he reported.
“Will Texas residents be happy about that when they can’t use a try-on technology that recognizes their features,” or, Baker queried, when they “cannot seamlessly authenticate themselves when calling in to the bank?” The TDPSA might provide reassurance to companies. While the Google case was brought under CUBI, the TDPSA exempts security and fraud uses from “commercial purposes,” which might indicate that businesses’ operational needs like security and fraud are not commercial uses under CUBI, he explained. For retailers, however, the AG could assert that the use of facial recognition – even for fraud protection – benefits the company by mitigating losses, he cautioned.
See “Examining the Intersection of Voiceprints and Data Privacy Laws” (Sep. 22, 2021).
How to Approach Consent
Companies must determine how to get individuals’ consent “in a way that is practical and that is not going to make it so burdensome that the user or consumer will give up on the process,” Rosenthal advised.
A first consideration is whether consent is necessary. Some companies conclude they do not collect biometric data based on how their or third-party software operates, but decide to obtain consent anyway, Rosenthal reported. “Other companies say that ‘if we’re not doing anything [the law regulates], we don’t need to,’” he observed.
Biometrics consent is fragile, Baker said. Informed consent to satisfy the current generation of laws must state the collection is “for a particular purpose and only that purpose,” he noted.
Many companies collecting biometrics alert consumers that they will use their data for product development or “secondary uses” because they need the data to innovate and compete in the crowded technology services market, Baker reported. And developments sometimes emerge too quickly for companies to keep up with obtaining new consent. While companies believe everyone benefits when they use the biometric data to introduce a new tech feature to the market, AGs likely would not approve product development as a permissible use, so “the company has to be very, very careful.” The initial consent obtained is for the specified purpose only, he cautioned.
See “Practical Insights Direct From U.S. State Privacy Enforcers” (Apr. 10, 2024).
Assessing Regulatory and Litigation Risks
The Settlement’s details, including the amount Google will pay, and the attention it is drawing raise emerging risk factors for companies.
Liability for Non-Users Captured in Biometric Scans
“The state took the view in the lawsuits that the AG filed that users and non-users alike are protected,” observed Newman.
Companies using biometrics can easily collect the data of non-users. The start of the AG’s CUBI complaint against Google narrates examples where Google’s facial recognition recorded people other than the Google user. “To Google, it does not matter that the three-year-olds, the bystanders, and grandma never consented to Google capturing and recording their biometric data. For Google, it is immensely valuable for Texans to continue uploading photos and videos of themselves and their non-consenting friends and family members,” the AG alleged.
The AG highlighted the non-users “to raise the temperature and raise the [public] awareness at a level that most people can understand,” Rosenthal noted. So far, however, class action liability for non-customers has rarely been established. If the company does not know who a person is, courts ask how that person could be a class member, he said.
Claims Under State Consumer Protection Laws
“These settlements are not missed by the plaintiffs’ bar,” Rosenthal noted, which can create a “snowballing effect.”
Google’s earlier settlement with the 40 state AGs, in a case brought under longstanding consumer fraud laws and concerning location data, did not include a release against lawsuits by citizens, Dunn pointed out. “The use of the consumer protection statutes by state AGs should remind companies to consider the risk of private suits by consumers under such laws,” he cautioned. “Many have available statutory damages” that make privacy claims viable for class actions and contingency cases. In New York, for example, statutory damages are $550 per violation.
Companies’ use of location data is coming under more state-level pressure, Dunn highlighted. California legislators on May 7, 2025, advanced new opt-in requirements around location data, and added restrictions on sharing and retaining location data even with a person’s consent.
The state AGs’ scrutiny of AI practices is increasing, Dunn noted, creating intersecting avenues of investigation. “Geolocation and biometrics collection and processing is often done using AI tools,” he said.
See “BIPA Decisions Expand Potential Liability: What’s Next in Illinois and Other States?” (Mar. 8, 2023).
