Biometrics

Texas AG’s Billion-Dollar Settlement With Google Highlights Biometric Data Use Compliance Considerations


Google will pay $1.375 billion to resolve two lawsuits brought by Texas AG Ken Paxton centered on the tech giant’s use of individuals’ biometric, geolocation and incognito-mode search data, according to a May 9, 2025, announcement. The AG said it was the largest penalty Google has ever paid over privacy issues.

The agreement (Settlement) is the second giant settlement under the Texas Capture and Use of Biometric Information Act (CUBI), a 2009 law. To settle the first action, Meta Platforms agreed in July 2024 to pay $1.4 billion over its tagging of faces in Facebook posts.

The Settlement is sure to raise awareness of the Texas law. Although two trillion-dollar companies paying CUBI’s billion-dollar fines may seem irrelevant to most companies, the Google and Meta cases show CUBI’s power. “CUBI opens up a lot of potential damages because the AG can claim $25,000 per violation. Multiply that by millions of users, and your damages model gets pretty steep pretty quickly,” Haynes Boone partner Tim Newman told the Cybersecurity Law Report.
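Newman's point about how quickly a statutory-damages model scales can be sketched in a few lines. Only the $25,000-per-violation figure comes from CUBI; the user counts below are hypothetical illustrations, not figures from the Google or Meta cases.

```python
# Illustrative only: how CUBI's per-violation statutory damages scale.
# The $25,000 figure is the statutory cap; the user counts are hypothetical.
PER_VIOLATION = 25_000  # CUBI: up to $25,000 per violation

def cubi_exposure(affected_users: int, violations_per_user: int = 1) -> int:
    """Rough ceiling on a claimed damages model: users x violations x $25,000."""
    return affected_users * violations_per_user * PER_VIOLATION

# One alleged violation each for a hypothetical million-user product:
print(f"${cubi_exposure(1_000_000):,}")  # $25,000,000,000
```

Even at one violation per user, a product with a million Texan users yields a $25-billion theoretical ceiling, which is why per-violation statutes dominate settlement leverage.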

The Settlement also reflects the AG's broad interpretation of CUBI's provisions, raising a set of questions that companies should consider. This article examines the circumstances surrounding the Settlement, offers four biometric law compliance considerations, and discusses regulatory and litigation risk factors highlighted by the case, with commentary from experts at Baker Botts, Blank Rome, Carter Ledyard & Milburn and Haynes Boone.

See “What Texas’ Record $1.4‑Billion Deal With Meta Portends for Biometric Data Capture and Use” (Aug. 21, 2024).

Significant Settlement Despite Loss on Appeal

Texas sued Google twice in 2022. The AG brought the first case against Google under CUBI for the allegedly improper use of facial recognition in Google Photos and its Nest camera. In the other lawsuit, the AG invoked the state’s consumer protection law and alleged that Google deceived users about the collection of their location data and wrongly required users to access multiple settings to protect their privacy.

The lawsuits followed a 40‑state settlement with Google in 2022 for $391 million over how it handled location data. The AG retained outside firm Norton Rose Fulbright as lead counsel on contingency. The firm is set to receive at least a $137‑million share of the penalty.

Texas reached this Settlement despite suffering two harsh losses in court. A Texas appellate court in February 2025 dismissed one of its cases against Google, accepting the company's argument that Texas lacked jurisdiction over the Big Tech company despite millions of Texans engaging with its products over web and mobile platforms (plus its in‑state data center). The court concluded that Google did not conduct any of the allegedly violative activities in Texas. A second court then adopted Google's argument in April 2025 to throw out the AG's lawsuit against Allstate and Arity, the first case alleging a violation of the Texas Data Privacy and Security Act (TDPSA).

The AG appealed the appellate court’s decision in the Google case to the Supreme Court of Texas. “Google’s decision to settle was perhaps an indication that it was unwilling to risk a Supreme Court decision against it,” Carter Ledyard & Milburn partner Matt Dunn told the Cybersecurity Law Report. “There likely was concern that the Texas appellate court probably got it wrong to deny specific jurisdiction because Google has a strong presence in Texas and the tracking of the data of Texans is a meaningful connection to the state,” he said.

Google’s monetary commitment to Texas is huge, but it need not make any product changes, and all required disclosures and policy changes were either already announced or are in place, a company spokesperson said in a statement.

The AG's appeal of the Google jurisdiction decision will now end. The AG's office did not respond to the Cybersecurity Law Report's inquiries about whether it will appeal the Allstate decision.

See our two-part series on location data, “FTC and $391‑Million State AG Case Put Location Data Enforcement on the Map” (Jan. 4, 2023), and “A Sensitive Time for Location Data: Tips to Address New Rules and Vendor Standards” (Jan. 18, 2023).

AG Warns of More Enforcement to Come

Ignoring the losses in the appellate court, the AG delivered a strong message. “This $1.375 billion settlement is a major win for Texans’ privacy and tells companies that they will pay for abusing our trust,” he said, adding, “I will always protect Texans by stopping Big Tech’s attempts to make a profit by selling away our rights and freedoms.”

The Google and Meta settlements suggest rising compliance risks in Texas around biometric privacy. A motivated AG holds powerful legal tools like CUBI’s statutory damages, which incentivize the use of outside counsel to battle with deep-pocketed companies. “We know that the AG will target major companies that are using pretty sensitive information,” observed Baker Botts partner Matthew Baker. “But that doesn’t mean once those investigations start slowing down that they won’t look elsewhere for companies with more routine violations,” he told the Cybersecurity Law Report.

See “Connecticut AG’s Report Reveals Privacy Enforcers Reaching Deeper Into Their State Laws” (Apr. 30, 2025).

Four Biometric Law Compliance Considerations

Companies have been keenly aware of Illinois’ Biometric Information Privacy Act (BIPA). Thousands of lawsuits filed under BIPA’s private right of action have generated total payouts approaching Texas’ $2.8‑billion haul. Washington also has a biometrics law that parallels CUBI. “Step one for companies is an awareness that these laws are out there and that their company may be subject to them, even though it is not immediately apparent based on their business model,” said Blank Rome partner Jeffrey Rosenthal.

Companies should consider the following four topics to ensure they follow applicable biometrics laws.

See “Aftermath of the Ninth Circuit BIPA Liability Shake‑Up in Zellmer v. Meta” (Oct. 23, 2024).

What Constitutes Biometric Data

What counts as biometric data remains a puzzle for each company because biometric technology is nuanced, Rosenthal said. For example, many companies offering virtual consumer features, like try-on technologies, recognize a face on a screen but do not identify the person using other data.

The legal line remains unclear when “the face becomes biometric data versus mere recognition of a face on a screen” that cannot be reengineered for identification, Rosenthal cautioned.

CUBI defines “biometric identifiers” in a list, restricting them to “a retina or iris scan, fingerprint, voiceprint, or record of hand or face geometry.” But courts in Texas have not clarified what data constitutes an identifiable “record” of physical or voice data under CUBI.

“A Settlement like this raises the temperature and raises the awareness, but it doesn’t provide the clarity so that tomorrow we can all say, ‘wow, let’s pivot in this direction,’” Rosenthal said. If presented with the chance to interpret CUBI’s definition of biometric identifiers, Texas’s oft-conservative courts might not rule in the AG’s favor. “A Texas state judge could say that [Google’s use of facial recognition] is not crossing that line. Then Texas would be looking at a goose egg instead of $1.4 billion,” he offered.

Companies would love to have “bright line guidance” differentiating between scenarios that count as biometric data under the statute and those that do not, Rosenthal suggested.

See our two-part series on legal and ethical issues in the use of biometrics: “Modality Selection, Implementation and State Laws” (Feb. 21, 2024), and “FIDO, Identity-Proofing and Other Options” (Feb. 28, 2024).

How Soon to Delete the Data

“The deletion timing is different state to state. Illinois [provides for] three years to delete. Texas is one year,” Rosenthal explained. Many companies took steps to comply with BIPA but have not revised their deletion practices for Texas.

In addition to determining when to delete data, companies must decide when they start the clock for deletion. Texas and Illinois specify that the clock runs only “from the termination of the reason for which it was collected.” Many companies start after the last date the data will be used for the reasons specified in their privacy notice, Rosenthal said.

For a typical biometric use, such as to identify employees, the timing of deletion is somewhat straightforward – it is based on the employee’s departure. It is less clear and more work for companies to determine when to delete biometric data for other uses, Rosenthal stressed.

For many consumer uses, like uploaded (and app-loaded) photos, when the company must discard the data remains fuzzy. “That photo just lives up there,” Rosenthal said. The clock might start only when the user deletes the photo “because the purpose for which the photo was uploaded has ended. But I’ve never seen it play out” in a lawsuit, he reported.
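The state-by-state timing Rosenthal describes can be captured in a short sketch. The retention windows and the purpose-end trigger below are simplified readings of the article's summary (Illinois three years, Texas one year), not legal advice, and the function name is invented for illustration.

```python
# Simplified sketch of the deletion clock described above:
# the window starts when the purpose for collection terminates.
# Windows (IL: 3 years, TX: 1 year) follow the article's summary; not legal advice.
from datetime import date

RETENTION_YEARS = {"IL": 3, "TX": 1}

def deletion_deadline(purpose_ended: date, state: str) -> date:
    """Latest date to delete biometric data once its collection purpose ends."""
    return purpose_ended.replace(year=purpose_ended.year + RETENTION_YEARS[state])

# An employee departs (the purpose for the identifier ends) on 2025-06-30:
print(deletion_deadline(date(2025, 6, 30), "TX"))  # 2026-06-30
print(deletion_deadline(date(2025, 6, 30), "IL"))  # 2028-06-30
```

The hard compliance question, as Rosenthal notes, is not the arithmetic but fixing `purpose_ended` for open-ended consumer uses like uploaded photos.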

See “Examining Maryland’s Game-Changing Data Minimization Requirements” (Apr. 24, 2024).

Whether the Data Use Is for a Commercial Purpose

CUBI's requirements apply only if the biometric use is for a "commercial purpose," making its application narrower than BIPA's. The AG's complaint against Google took an expansive view that counted secondary benefits that a company gains. "Each time Google's algorithms process photos and videos to detect certain faces and objects or process voiceprints to better understand voice data, Google's underlying AI becomes stronger, better-informed, more efficient, and more dominant," the complaint states.

The AG’s position indicates that “‘commercial purposes’ are going to be interpreted as broadly as possible when it comes to enforcement,” Rosenthal said.

The AG highlighted enhancement as a factor for assessing commercial purposes. Google’s analysis of voice and speech data “provide[s] a powerful means for Google to enhance its product offerings and capabilities, which ultimately translates into market dominance and increased profits,” the AG alleged in its complaint.

Which types of product and service enhancements the AG will have an appetite to challenge as misuse of biometric data remains a question to watch, Baker posited. “How safe can innovators feel in rolling out new products in jurisdictions like Texas?” If a company uses AI training on biometric data “to do something bigger and better with an existing product, and makes money off of that as a result, that to me would fit in with the AG’s vision of a commercial use,” he opined.

To judge risk, a company processing biometric data could consider the volume of its use, how purely the improvement aids business operations versus product development, and how central the improvement is to the company’s business model, Baker suggested.

A clearer guidepost would help the market, and Texans, Baker noted. “A lot of my clients simply will not operate a biometric-related service in Illinois and turn off that functionality in New York,” he reported.

"Will Texas residents be happy about that when they can't use a try-on technology that recognizes their features," or, Baker queried, when they "cannot seamlessly authenticate themselves when calling in to the bank?"

The TDPSA might provide reassurance to companies. While the Google case was brought under CUBI, the TDPSA exempts security and fraud uses from "commercial purposes," which might indicate that businesses' operational needs like security and fraud are not commercial uses under CUBI, he explained. For retailers, however, the AG could assert that the use of facial recognition – even for fraud protection – benefits the company by mitigating losses, he cautioned.

See “Examining the Intersection of Voiceprints and Data Privacy Laws” (Sep. 22, 2021).

How to Approach Consent

Companies must determine how to get individuals’ consent “in a way that is practical and that is not going to make it so burdensome that the user or consumer will give up on the process,” Rosenthal advised.

A first consideration is whether consent is necessary. Some companies conclude they do not collect biometric data based on how their or third-party software operates, but decide to obtain consent anyway, Rosenthal reported. "Other companies say that 'if we're not doing anything [the law regulates], we don't need to,'" he observed.

Biometrics consent is fragile, Baker said. Informed consent to satisfy the current generation of laws must state the collection is “for a particular purpose and only that purpose,” he noted.

Many companies collecting biometrics alert consumers that they will use their data for product development or “secondary uses” because they need the data to innovate so they can compete in the highly competitive technology services market, Baker reported. And developments sometimes emerge too fast to keep up with obtaining new consent. While companies believe everyone benefits when they use the biometric data to introduce a new tech feature to the market, AGs likely would not approve product development as a permissible use, so “the company has to be very, very careful.” The initial consent obtained is for the specified purpose only, he cautioned.

See “Practical Insights Direct From U.S. State Privacy Enforcers” (Apr. 10, 2024).

Assessing Regulatory and Litigation Risks

The Settlement's details, including the amount Google will pay, and the attention it is getting raise emerging risk factors for companies.

Liability for Non-Users Captured in Biometric Scans

“The state took the view in the lawsuits that the AG filed that users and non-users alike are protected,” observed Newman.

Companies using biometrics can easily collect the data of non-users. The start of the AG’s CUBI complaint against Google narrates examples where Google’s facial recognition recorded people other than the Google user. “To Google, it does not matter that the three-year-olds, the bystanders, and grandma never consented to Google capturing and recording their biometric data. For Google, it is immensely valuable for Texans to continue uploading photos and videos of themselves and their non-consenting friends and family members,” the AG alleged.

The AG highlighted the non-users "to raise the temperature and raise the [public] awareness at a level that most people can understand," Rosenthal noted. But so far, class action liability has rarely been established for non-users. If the company does not know who a person is, courts ask how that person could be a class member, he said.

Claims Under State Consumer Protection Laws

“These settlements are not missed by the plaintiffs’ bar,” Rosenthal noted, which can create a “snowballing effect.”

Google's earlier settlement with the 40 state AGs, in a case brought under longstanding consumer fraud laws and concerning location data, did not include a release against lawsuits by citizens, Dunn pointed out. "The use of the consumer protection statutes by state AGs should remind companies to consider the risk of private suits by consumers under such laws," he cautioned. "Many have available statutory damages" that make privacy claims viable for class actions and contingency cases. In New York, for example, statutory damages are $550 per violation.

Companies’ use of location data is coming under more state-level pressure, Dunn highlighted. California legislators on May 7, 2025, advanced new opt-in requirements around location data, and added restrictions on sharing and retaining location data even with a person’s consent.

The state AGs’ scrutiny of AI practices is increasing, Dunn noted. This creates a crossroads for investigation. “Geolocation and biometrics collection and processing is often done using AI tools,” he said.

See “BIPA Decisions Expand Potential Liability: What’s Next in Illinois and Other States?” (Mar. 8, 2023).

China Data Protection Law

Update on Digital Governance in India and China


The approaches of India and China to digital governance have been helping to shape international compliance and enforcement norms. As the two largest economies in Asia, China and India have introduced, and continue to release, new laws and regulations concerning digital governance and trust, imposing comprehensive compliance requirements across data protection, privacy, cybersecurity and AI. Navigating challenges and finding practical solutions for compliance with those laws is essential for businesses. This article synthesizes insights offered during the IAPP Global Privacy Summit 2025 on China’s comprehensive regulatory regime, including its evolving approach to cross-border data transfers, India’s evolving data protection regime and ways for global organizations to navigate disparate local requirements.

The program’s speakers included Barbara Li, a partner at Reed Smith; Maren Charisius, global head of privacy governance at Volkswagen; Scott Livingston, legal counsel at Dell Technologies; and Shivangi Nadkarni, senior vice president and general manager at Persistent Systems.

See “Navigating Recent Changes to China’s Data Privacy Laws in Internal Investigations” (Jun. 19, 2024).

China’s Comprehensive Legal Regime

China’s rules are relatively new compared to other regimes like the GDPR, but are evolving rapidly, explained Li. Starting with its 2016 Cybersecurity Law, China has developed a very comprehensive legal regime, which also includes a Data Security Law and the Personal Information Protection Law (PIPL).

Although the PIPL is sometimes referred to as “China’s GDPR,” it has unique legal principles, which distinguish it from the GDPR. “Digital trust and governance rules and compliance requirements actually are very closely interwoven into China’s strategy for economic development,” continued Li. In that regard, China has the world’s largest e‑commerce market, the greatest number of smartphone users and significant AI capabilities. In addition to the three national laws, there are hundreds of thousands of secondary and tertiary regulations, as well as both voluntary and mandatory industry standards and guidelines.

See “Navigating China’s Cybersecurity Regulatory Maze” (Aug. 12, 2020).

Robust Enforcement

"Chinese laws and regulations indeed do have teeth," said Li. Violators are subject to substantial fines and penalties. For example, in 2022, the ride-hailing company Didi was fined $1.2 billion for failing to comply with applicable laws in collecting and sharing personal information from drivers and passengers. Its CEO and chairman were also fined personally. Additionally, Chinese enforcement authorities have been very active in the past two years. For example, on March 15, 2025 – Chinese Consumer Rights Day – many international and domestic companies were accused of failing to comply fully with digital governance requirements, resulting in loss of trust and business.

See “The Importance of Being a PIPL Pleaser: Update and Predictions on China’s Data Protection Law One Year In” (Dec. 14, 2022).

Cross-Border Data Transfers

China does not recognize the adequacy of another nation’s regime as a basis for cross-border transfers, explained Li. There are only three mechanisms for outbound data transfers:

  • application to the Cyberspace Administration of China (CAC) for review, which entails substantial documentation, including an impact assessment report and organizational and technical information;
  • use of standard contractual clauses, which are based on CAC requirements; and
  • certification by qualified third parties.

The first two approaches are more common because there are not many local licensed certification institutions, noted Li. However, they are time-consuming and have comprehensive documentation requirements.

In March 2024, the CAC relaxed its cross-border data transfer requirements, continued Li. It exempted many transfer scenarios and eased the approval process and documentation requirements for others. For example, a Chinese subsidiary can now transfer employee data to an overseas parent without complying with those mechanisms. Of course, the company must still comply with Chinese employment requirements. Consequently, when contemplating cross-border transfers, organizations must consider all applicable laws, not just the three data-related laws. China has also created “free trade zones” that facilitate cross-border transfers.

When contemplating data transfers out of China, organizations should start with data mapping, advised Li. They should assess the risk associated with the transfer and whether the transfer is necessary at all. For example, for high-risk scenarios, companies might consider a local solution – storing data in China – with a vendor providing the requisite services in China.

See “China’s First Information Protection Law: Compliance Essentials” (Oct. 13, 2021).

Important Data

China imposes additional restrictions on cross-border transfers of so-called “important data,” which is “the next phase of data privacy law,” according to Livingston. Privacy practitioners are used to protecting PI, but “important” data is subject to the same or more stringent protections as those for PI. Managing such data will entail a new regulatory workstream. The Vietnamese data law, which was passed in November 2024, incorporates the concept of important data. “Japan also has a new law on sensitive economic and security information, which is basically important data,” he added. More countries are likely to follow suit.

Impact on U.S. Law and Regulation

Chinese and U.S. laws actually have a great deal in common, according to Livingston. China has been good at studying the models used by other nations and adapting those models to its own system. For example, China’s PIPL is heavily influenced by the GDPR. Its approach to important data is based on the U.S. approach to controls over classified information. Similarly, its rules on critical infrastructure are informed by U.S. practices.

Influences go both ways. A decade ago, a common perspective was that “you can’t compete with China by becoming China,” according to Livingston. Now, the U.S. government’s actions appear to be influenced by China’s approach, “even if they’re not admitting this.” The Protecting Americans’ Data from Foreign Adversaries Act represents the first time the U.S. has regulated cross-border data flows from the perspective of national security.

China’s state media and some law firms initially did not like the U.S.’s new regulation of cross-border data flows – but they also acknowledged, “This is what we’ve been saying all along. There’s certain sensitive data that is important for national security concerns and should be protected. That’s why we have data localization. That’s why we push data sovereignty,” reported Livingston. China is also trying to position itself as the “better international player,” claiming that, while the U.S. is tightening its own rules, China is trying to reduce its very stringent data protections through free trade zones and regulatory changes.

See our two-part series on the DOJ guidance on bulk sensitive data rules: “Enforcement Grace Period and Prohibited Transactions” (May 7, 2025), and “Compliance Program, Recordkeeping and Reporting” (May 14, 2025).

India’s Slowly Evolving Regime

“India is still evolving,” said Nadkarni. In 2000, it adopted the Information Technology Act, an omnibus act covering a wide range of issues, including cybercrime prevention and protection of sensitive data. In 2023, after years of working on data privacy legislation, India adopted the Digital Personal Data Protection Act (DPDPA).

Unlike many other laws, the DPDPA does not categorize data as sensitive or non-sensitive, continued Nadkarni. Instead, it categorizes organizations based on the complexity and risk associated with the types of personal data they process. The DPDPA also uses terminology that is different from other laws. For example, data controllers are referred to as “data fiduciaries.” Other notable provisions of the DPDPA include:

  • additional requirements for organizations that process children’s data;
  • generally free cross-border data flows, except with respect to blacklisted countries, subject to the government’s potential authority to impose norms for certain types of data; and
  • authority for licensed third-party consent managers to facilitate giving and withdrawing consent by data subjects (referred to in the law as “data principals”).

The implementing regulations for the DPDPA have yet to be adopted. That could come as soon as the summer of 2025, when Parliament is in session again. The regulations will probably have an implementation and compliance period of 18 to 22 months, predicted Nadkarni. Enforcement of the new regulations likely will be stringent, with an early focus on larger companies.

The Indian government’s approach to governance has been referred to as an “India way of doing things,” reflecting the nation’s 1.4 billion people and the large amount of associated data, continued Nadkarni. It also appears that India will not enact a separate law for AI because existing laws are considered sufficient. In certain areas, including banking, financial services, insurance and telecom, regulators are expected to be more active. She expects regulatory action and “downstream guidance.” There is also considerable self-regulatory activity.

Although there have been some changes to India's rules for cross-border transfers, the regime is less strict than China's, added Nadkarni.

See “An Analysis of the Liberal and Strict Provisions in India’s New Privacy Law” (Sep. 6, 2023).

Standard Setting

Standards have become a key component of regulatory compliance, observed Livingston. There has been “a lot of hand-wringing about China’s standard setting,” he said. China has a very robust government-led standard-setting system, which is heavily influenced by industry and academia. Those standards are now influencing international standards. Consequently, there are growing calls in the U.S. for renewed U.S. leadership in this area. In that regard, U.S. Senators Marsha Blackburn and Mark Warner have introduced the Promoting United States Leadership in Standards Act, which calls for additional input from government and academia and additional government coordination and funding, explained Livingston.

Reconciling Global Requirements

Rules in India and China are evolving rapidly, with unique local requirements, said Li.

Recognizing Different Legal Bases for Data Collection

"Legitimate interest" can serve as a legal basis for data collection under the GDPR. It is not recognized as a legal basis in China, however, where an organization must obtain consent for collection.

Additionally, when cross-border transfers are involved, Chinese government authorities always ask for evidence of consent from the affected individuals. China’s laws specify some exceptional cases, such as legal duties and public emergencies, where explicit consent for data collection is not required, but for most practical purposes, consent is still required.

There is also a strong emphasis on consent in India, Nadkarni concurred. There is no general “legitimate interest” basis or other use cases where data may be collected without obtaining consent, she highlighted.

Starting With a Data Inventory

To tackle the challenges of different legal requirements, organizations should start with a global framework that includes a data inventory. They can then tailor the approach depending on what data is exposed to the GDPR or another specific regime.

E.U. companies expanding into Asia should assess their intergroup contracts, advised Charisius. They should adopt an overarching concept that is flexible enough to add specific local benefits.

Choosing a Baseline Compliance Framework

Regulatory compliance in China has become significantly more difficult in the last few years, Livingston noted. There are new processes for personal information audits, data classification and cross-border transfers. Most companies have used the GDPR as a global baseline for compliance – but things in China and other jurisdictions are becoming more complex. Consequently, some organizations are considering entirely separate frameworks for China.

The local regulatory landscape in Asia has been very active in the past two years, said Li. Vietnam’s new rules mirror those of China. Companies are taking varying approaches to managing global and local differences. Some companies tailor approaches to groups of countries with similar regulatory regimes. There are also industry-specific considerations. For example, the automotive industry is subject to particularly strict regulation in China. Additional challenges in China include the fact that rules are issued only in Chinese and have short comment periods and compliance deadlines.

Leveraging Local Resources

Organizations need local teams to interpret the regulations: not just to translate them, but also to understand the rationale for them, industry practice and how they will be implemented in practice, suggested Li. They should also engage with local regulators and stakeholders and take advantage of chambers of commerce and other industry associations. Industry groups, such as the consultancy Trivium and the U.S. Information Technology Office, are also helpful for tracking local developments and policy, added Livingston.

Big Data

Survey Finds Investment Managers Increasing Use of Alternative Data


“Alternative data (alt data) has become a mainstream source of insights for portfolio managers and analysts worldwide, with 100% of respondents confirming its integration into their investment processes,” according to a new survey by alt data platform BattleFin and its newly acquired alt data analytics platform, Exabel. As alt data becomes part of mainstream investing, investment professionals, including those at hedge funds, are seeking new data sources and enhanced analytical capabilities. BattleFin’s survey assessed demand for alt data, data sources, key challenges associated with using alt data, resources, budgeting and management commitment to alt data. This article details the key findings from the BattleFin survey.

See “AI Drives Rise in Private Funds’ Use of Alternative Data” (Mar. 20, 2024).

Survey Methodology and Demographics

In January 2025, BattleFin retained a market research firm to survey 130 investment professionals from discretionary public equity funds; approximately three-fifths were portfolio managers and two-fifths were investment analysts. Half of the respondents were based in the U.S., with the other half based across the U.K. (30%), Singapore (11%) and Hong Kong (9%).

Respondents reported an aggregate $820 billion in assets under management (AUM), with nearly three-quarters from firms with between $1 billion and $10 billion in AUM. Most of the rest (23%) had less than $1 billion in AUM. Just 4% reported more than $10 billion in AUM. Hedge funds accounted for a plurality of respondents (42%), followed by asset managers and mutual funds (22%), pensions (19%) and family offices (17%).

BattleFin’s research report defines alt data as “non-traditional data sets that investors use to support their investment strategies.” It may include:

  • credit card transactions;
  • data from mobile devices;
  • satellite imagery;
  • social media data; and
  • website traffic.

Use of Alt Data

Virtually all respondents (98%) believe “traditional data/official figures are becoming too slow in reflecting changes in economic activity,” noted BattleFin. Consistent with that sentiment, all respondents said they use alt data to support investment research. Most (70%) began using alt data at least three years ago, including 9% that began using it more than five years ago. Most of the remaining 30% began using alt data between one and three years ago, with just 2% having embraced it in the past year.

Two-thirds of respondents use between five and ten datasets. Most of the rest use two, three or four, while a handful use more than ten. Respondents were overwhelmingly satisfied with the process of using alt data, with 25% describing the process as “excellent” and 62% describing it as “good.” Just 13% described it as “average.”

Virtually all respondents (95%) obtain from 25% to 75% of their alt data from third-party vendors. Additionally, nearly three-quarters of respondents expect the proportion of alt data they source from third-party vendors to increase over the next three years.

See “Use of Alternative Data Continues to Grow, Says New Survey” (Mar. 29, 2023).

Increasing Demand for Alt Data

Most respondents (86%) expect their overall use of alt data to increase over the next two years. The rest expect it to remain the same as today. However, their appetite for different types of alt data varied.

BattleFin asked respondents whether their use of nine types of alt data would stay the same or increase “slightly” or “dramatically.” Just over half of respondents expect their use of geolocation and/or consumer spending data to increase dramatically. Nearly as many expect dramatic increases in their use of web-scraped data (45%) and social media sentiment data (41%). Fewer expect dramatic increases for:

  • logistics and shipping data (36%);
  • app and web data (34%);
  • satellite and weather data (32%);
  • market sentiment (32%); and
  • employment data (27%).

Consistent with those findings, three-quarters of respondents expect consumer spending data to provide an outsized informational edge in the near future. Half expect web, mobile and app data and/or natural language processing and sentiment data to provide an edge. Nearly as many cited social listening (45%) and/or employment data (43%). On the other hand, just 7% think satellite data will provide an outsized edge.

Key Challenges

The most prevalent challenge in working with alt data – cited by 79% of respondents – involves combining data from different sources into a single prediction. Nearly half cited challenges with processing raw data into a usable format (48%) and/or insufficient technical resources, including data engineers (45%). A smaller proportion cited using multiple interfaces to combine datasets (42%), comparing different datasets that are similar (39%) and/or mapping data to entities (34%).

Asked to identify the single “most significant” barrier to extracting value from alt data, respondents cited:

  • difficulty prioritizing the large volume of alt data (44%);
  • poor data quality (39%); and
  • difficulty integrating alt data into systems (12%).

Just 3% said data was not in a usable format, and 2% cited a lack of data analysts or other internal resources.

See “Key Compliance Considerations for Fund Managers Using Alternative Data” (Jan. 15, 2020).

Resources and Management Commitment

Data Purchases and Software/Technology

Purchasing data and acquiring software and technology account for a significant portion of respondents’ alt data budgets. More than four-fifths of respondents spend between 25% and 75% of their budgets on purchasing data, including 62% that spend between 25% and 50% and 19% that spend 50% to 75%.

Similarly, 71% of respondents spend more than 25% of their alt data budgets on software and technology, including 45% that spend from 25% to 50% and 26% that spend 50% to 75%. Most of the remaining respondents spend between 10% and 25% on purchasing data and/or software and technology.

Headcount

Most respondents (94%) spend up to 50% of their alt data budgets on headcount: nearly half spend between 10% and 25%, 33% spend between 25% and 50%, and 12% spend up to 10%.

Management Commitment

Eighty-four percent of respondents said their firms have a dedicated alt data lead. Additionally, virtually all said their senior management is either “quite committed” (77%) or “very committed” (21%) to using alt data in the investment process.

Budgets Continue to Increase

Roughly four-fifths of respondents said that, over the past two years, their budgets for acquiring and managing alt data have grown by at least 25%, including 36% who saw an increase of between 25% and 50% and 43% who saw an increase of between 50% and 75%. Additionally, 85% expect their 2025 budgets to be higher than in 2024, consisting of 33% who expect a substantial increase and 52% who expect a modest increase. Thirteen percent expect budgets to be flat, and just 1% expect them to decrease.

Alt Data Technology

BattleFin asked respondents about the technology they use to analyze alt data. More than half said they use one or more of the following:

  • third-party software (66%);
  • basic software such as Excel (59%);
  • systems provided by data vendors (51%); and
  • software developed in-house (51%).

Third-party software will likely remain an integral part of firms’ efforts to analyze alt data. In that regard, 85% of respondents said that, over the next five years, their use of third-party software for alt data analysis will increase either slightly (70%) or dramatically (15%). The remaining 15% expect their use to stay the same.

One of the primary drivers of the anticipated increase is the belief that third-party software is more economical than in-house software, cited by 87% of respondents. Other important drivers are the ability to work with different types and sources of data (62%) and data analysis that is superior to in-house systems (52%).

See our two-part series on how to manage AI procurement: “Leadership and Preparation” (Sep. 18, 2024), and “Five Steps” (Oct. 2, 2024).

People Moves

Former DHS Senior Official Joins Ropes & Gray’s Data, Privacy & Cybersecurity Practice

Ropes & Gray has welcomed Robert Silvers, former Under Secretary for Policy at the U.S. Department of Homeland Security (DHS), as co‑chair of the firm’s national security practice and a partner in its data, privacy & cybersecurity practice. He will be based in Washington, D.C., and Silicon Valley.

Silvers focuses on critical matters at the intersection of national security, technology and law, advising on cybersecurity, privacy, trade controls, AI security and governance, and crisis management. He navigates clients through parallel proceedings that accompany security issues, including internal and government-facing investigations, regulatory inquiries, litigation defense, and incident response and management. He also advises boards of directors, senior officers and fund managers.

At DHS, Silvers was responsible for driving the agency’s direction across all its missions, including cybersecurity, the Committee on Foreign Investment in the United States, AI security, international affairs and the Forced Labor Enforcement Task Force. He led the “nerve center” office for DHS and tackled some of the country’s most pressing security issues. He also oversaw efforts to bring the federal government and private sector together to respond to significant cybersecurity incidents, and to streamline the reporting of cyber incidents to better protect critical infrastructure across the country. During his tenure as Under Secretary, Silvers served as founding Chair of the U.S. Cyber Safety Review Board, driving investigations into the most severe cyberattacks.

Earlier in his career, Silvers served as Assistant Secretary for Cyber Policy at DHS and in other senior U.S. government positions. He was also a partner at Paul Hastings, where he served as vice chair of its privacy & cybersecurity practice, and founder and chair of its AI practice, working with some of the most recognizable names across the technology, financial services and healthcare sectors.

For commentary from Silvers, see “Four Steps to Secure Open-Source Software After CSRB’s Log4j Investigation” (Sep. 7, 2022).

For insights from Ropes & Gray, see our two-part series “DOJ Guidance on Bulk Sensitive Data Rules”: Enforcement Grace Period and Prohibited Transactions (May 7, 2025), and Compliance Program, Recordkeeping and Reporting (May 14, 2025).

Foley Welcomes Cybersecurity, Privacy and Technology Partner Duo to Chicago

Daniel Farris and Joe McClendon have joined Foley & Lardner as partners in the firm’s technology transactions, cybersecurity and privacy practice as well as its energy & infrastructure and innovative technology sectors in Chicago. The pair arrives from Norton Rose Fulbright and brings deep experience in cybersecurity and data privacy, data center operations and development, and complex technology transactions.

Farris advises clients on a broad spectrum of technology, privacy, cybersecurity and compliance matters, with a focus on complex data center and digital infrastructure transactions. He counsels on technology licensing and regulatory compliance, including U.S. and international standards such as HIPAA and the GDPR. He represents clients in industries such as technology, healthcare, banking, telecommunications and retail. Prior to joining Foley, Farris was the partner in charge of Norton Rose Fulbright’s Chicago office.

McClendon focuses his practice on privacy, cybersecurity and technology transactions, with an emphasis on developing compliance programs aligned with U.S. and international regulations, including the GDPR and HIPAA. He also advises on information governance, disaster recovery planning and intellectual property matters. He applies his technical background as an IT project manager to address legal and regulatory challenges in technology-driven initiatives. Most recently, McClendon was a partner at Norton Rose Fulbright.

For commentary from Farris, see our three-part series on the keys to encryption: “Uses and Implementation Challenges” (Mar. 4, 2020), “Legal and Regulatory Framework” (Mar. 11, 2020), and “Effective Policies, Legal’s Role and Third Parties” (Mar. 18, 2020).

For insights from Foley, see “Cyber Risks in Aviation: Navigating Turbulent Skies Ahead” (Mar. 22, 2023); and “Drafting Data and Cybersecurity Provisions in Third-Party Vendor Agreements” (Mar. 30, 2022).