Analysis of Biometric Privacy Litigation Cases

Biometric privacy litigation has increased significantly in recent years. The following cases reflect the evolving landscape of biometric and data privacy litigation in California courts, highlighting the challenges and considerations in applying privacy laws across different jurisdictions.

1. Clark v. Yodlee, Inc. (2024)

Court: U.S. District Court for the Northern District of California
Summary: Plaintiffs alleged that Yodlee collected, stored, and used consumers’ bank transaction data and account information without consent, violating their privacy rights. The court denied class certification, citing lack of Article III standing and failure to meet Rule 23’s typicality and adequacy requirements.

This is a notable data privacy case in the U.S. District Court for the Northern District of California, involving allegations against Yodlee, Inc., a financial data aggregation company.

Case Background

In August 2020, plaintiffs filed a putative nationwide class action against Yodlee, asserting that the company unlawfully collected, stored, and utilized bank transaction data and account information from consumers who used Yodlee’s Instant Account Verification (IAV) service to link bank accounts to various financial applications. The plaintiffs alleged (1) violations of California’s Anti-Phishing Act (CAPA); (2) invasion of privacy under the California Constitution and common law; and (3) unjust enrichment.

By the class certification stage, the case had been narrowed to five plaintiffs asserting these claims.

Class Certification Denial

In September 2024, the court denied the plaintiffs’ motion for class certification, concluding that the proposed class representatives lacked Article III standing and failed to satisfy Rule 23’s typicality and adequacy requirements. Specifically, the court found no evidence that Yodlee had collected bank transaction data for any of the proposed class representatives, which was central to the plaintiffs’ claims and their compensatory damages theories. Consequently, the court determined that the claims of these representatives were not reasonably co-extensive with those of absent class members.

Summary Judgment Rulings

Following the denial of class certification, Yodlee moved for summary judgment on the remaining individual claims. In February 2025, the court largely granted Yodlee’s motion and determined that:

1. Two of the five plaintiffs lacked Article III standing, as they had used the IAV service only after initiating the lawsuit.
2. The remaining three plaintiffs failed to provide evidence that Yodlee collected their bank transaction data or wrongfully used IAV data for its own benefit, leading to the dismissal of their unjust enrichment claims.
3. The CAPA claims were rejected due to a lack of evidence showing that the plaintiffs suffered any adverse effects from Yodlee’s alleged violations.
4. The court limited the invasion of privacy claims, concluding that only the allegations regarding Yodlee’s storage of bank login credentials presented a genuine issue for trial.

Implications

The rulings in Clark v. Yodlee, Inc. underscore the critical importance of demonstrating concrete harm and a direct connection between the defendant’s actions and the plaintiffs’ alleged injuries in data privacy litigation. The decisions highlight the challenges plaintiffs face in achieving class certification and surviving summary judgment without substantial evidence of harm. This case serves as a significant reference for future data privacy lawsuits, particularly those involving financial data aggregation services.

2. Zellmer v. Meta Platforms, Inc. (2024)

Court: U.S. Court of Appeals for the Ninth Circuit
Summary: The plaintiff claimed Facebook’s “Tag Suggestions” feature violated the Illinois Biometric Information Privacy Act (BIPA) by creating “face signatures” without consent. The Ninth Circuit held that “face signatures” are not considered biometric information under BIPA, leading to the dismissal of the claims.

This is a significant case in biometric privacy law, addressing the application of the Illinois Biometric Information Privacy Act (BIPA) to non-users of a platform.

Case Background

Plaintiff Clayton Zellmer, an Illinois resident who never registered for or used Facebook, filed a class-action lawsuit against Meta Platforms, Inc. (formerly Facebook). Zellmer alleged that Facebook’s “Tag Suggestions” feature violated BIPA by collecting and storing his biometric data without consent. This feature used facial recognition technology to analyze photos uploaded by users and suggest tags by creating “face signatures,” numerical representations of detected faces. Zellmer’s claim centered on photos of him uploaded by friends to Facebook, from which he alleged Facebook generated his face signatures without his consent.

Legal Issues

Zellmer’s lawsuit focused on two primary sections of BIPA:

1. Section 15(b): Requires private entities to inform individuals in writing and obtain their consent before collecting or storing their biometric identifiers or information.

2. Section 15(a): Mandates that private entities develop and publicly disclose a written policy outlining retention schedules and guidelines for permanently destroying biometric data.

Zellmer contended that Facebook violated these provisions by creating face signatures from his images without his consent and without a publicly available data retention policy.

District Court Proceedings

The U.S. District Court for the Northern District of California granted summary judgment in favor of Facebook on the Section 15(b) claim. The court reasoned that requiring Facebook to obtain consent from non-users before processing their images was impractical, given the lack of any direct relationship between Facebook and non-users. Regarding the Section 15(a) claim, the court dismissed it for lack of Article III standing, concluding that Zellmer did not demonstrate a concrete and particularized injury resulting from Facebook’s alleged failure to publish a data retention policy.

Ninth Circuit Appeal

On appeal, the Ninth Circuit affirmed the district court’s rulings, though in part on different grounds:

1. Section 15(b) Claim: The appellate court focused on whether Facebook’s “face signatures” qualified as biometric identifiers or information under BIPA. The court concluded that face signatures could not be used to identify an individual on their own and therefore did not meet BIPA’s definitions of biometric identifiers or information. Consequently, the creation of face signatures from Zellmer’s photos did not constitute a BIPA violation.

2. Section 15(a) Claim: The Ninth Circuit agreed with the district court that Zellmer lacked standing. It held that the duty to publish a data retention policy is a general obligation owed to the public, and Zellmer failed to show how he suffered a concrete and particularized injury from Facebook’s alleged non-compliance.

Implications

The Zellmer decision provides clarity on the scope of BIPA, particularly regarding:

Applicability to Non-Users: The court acknowledged that BIPA protections extend to non-users. However, it emphasized that for data to be considered a biometric identifier under BIPA, it must be capable of identifying an individual. In this case, face signatures did not meet that criterion.

Definition of Biometric Identifiers: The ruling underscores that not all data derived from facial recognition technology falls under BIPA’s purview. Only data that can independently identify a person qualifies as a biometric identifier or information. This case sets a precedent for how courts may interpret BIPA’s definitions and its applicability to non-users, potentially influencing future biometric privacy litigation.

3. Renderos v. Clearview AI (2023)

Court: Superior Court of California, County of Alameda
Summary: Plaintiffs accused Clearview AI of using facial recognition technology to collect and store individuals’ biometric data without consent, in violation of California’s constitutional right to privacy, common law, and the state’s Unfair Competition Law. This is a pivotal case addressing biometric privacy in a jurisdiction without a biometric-specific statute, and the court’s rulings may serve as a roadmap for future biometric privacy claims brought under general privacy and consumer protection laws.

Case Background

In 2021, plaintiffs Steven Renderos and others filed a class-action lawsuit against Clearview AI in the Superior Court of California, County of Alameda. Plaintiffs alleged that Clearview AI unlawfully scraped billions of facial images from publicly accessible websites and social media platforms without individuals’ consent. These images were used to create a comprehensive facial recognition database, which Clearview AI then sold to various private entities and law enforcement agencies. Plaintiffs asserted that this practice violated their rights under: (1) California’s constitutional right to privacy; (2) Common law prohibitions against unauthorized use of a person’s likeness; and (3) California’s Unfair Competition Law (UCL). Plaintiffs contended that Clearview AI’s actions constituted an invasion of privacy and unauthorized commercial exploitation of their biometric data.

Legal Proceedings

In November 2022, the Alameda County Superior Court addressed Clearview AI’s motion to dismiss the case. The court denied the motion, allowing the plaintiffs’ claims to proceed. This decision was significant as it recognized that, even in the absence of specific biometric privacy legislation like the Illinois Biometric Information Privacy Act (BIPA), existing California laws could provide a basis for addressing biometric privacy violations. The court’s ruling suggested that common law privacy rights, the California Constitution, and the UCL could collectively offer protection against unauthorized use of biometric data.

Clearview AI attempted to invoke California’s anti-SLAPP (Strategic Lawsuit Against Public Participation) statute, arguing that its collection and use of facial images were protected speech under the First Amendment. The court rejected this argument, distinguishing between protected expressive activities and Clearview AI’s commercial use of individuals’ biometric data without consent. The court emphasized that Clearview’s actions constituted an invasion of privacy rather than protected speech.

Amicus Briefs

Several organizations, including the American Civil Liberties Union of Northern California (ACLU-NC), the Electronic Privacy Information Center (EPIC), and the Tech Justice Law Project, filed amicus briefs supporting the plaintiffs. These briefs argued against Clearview AI’s claims of First Amendment protection and highlighted the broader implications of allowing companies to exploit biometric data without consent. The ACLU-NC, for instance, emphasized that Clearview’s practices were invasive surveillance tactics not shielded by free speech protections.

Implications and Significance

The Renderos v. Clearview AI case has far-reaching implications for biometric privacy litigation, particularly in jurisdictions lacking specific biometric privacy statutes. The court’s willingness to allow claims based on constitutional and common law principles indicates that companies collecting and using biometric data without consent may face legal challenges grounded in broader privacy rights. This case serves as a potential roadmap for future plaintiffs seeking to protect their biometric information through existing legal frameworks.

Moreover, the court’s rejection of Clearview AI’s First Amendment defense underscores the principle that commercial exploitation of personal data, especially biometric information, does not constitute protected speech when it infringes upon individuals’ privacy rights. This distinction is crucial in balancing technological advancements with fundamental privacy protections.

As of early 2025, the case continues to influence discussions on biometric data regulation and the legal responsibilities of companies operating in this space. It highlights the evolving landscape of privacy law and the increasing recognition of the need to safeguard individuals’ biometric information from unauthorized commercial use.

4. Gullen v. Facebook, Inc. (2018)

Court: U.S. District Court for the Northern District of California
Summary: The plaintiff alleged that Facebook’s facial recognition technology collected his biometric identifiers without notice or consent, violating BIPA. The court granted summary judgment for Facebook, finding no evidence that the photo at issue had been subjected to facial recognition analysis. The case illustrates both the evidentiary burden BIPA plaintiffs bear and the complexities of applying BIPA to non-users of a platform.

Case Background

In 2015, Frederick Gullen, an Illinois resident who did not have a Facebook account, filed a class-action lawsuit against Facebook, Inc. Gullen alleged that Facebook’s “Tag Suggestions” feature violated BIPA by collecting and storing his biometric data without consent. This feature used facial recognition technology to analyze photos uploaded by users and suggest tags by creating “face templates,” numerical representations of detected faces. Gullen’s claim centered on a photo of him uploaded to an organizational Facebook page, from which he alleged Facebook generated his face template without his consent.

Legal Issues

Gullen’s lawsuit focused on two primary sections of BIPA:

1. Section 15(b): Requires private entities to inform individuals in writing and obtain their consent before collecting or storing their biometric identifiers or information.

2. Section 15(a): Mandates that private entities develop and publicly disclose a written policy outlining retention schedules and guidelines for permanently destroying biometric data.

Gullen contended that Facebook violated these provisions by creating a face template from his image without his consent and without a publicly available data retention policy.

District Court Proceedings

The U.S. District Court for the Northern District of California initially denied Facebook’s motion to dismiss, allowing the case to proceed. However, on Facebook’s motion for summary judgment, the court ruled in Facebook’s favor, finding no evidence that the photo uploaded to the organizational page had been subjected to facial recognition analysis. Declarations from Facebook employees and internal emails indicated that, at the time the photo was uploaded, facial recognition was turned off for organizational pages. The court therefore concluded that Gullen’s biometric data had not been collected or stored, and it dismissed his claims.

Ninth Circuit Appeal

Gullen appealed the district court’s decision to the Ninth Circuit Court of Appeals. In 2019, the Ninth Circuit affirmed the lower court’s ruling, agreeing that there was insufficient evidence to show that Facebook had subjected the specific photo in question to facial recognition analysis. The appellate court emphasized that the evidence presented did not support the claim that the photo uploaded to the organizational page was processed in violation of BIPA.

Implications

This case highlights several important aspects of biometric privacy litigation:

Application to Non-Users: The case addressed whether BIPA protections extend to individuals who are not users of the platform in question. While the court did not make a definitive ruling on this broader issue, the case underscores the complexities involved when non-users’ data is implicated.

Evidence of Data Collection: The decision underscores the necessity for plaintiffs to provide concrete evidence that their biometric data was collected or analyzed without consent. Mere allegations without substantiating evidence are insufficient to sustain claims under BIPA.

Platform-Specific Features: The case illustrates how platform-specific settings and features, such as the disabling of facial recognition on organizational pages, can impact the applicability of biometric privacy laws.

In conclusion, these cases serve as reference points for future biometric privacy litigation, particularly concerning the scope of BIPA’s applicability, the viability of claims under general privacy and consumer protection laws, and the evidentiary standards required to support such claims. Please contact our law firm to speak with a qualified internet and technology attorney regarding your questions.