Illinois Supreme Court Rules No Actual Harm Required for BIPA Lawsuit
The Illinois Supreme Court last month upheld a consumer’s right to sue for a violation of the Illinois Biometric Information Privacy Act (BIPA) without establishing actual harm – an injury or adverse effect separate from the violation. The Court held that a procedural violation of the law is enough to support a case under BIPA’s private right of action.
The decision underscores the importance of businesses improving their data privacy practices and strengthening how they gather consent and notify individuals about data collection, use, sharing and retention, not just on their websites but also in other activities such as facial recognition. More than 200 BIPA lawsuits have already been filed, and the statute provides for liquidated damages of $1,000 per violation by a negligent private entity, or $5,000 for each intentional or reckless violation.
BIPA requires businesses to get consent before collecting biometric scans. They must also provide a written explanation of how the data will be used and how long it will be retained. Although privacy is often treated as an issue for social media and other websites, businesses need to be careful in other activities as well, as both the European Union’s General Data Protection Regulation (GDPR) and Illinois’ BIPA demonstrate. The lawsuit before the Illinois Supreme Court arose from Six Flags taking a teenager’s fingerprint for access to an annual pass without the written notice required by the law.
The question of whether an individual who has suffered a privacy violation is entitled to damages without establishing a quantifiable injury or other specific loss has been a hot topic over the past few years and an important one in privacy litigation. Before the opinion from the Illinois Supreme Court, the appellate court in the case had relied on a textual interpretation of the word “aggrieved” in the statute to hold that a technical violation of the law was not sufficient to pursue the private right of action. That decision set up a conflict among the lower courts, as a different Illinois appellate court had previously held that no harm beyond the violation of BIPA was required.
The Illinois Supreme Court was unanimous in its decision that a plaintiff who alleges only a technical violation of the statute is an aggrieved person under BIPA. The Opinion of the Court indicated that a violation of the statute eliminates an individual’s right to maintain his or her biometric privacy, and the injury from it “is real and significant.” It also said: “To require individuals to wait until they have sustained some compensable injury beyond violation of their statutory rights before they may seek recourse … would be completely antithetical to the Act’s preventative and deterrent purposes.”
A similar debate is happening at the federal level with respect to BIPA and Article III standing. Separately, the US Supreme Court is considering the injury from a loss of privacy as part of its review of whether there is Article III standing in Frank v. Gaos. The case was originally argued in October 2018 on the question of whether a class action settlement with a cy pres award but no direct relief to class members met the fairness requirement for a settlement. The Justices subsequently requested additional briefing on the standing issue, which had been raised in the amicus curiae brief filed by the Solicitor General on behalf of the United States. The United States argued that the dissemination of the individual’s search queries did not cause an injury-in-fact that was both concrete and particularized, and thus there was no Article III standing for a lawsuit in federal court. The US Supreme Court has yet to weigh in.
As states consider new data privacy laws, biometric data and facial recognition technology are likely to be an area of significant controversy. More states will probably adopt laws similar to BIPA that include a private right of action.
A New York City bill proposed last year would require every business to clearly notify people when facial recognition is in use, along with details about the storage, retention period and use of the data. Similar to BIPA, the bill provides a private right of action so individuals can bring lawsuits against companies for violations of the law. New York State has also previously considered legislation to protect biometric information, called the Biometric Privacy Act.
The California legislature included biometric data within the definition of personal information in the law that will go into effect in 2020. The California Consumer Privacy Act (CCPA) also has a private right of action, though it is limited to data breaches that occurred because reasonable security practices were not in place. The law specifies that consumers are entitled to statutory damages of between $100 and $750 per person per incident, and it does not contain the “aggrieved” person language that fueled the controversy over Illinois’ BIPA.
San Francisco is also considering its own facial recognition law, called the Stop Secret Surveillance Ordinance, although it would apply only to the city’s own government. It would require City departments, including law enforcement, to get approval from the Board of Supervisors before buying or using surveillance technology. It would cover not only facial recognition but voice, iris and gait recognition as well. Annual audits to ensure the technology was used properly would also be required.
Businesses collecting biometric information need to take a close look at their consent and disclosure notice practices and systems to make sure they are ready to comply with the changing requirements.
More Resources:
Read the resources Clarip has posted on the California Consumer Privacy Act (CCPA) and contact us to see a demo of the Clarip privacy management platform used by Fortune 500 clients.