By: The Michelson Institute for Intellectual Property

Executive Editor: Nathan Mutter, Holland & Hart LLP, IPO Education Foundation


The convergence of artificial intelligence (AI) and biometrics is reshaping our world. While these technologies open new frontiers of innovation, they also expose us to significant privacy challenges. Biometric data, which captures our unique physical attributes (like our fingerprints, face, and voice), has become a powerful tool in the AI realm, with applications ranging from smartphone security to customer service. 

While the term “biometric data” is used frequently in today’s news cycle, it is easy to overlook just how prevalent the use and collection of biometric data has become. The global market for biometric data collection is estimated to be approximately $30 billion and is expected to grow to over $76 billion by 2029. From facial recognition used to unlock our phones or pass through airport security, to palm scans to pay for groceries, to voice recognition performed by smart devices such as Amazon Alexa, biometric data collection has become part of everyday life. It is so prevalent that the U.S. Department of Homeland Security (DHS) has collected face, iris, and fingerprint scans of more than 259 million people, which is more than 78% of the U.S. population.


Biometric Data: Benefits

The benefits of biometric data help explain its newfound prevalence in our ever-evolving, fast-paced world:

  • Convenience – because biometric data is inherently part of the individual, people don’t have to remember passwords or PINs, or carry ID cards.
  • Increased security – as compared to traditional security means (e.g., passwords, PINs), biometric data is harder (but not impossible) to hack or impersonate.


Biometric Data: Risks

Yet, the same factors that make biometric data so useful also create substantial concerns, and the mishandling of biometric data can have devastating implications. Biometric information acts as the key to your digital identity. Once compromised, it opens doors to an array of potential threats:

  • Identity theft – unlike passwords, you cannot change your biometric data; if cybercriminals manage to access your biometric data, they gain a key that they can use to impersonate you indefinitely.
  • Surveillance and tracking – biometric data can be used for unsanctioned surveillance, leading to a severe infringement on privacy rights. Governments or organizations can misuse this technology to track individuals’ movements, activities, and behaviors without their consent.
  • Deepfakes and misinformation – deepfake technology, which leverages AI and biometric data, may enable cybercriminals to create highly realistic images or videos of people saying or doing things they never did, which can be used to spread misinformation, manipulate public opinion, or even falsely incriminate individuals.


Biometrics is Built on Trust

The use of biometric data is often predicated on a two-way exchange between individuals and third parties: people consent to have their biometric data collected and used, and third parties provide assurances that collected biometric data will only be used for authorized purposes. But what happens when that trust is broken and third parties use biometric data for unauthorized purposes? Further, even if you trust the company collecting your biometric data, can you trust all the partners they share data with? 

As we increasingly entrust our personal biometric data to AI-driven technologies, we grapple with a key question: how do we safeguard our most intimate details in the digital realm? The answer may lie in the complex intersection of data privacy and IP.


Current Regulatory Landscape

Technological frontiers often far outpace the corresponding legal landscape, and the area of biometrics is no exception. Currently, there is no comprehensive federal regulation that governs the collection and use of biometric data, and regulation in this area has been left up to the states. Illinois, Texas, and Washington have all passed legislation that addresses the collection and use of biometric data by private entities. For example, the Illinois Biometric Information Privacy Act (BIPA) requires entities that collect, use, and store biometric data to comply with certain regulations and requirements. BIPA also provides a private right of action allowing individuals to recover damages from entities that collect biometric data without complying with BIPA’s requirements.

Without a comprehensive federal regime that covers biometric data, entities are forced to comply with the patchwork of legislation provided by the states. However, because BIPA was the first comprehensive state legislation to address biometric data, many states have modeled their own legislation after BIPA. Furthermore, at this stage, BIPA is the most restrictive state legislation addressing the collection, use, and storage of biometric data. As a practical matter, rather than tailoring biometric collection policies state by state, some entities have opted to comply with BIPA in every state where they collect and use biometric data, as compliance with BIPA may ensure compliance across all states (at least for now).


An Innovative Approach: Treating Biometrics Like Trade Secrets

Some legal scholars have suggested treating biometric data like intellectual property (IP) using the legal framework for trade secrets. Such treatment could address biometric data collection and provide private rights of action for damages in addition to, or as an alternative to, state and/or federal legislation. Treatment of biometric data under the trade secret framework could reshape the approach to biometric data handling and storage in several important ways:

  •     Much like trade secrets, access to biometric data could be highly restricted. This could mean stringent controls on who can access the data and when, with significant penalties for unauthorized access or use.
  •     Just as trade secrets are protected indefinitely, biometric data protections wouldn’t have an expiration date. This safeguarding could apply even when individuals stop using a service or platform.
  •     If trade secrets are leaked or stolen, businesses can sue for damages. If biometric data were given the same status, individuals would have significant legal recourse if their data were mishandled.

Despite some of the apparent advantages of this innovative approach, some industry experts have expressed concern that the trade secret framework may be ill-suited for protection of biometric data.


Challenges to Enforcing Biometrics as Intellectual Property

First, from a practical standpoint, the obligation to restrict access to trade secret information runs counter to the need of some entities to share and access biometric data to train AI models or for other authorized purposes. Second, because individuals are the ultimate owners of their biometric data, treating a person’s biometric data as a company’s trade secret may subject companies to the obligations of trade secret law without the typical benefit of exclusively owning the underlying IP. Ultimately, rethinking the way we legally categorize and protect biometric data will need to be a part of our evolving relationship with AI and data privacy.

In the race toward the future, it’s important to advocate for and shape regulations that balance innovation with privacy. After all, as we pioneer the next big wave of AI innovation, we must also lead the charge in ensuring it remains fair, equitable, and respectful of the individuals it serves.


Nathan Mutter, Partner at Holland & Hart LLP, is well-versed in both U.S. and foreign patent practice and has significant experience prosecuting patent applications in the U.S., Europe, Japan, China, and Korea. He serves on the Intellectual Property Owners (IPO) Education Foundation Educate & Enable Committee. IPO Education Foundation programs educate about the value of intellectual property protection, and the work of its committees focuses on promoting innovation and creation by, within, and for underrepresented communities. This article was created by the author for the IPO Education Foundation to provide educational background. It should not be construed as providing legal advice or as presenting the view of the IPO Education Foundation.