Federal law enforcement agencies have long used forensic testing to associate physical evidence found at a crime scene with a specific individual. Formerly a manual task, this process is now increasingly automated by forensic analysts using computer-based forensic algorithms. Last week, the U.S. Government Accountability Office (GAO) released a new technology assessment report titled Forensic Technology: Algorithms Strengthen Forensic Analysis, but Several Factors Can Affect Outcomes, which provides an in-depth analysis of the three most common systems: latent fingerprint matching algorithms, facial recognition algorithms, and probabilistic genotyping algorithms that analyze complex mixtures of DNA.
According to the report, while each system has some strengths compared to conventional methods, there are a series of challenges associated with the use of these algorithms, such as poor quality input data, human user error, and bias across demographic groups, including by race and gender. In the report, the GAO makes three policy recommendations around the use of algorithms in order to address the challenges: (1) increasing training for analysts and investigators; (2) developing standards and policies on appropriate use; and (3) increasing transparency around the testing, performance, and use of these algorithms.
While each of these recommendations would improve the status quo, the report overlooks an immediate remedy available to policymakers: ending the abuse of trade secret laws to suppress relevant evidence in criminal cases, which allows vendors to evade scrutiny of law enforcement forensic software systems while infringing the rights of the criminally accused.
The problem of trade secrets
To be sure, the report does recognize that “algorithm developers may not want to divulge proprietary information related to training and testing.” It also identifies the troubling practice of some private vendors that actively block independent scientists from conducting validation studies of their tools by refusing to provide research licenses, all while the vendors simultaneously argue in court that their tools’ outputs are subject to peer review and thus should be admissible evidence against criminal defendants.
But the report fails to mention the one thing that facilitates these abuses and could be changed straightaway. Developers who sell or license forensic algorithms to law enforcement routinely claim a special trade secret entitlement to withhold from criminal defense expert witnesses relevant evidence about how these systems work. As a result, they refuse to disclose key methodological information about these tools to criminal defense teams, even when those teams would agree to a court-supervised protective order designed to safeguard companies’ intellectual property. As a New Jersey appellate court recently observed in becoming the first court in the nation to reject such a privilege, “[w]e are mindful of the important need to maintain the confidentiality of trade secrets . . . But shrouding the source code and related documents in a curtain of secrecy substantially hinders defendant’s opportunity to meaningfully challenge reliability.”
The practice of asserting a trade secrets privilege to block cross-examination by the criminally accused is a misuse of intellectual property law. Trade secret law is designed to protect information from theft by business competitors. It is not supposed to stymie due process or block judicial truth-seeking. Moreover, trade secrets are routinely disclosed under protective orders in civil disputes for trade secret misappropriation—in other words, in circumstances where one business entity is already accusing another of stealing its intellectual property.
In contrast, in criminal cases where the results of forensic algorithms are introduced as evidence, there is no meaningful risk of trade secrets being disclosed to a business competitor. To be sure, criminal defense counsel and defense expert witnesses want to probe and test these algorithms. After all, it is their job to challenge the prosecution’s evidence. But criminal defense advocacy is not the same as theft by a business competitor. Cross-examination is not misappropriation. It is a constitutional and statutory right of the criminally accused that should not be blocked by trade secret law.
The GAO omits a necessary legal perspective
The GAO report may have overlooked this trade secret problem because it brings a largely scientific, rather than legal, perspective to the lack of transparency in law enforcement forensic algorithms. For instance, the report highlights serious problems that lack of transparency creates for scientific peer review and reproducibility, including that most of the studies evaluating DNA classification software have been undertaken by the developers or law enforcement agencies themselves, rather than independent academic researchers. The report also points out that updates to software may require new internal validation studies, the omission of which may be obscured by a lack of transparency around version control.
Yet transparency around the evidence that the government uses to convict people accused of crimes is not merely a scientific best practice—it is a deep-seated, constitutionally required commitment to due process. Scientific and legal values for transparency in forensic algorithms are distinct for good reasons. To start, their stakes differ. If a scientific theory is incorrect, subsequent studies can disprove it and get scientific inquiry back on track. But if the government uses incorrect evidence to try, convict, and incarcerate a criminal defendant, that person can never regain their lost time, freedom, dignity, or in some cases their life.
Appropriately, then, the standards that science and law apply to forensic algorithms also differ. The report focuses on how the scientific community assesses the accuracy of different forensic algorithms—for instance, describing a comparative performance test of fingerprint algorithms conducted by the National Institute of Standards and Technology. But the report never mentions how the legal community assesses prosecution evidence through transparency and contestability. The federal Daubert standard for admissibility requires judges to evaluate not merely whether a scientific technique has achieved “general acceptance” in a scientific community, but also the level of independent academic scrutiny to which the technique has been subjected, as well as whether it has a known or knowable error rate. Once evidence is admitted, criminal defendants have rights to confront and cross-examine that evidence. And defendants also have constitutional and statutory rights to discover information about that evidence that could be helpful to cross-examination. Together, these rights create the wholly distinct legal transparency mandate that is central to the truth-seeking process of the courts.
What policymakers can do
Fortunately, some policymakers have moved to stop this abuse of trade secret law. Drawing on the report’s findings, Rep. Mark Takano (D-Calif.) has introduced H.R. 2438, the Justice in Forensic Algorithms Act of 2021. Alongside multiple other beneficial reforms, including the development of standards and a testing program for forensic algorithms, this bill would prohibit the use of a trade secret evidentiary privilege to withhold relevant evidence from defense attorneys in criminal cases. If the bill passes, developers would still be able to obtain appropriate protective orders from the courts to safeguard their intellectual property interests, but they would no longer be able to rely on trade secret arguments to entirely shield relevant evidence from cross-examination by the defense.
The GAO report shines an urgent spotlight on technical issues concerning law enforcement’s use of forensic algorithms. But forensic algorithms are a legal as well as a technical problem. Policymakers should act now to ensure that forensic algorithm vendors do not abuse intellectual property laws to undermine due process values.
The author served as an expert consultant on the GAO report and on the Justice in Forensic Algorithms Act of 2021.
For a general discussion of this issue, see Rebecca Wexler, Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 Stan. L. Rev. 1343 (2018).