Saturday, May 31, 2014

Quarreling and Quibbling over Psychometrics in Hall v. Florida (part 1)

In Hall v. Florida, 134 S.Ct. 1986 (2014), the Supreme Court struck down Florida’s practice of treating an IQ test score above 70 as conclusive proof that a defendant lacks the “intellectual disability” that precludes capital punishment. Justice Kennedy’s opinion (joined by Justices Breyer, Ginsburg, Sotomayor, and Kagan) summarizes the case pithily:
This Court has held that the Eighth and Fourteenth Amendments to the Constitution forbid the execution of persons with intellectual disability. Atkins v. Virginia, 536 U.S. 304, 321 (2002). Florida law defines intellectual disability to require an IQ test score of 70 or less. If, from test scores, a prisoner is deemed to have an IQ above 70, all further exploration of intellectual disability is foreclosed. This rigid rule, the Court now holds, creates an unacceptable risk that persons with intellectual disability will be executed, and thus is unconstitutional.
Id. at 1990. Led by Justice Alito, Chief Justice Roberts and Justices Scalia and Thomas dissented. The Court, they maintained, was overruling Atkins and adopting “a uniform national rule that is both conceptually unsound and likely to result in confusion.” Id. at 2002 (Alito, J., dissenting). Among other things, the dissent warns that the Court “misunderstands” the statistical concepts of standard error and confidence intervals, id. at 2009, and that it therefore “makes factual mistakes that will surely confuse States attempting to comply with its opinion.” Id. at 2010.

The dissent has a point. Parts of the majority opinion are elliptical and potentially confusing. Nonetheless, some of the harsh critique is overdrawn. Moreover, Justice Alito's own presentation of psychometric concepts is hardly error-free, inviting a rejoinder of "tu quoque."

Therefore, in a series of postings yet to come, I will, in Justice Alito's words, "wade[] into technical matters that must be understood in order to see where the Court goes wrong." But I'll do the same for the dissenting opinion's presentation of these matters.
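Before wading in, it may help to see the basic arithmetic at issue. A single IQ test score is an estimate of a "true" score, and the standard error of measurement (SEM) quantifies the uncertainty in that estimate. The sketch below, with an assumed SEM of 2.5 points chosen purely for illustration (no particular test's SEM is implied), shows how an observed score of 71 translates into an approximate 95% confidence interval:

```python
# Illustrative only: the SEM of 2.5 and the z-value of 1.96 (for a 95%
# two-sided interval under a normal error model) are assumptions for
# this sketch, not figures drawn from any specific IQ test manual.

def iq_confidence_interval(observed_score, sem=2.5, z=1.96):
    """Approximate 95% confidence interval for the true score."""
    margin = z * sem
    return (observed_score - margin, observed_score + margin)

low, high = iq_confidence_interval(71)
print(f"Observed score 71 -> roughly ({low:.1f}, {high:.1f})")
# -> roughly (66.1, 75.9)
```

On these assumed numbers, an observed 71 is consistent with a true score below the 70-point cutoff, which is the crux of the majority's objection to Florida's rigid rule.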


Saturday, May 10, 2014

Latent Fingerprints and the Uniqueness Hypothesis

It has been argued many times that assertions of the uniqueness of all fingerprints are insufficient to warrant claims that a latent print must match the fingerprints of one and only one individual in the world. However, some fingerprint analysts and criminalists, remaining true to the faith of an earlier generation (e.g., Swofford 2012; Vanderkolk 2012), continue to rely on uniqueness as the conceptual and practical foundation for their work.

In this regard, it may be worth noting a paper by two computer scientists that estimates the probability that the latent print that the FBI misidentified in the Madrid train bombing case back in 2004 would have matched at least one individual in the AFIS database. According to the abstract of Su and Srihari (2010a):
While tremendous efforts have been made in 10-print individuality studies, latent fingerprint rarity continues to be a difficult problem and has never been solved because of the small finger area and poor impression quality. The proposed method is able to predict the core points of latent prints using Gaussian processes and align the latent prints by overlapping the core points. A novel generative model is also proposed to take into account the dependency on nearby minutiae and the confidence of minutiae in the probability of random correspondence calculation. The new methods are illustrated by experiments on the well-known Madrid bombing case. The results show that the probability that at least one fingerprint in the FBI IAFIS databases (over 470 million fingerprints) matches the bomb site latent is 0.93 which is large enough to lead to misidentification.
To be clear, Su and Srihari do not suggest that the full fingerprints of anyone in the AFIS database match those of the Madrid bomber. Their point is that the features of the latent print that led to the misidentification of Mayfield are not likely to be unique. But surely, when it comes to assessing the relevance of the uniqueness hypothesis and the probative value of latent fingerprint identification, this is what matters.
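The intuition behind the 0.93 figure can be illustrated with a simple back-of-the-envelope calculation. If each comparison of the latent print against a database entry is treated as an independent trial with some tiny random-match probability p, the chance of at least one match in N comparisons is 1 − (1 − p)^N. The per-comparison probability below is solved from the reported 0.93 aggregate figure purely for illustration; it is not a number taken from Su and Srihari's generative model, which is considerably more sophisticated.

```python
import math

N = 470_000_000  # approximate number of prints in IAFIS, per the abstract

def prob_at_least_one_match(p, n=N):
    """P(at least one random match in n independent comparisons)."""
    return 1.0 - (1.0 - p) ** n

# Solve for the p that would produce the paper's 0.93 database-wide figure
# (illustrative reverse-engineering, not the authors' actual estimate).
p = -math.log(1 - 0.93) / N
print(f"implied per-comparison probability ~ {p:.2e}")
print(f"P(at least one match in the database) ~ {prob_at_least_one_match(p):.2f}")
```

The point of the exercise: a random-match probability on the order of one in hundreds of millions per comparison, which sounds vanishingly small, still makes a hit in a database of 470 million prints nearly certain.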

The paper overlaps another one (Su and Srihari 2010b) by the same authors delivered at another conference. I have not searched for responses to their work.


Chang Su & Sargur N. Srihari, Latent Fingerprint Rarity Analysis in Madrid Bombing Case, in Computational Forensics: 4th International Workshop, IWCF 2010, Tokyo, Japan, November 11-12, 2010a, Revised Selected Papers (Sako, Hiroshi; Franke, Katrin; Saitoh, Shuji eds. 2011), Lecture Notes in Computer Science, 6540:173-184

Chang Su & Sargur N. Srihari, Evaluation of Rarity of Fingerprints in Forensics, in Proceedings of Neural Information Processing Systems, Vancouver, Canada, Dec. 6-9, 2010b

Henry J. Swofford, Individualization Using Friction Skin Impressions: Scientifically Reliable, Legally Valid, J Forensic Identification 62:65-79, 2012

John R. Vanderkolk, Examination Process, in The Fingerprint Sourcebook 9-3 to 9-26, 2012