[T]he PCAST report acknowledged its own dubious value to courts, stating, “Judges’ decisions about the admissibility of scientific evidence rest solely on legal standards; they are exclusively the province of the courts and PCAST does not opine on them.” -- Chief Judge Marlin J. Appelwick, Washington Court of Appeals, Mar. 11, 2019
PCAST has taken it upon itself to usurp the Constitutional role of the Courts and decades of legal precedent and insert itself as the final arbiter of the reliability and admissibility of the information generated through these forensic science disciplines. -- National District Attorneys Association, Sept. 2, 2016
Both of the above statements are unacceptable. The NDAA's fulminations are discussed in a previous posting. As for yesterday's opinion of the Washington state Court of Appeals in State v. DeJesus, 436 P.3d 834 (Wash. Ct. App. 2019), the 2016 report of the President's Council of Advisors on Science and Technology has considerable value to open-minded courts. It is not an impeccably correct discourse on what it takes to validate expert testimony in criminal cases, but its discussion of the place of direct, empirical validation of traditional pattern-matching testimony is illuminating. Whether the inquiry into scientific validity is relatively direct (Daubert) or indirect (Frye), validity is vital in rulings on the admissibility of scientific evidence. The PCAST report describes a vision of the components of validation and applies it to several forensic-science pattern-matching fields. That the Council's intensive report does not purport to reach conclusions about the admissibility of evidence does not render it of "dubious value to courts."
In State v. DeJesus, a jury convicted Geraldo DeJesus of two murders and other crimes committed in a trailer park. Part of the evidence against him came from a former state crime laboratory analyst who "concluded that the cartridge casings from the scene had consistent markings with the casing found in the Smith and Wesson envelope, indicating that they were fired from the same gun." (The opinion does not state precisely how she presented this finding to the jury.) Relying on two National Academy of Sciences reports and declarations from defense experts, DeJesus had tried to exclude this testimony as lacking general acceptance in the scientific community.
DeJesus argued on appeal "that the trial court erred in failing to conduct a Frye hearing and admitting expert testimony on ballistic identification." The state tried to narrow the relevant community to toolmark examiners. The court of appeals did not state whether acceptance among toolmark examiners alone would be sufficient, but rather relied on opinions from other jurisdictions admitting firearms identifications (expressed in various ways) even after the NAS and PCAST reports.
The court was not fazed by the conclusion in the PCAST report that "the foundational validity of the field had not been established." The reason? The report added that (a) "[w]hether firearms analysis should be deemed admissible based on current evidence is a decision that belongs to the courts" and (b) "[v]alidity as applied would also require ... that an expert testifying on firearms analysis ... has undergone rigorous proficiency testing on a large number of test problems to measure his or her accuracy and discloses the results," together with the accuracy observed in properly designed experiments.
But the neologism of "validity as applied" does not bear on the threshold question of whether a putatively scientific test is generally accepted as valid. One can maintain that PCAST's notion of "foundational validity" sets the bar too high and that, with different (but still acceptable) criteria, the scientific community has found the procedure for associating expended cartridge cases with one another to be valid. But proficiency tests (that are not representative of casework) cannot transform a technique that many knowledgeable scientists do not regard as scientifically validated into one that is generally accepted as valid. The PCAST report could hardly have been clearer about this:
Neither experience, nor judgment, nor good professional practices (such as certification programs and accreditation programs, standardized protocols, proficiency testing, and codes of ethics) can substitute for actual evidence of foundational validity and reliability. (P. 6)

PCAST's final words on firearm-toolmark identification are in an addendum to the report issued on January 6, 2017. There, we are told (pp. 7-8) that:
From a scientific standpoint, scientific validity should require at least two properly designed studies to ensure reproducibility. The issue for judges is whether one properly designed study, together with ancillary evidence from imperfect studies, adequately satisfies the legal criteria for scientific validity. Whatever courts decide, it is essential that information about error rates is properly reported.