Senator Leahy is spearheading a legislative movement to improve forensic science as scientific evidence, and I am cheering him on (for an update on his efforts, see http://public.cq.com/docs/weeklyreport/weeklyreport-000004132586.html).
Uh oh – what if forensic science really is broken? Can you imagine standing trial while the experts testifying against you have no real idea how well their methods work, except that they believe their methods work really well? What a frightening dilemma, and yet it is happening in American courts.
The National Academy of Sciences published a now quite famous report on how many kinds of forensic science really are broken. When I came to the USDOJ’s National Institute of Justice as a visiting fellow in 1995 (yes, that many years ago), the first of the “impression evidence” forensic methods – handwriting identification purely by eye and memory – had just gone through its first real evidentiary attack: Starzecpyzel. That case name is not only a bear to spell, it was a bear for the field of handwriting identification. And I just happened to be the one closest to the mama bear whose cub of forensic science was getting tossed out of court. Yikes. I was tasked by NIJ to work with the handwriting community to develop a scientific research agenda and standards for the field – more about that in another post.
In the NAS report, both handwriting identification and fingerprint examination are shown to need serious empirical work, as well as training that teaches forensic technicians to perform their techniques in ways that minimize confirmation bias (Supervisor: “I see it, why don’t you?” Trainee: “Well, gee, now I see it too!”).
This kind of empirical work includes database development, standard operating protocols, and then validation testing. My chapter in Solan and Tiersma’s Oxford Handbook of Language and Law (2012) details how that work is done. It can be done, just not overnight or right before your next big case.
It takes time to collect data that you can use as “ground truth” – data for which you already know the relevant facts, like the true author for testing an authorship method, or the actual truth of a statement for testing a deception detection method.
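To make the idea of validation against ground truth concrete, here is a minimal sketch in Python. Everything in it is hypothetical: classify_author is a toy stand-in for whatever method is being validated, and the tiny ground_truth list stands in for a real database of documents whose authorship is already known.

```python
# Minimal sketch of validation testing against ground-truth data.
# classify_author is a toy stand-in for the method under test;
# ground_truth stands in for a database of known-author documents.

def classify_author(document: str) -> str:
    """Toy stand-in for the method being validated: guesses the author
    from average word length (purely illustrative, not a real method)."""
    words = document.split()
    avg_len = sum(len(w) for w in words) / max(len(words), 1)
    return "Author A" if avg_len < 5 else "Author B"

ground_truth = [
    ("short plain words here", "Author A"),
    ("considerably lengthier vocabulary throughout", "Author B"),
    ("more everyday text to test", "Author A"),
]

def validation_accuracy(samples) -> float:
    """Score the method only on samples whose true answer is already known."""
    correct = sum(1 for text, known_author in samples
                  if classify_author(text) == known_author)
    return correct / len(samples)

print(f"accuracy on ground-truth data: {validation_accuracy(ground_truth):.0%}")
```

The point is simply that you cannot compute that accuracy number at all until someone has spent the time collecting samples whose true answers are known.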
It takes time to set up standard operating protocols. Why would that be the case? Because in techniques that rely entirely on a technician’s visual ability, such as handwriting or fingerprint examination, the protocols are very difficult to verbalize.
It takes time to write software that automates these kinds of protocols, but both fingerprint and handwriting identification are now reaping the benefits of software built for automated identification (more about that in another post).
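What such software buys you is not magic; it is that an explicit, repeatable scoring rule replaces “I see it, why don’t you?” Below is a rough Python sketch of that general idea only. The feature names, values, and threshold are all invented for illustration and do not represent how any actual fingerprint or handwriting system works.

```python
# Rough sketch of the general idea behind automated comparison:
# measurable features extracted from two samples are compared with an
# explicit, repeatable scoring rule rather than by eye and memory.
# All feature names, values, and the threshold are invented for illustration.

import math

def similarity(features_a: dict, features_b: dict) -> float:
    """Cosine similarity between two feature vectors keyed by feature name."""
    keys = set(features_a) | set(features_b)
    dot = sum(features_a.get(k, 0.0) * features_b.get(k, 0.0) for k in keys)
    norm_a = math.sqrt(sum(v * v for v in features_a.values()))
    norm_b = math.sqrt(sum(v * v for v in features_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

questioned = {"slant": 0.82, "letter_height": 3.1, "pen_pressure": 0.4}
known      = {"slant": 0.79, "letter_height": 3.0, "pen_pressure": 0.5}

MATCH_THRESHOLD = 0.95  # invented for illustration; a real threshold
                        # would have to come from validation testing

score = similarity(questioned, known)
print("similarity:", round(score, 3))
print("candidate match?", score >= MATCH_THRESHOLD)
```

Notice that the threshold itself is exactly the kind of number that only validation testing on ground-truth data can justify, which is why the empirical work has to come first.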