May 13, 2001

The Myth of Fingerprints

By SIMON COLE
Future historians of science and law may well date the beginning of the end of fingerprinting to the opening night of the third season of "The Sopranos." Coked to the gills, Christopher Moltisanti, Tony Soprano's nephew, brings Livia Soprano's wake to an absurd anticlimax as he muses on the claim that no two fingerprints are exactly alike. For scientists to know this, Christopher reasons, they would have to get everyone in the world together in one room to check. And not just everyone in the world, but everyone who ever lived. Since this would be impossible -- even using computers -- he concludes, "They got nothin'."

He's right, as it turns out. The claim that no fingerprint has ever appeared twice was first popularized more than a hundred years ago, and by dint of analogy (with other natural objects like snowflakes), lack of contradiction and relentless repetition, this bit of folk wisdom became deeply enshrined. By extension, it lent the technique of forensic fingerprint analysis an aura of infallibility. More than just a useful tool, it came to be regarded as a perfect system of identification, and examiners' testimony at criminal trials came to be practically unassailable. 

Until now, that is. In 1998, in Delaware County, Pa., Richard Jackson was sentenced to life in prison for murder based largely on a fingerprint match to which three experts had testified. The defense argued, unsuccessfully, that it was a bad match. But after Jackson spent more than two years in prison the prosecution conceded the error, and he was freed. In Scotland a murder case was upended when detectives found a fingerprint at the scene of the crime that belonged to a police officer -- who claimed she'd never been there in the first place. To verify her claim, she brought in two fingerprint analysts who attested that not only had her fingerprint been misidentified, but so had the print, found on a tin at the home of the accused, originally attributed to the victim. 

As these cases suggest, the relevant question isn't whether fingerprints could ever be exactly alike -- it's whether they are ever similar enough to fool a fingerprint examiner. And the answer, it's increasingly, unnervingly clear, is a resounding yes. A recent proficiency test found that as many as one out of five fingerprint examiners misidentified fingerprint samples. In the last three years, defendants in at least 11 criminal cases have filed motions arguing that fingerprinting does not meet even the basic requirements for scientific and technical evidence. The first such challenge -- filed on behalf of Byron Mitchell, who was being tried for robbery -- involved five full days of testimony on the credibility of the technique by leading fingerprint examiners and academic critics, including myself. There's no way to say how these cases, some of which are still on appeal, will be decided, but it is clear that puncturing the myth of fingerprinting's infallibility and scientific validity poses a grave threat to its century-long reign. 

But ultimately, the most dangerous threat to fingerprinting may be cultural, not legal. Much of the public's faith in fingerprinting has derived not from law but from culture: from the ubiquitous use of the fingerprint as a metaphor (think of chemical and electronic fingerprints) and as an icon (think of advertisements, mystery novels and the Court TV logo) of truth, science and, most of all, individual identity. Our fingerprints were unique, and, therefore, so were we. As it happens, a new metaphor has arisen just in time to fill the breach. These days we are increasingly apt to believe that our individuality is vouched for by the unique arrangement of genetic material in our cells. And DNA can now do nearly everything that fingerprinting does. Forensic scientists can recover identifiable DNA samples from ever-smaller traces of biological material, even the stray cells left by the smudge of a finger. Forensic DNA profiling, which has notably shed the early nickname of "DNA fingerprinting," is a perfect match for high-tech millennial sensibilities. Old-style fingerprinting, with its reliance on human observation and its correspondence to a romantic notion of our place in the universe, looks . . . well, just so last century.

If this is indeed the beginning of the end of fingerprinting, history will be repeating itself. A century ago, fingerprinting was the upstart rival of the world's dominant method of criminal identification: the Bertillon system, which used 11 bodily measurements, facial features, birthmarks, scars and tattoos to pinpoint individual identities. The transition to fingerprinting was treated as proof that the world was growing more rational, more discerning. But there may well come a time when our own genetically enhanced descendants find our belief in the power of fingerprinting as quaint as we find the Bertillon system. 

What are we to make of the end of fingerprinting? Not simply that we are growing steadily less gullible and more scientific. Rather, that the consensus that coalesces around scientific ideas is more easily built than we might like to think, that legal and public trust can be won over with a culturally resonant image. Over the course of history, even those propositions that seem most indisputable become fragile; our belief in them, fickle. In this increasingly scientific era, it's a fact worth remembering before we imbue the next foolproof system with the same aura of infallibility that we once ascribed to fingerprints. 

Simon Cole is the author of "Suspect Identities: A History of Fingerprinting and Criminal Identification" (Harvard University Press). 
