Friday, December 5, 2008

Study Of Faulty Fingerprints Debunks Forensic Science 'Zero Error' Claim

Our textbook says that a fingerprint is an example of an individual characteristic, not a class characteristic; however, this is not necessarily the case. A study of faulty fingerprints debunks the forensic science “zero error” claim. Recently, an innocent American was accused of involvement in the Madrid train bombing. The man was arrested on the basis of fingerprint evidence, but later released because of an error.

Now criminologist Simon Cole is investigating the errors that can occur in fingerprint analysis. According to Cole, as many as a thousand incorrect fingerprint “matches” could be made each year in the United States, despite safeguards intended to prevent errors. Cole believes that the 22 exposed fingerprint errors are just the “tip of the iceberg.” These errors were discovered only by coincidence, such as a post-conviction DNA test, the intervention of foreign police, and even a deadly lab accident that led to the re-evaluation of evidence.

Wrongful convictions on the basis of faulty evidence are supposed to be prevented by four safeguards: having print identifications “verified” by additional examiners; ensuring the examiners are competent; requiring a high number of matching points in the ridges before declaring a match; and having independent experts examine the prints on behalf of the defendant. Nevertheless, these safeguards have clearly failed in some cases.

According to Cole, “Rather than blindly insisting there is zero error in fingerprint matching, we should acknowledge the obvious, study the errors openly and find constructive ways to prevent faulty evidence from being used to convict innocent people.” Through his studies, Cole has found the aggregate error rate of fingerprint analysis to be 0.8 percent, which translates to roughly 1,900 mistaken fingerprint matches in a single year.
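As a quick sanity check on those figures, the 0.8 percent error rate and the 1,900 annual mistakes together imply how many fingerprint identifications must have been made that year; the sketch below back-calculates that implied base. (The figure of roughly 237,500 identifications is derived here, not stated in the article.)

```python
# Back-of-the-envelope check of the article's figures.
error_rate = 0.008        # 0.8% aggregate error rate reported by Cole
mistaken_matches = 1900   # mistaken matches in one year, per the article

# Implied number of fingerprint identifications made that year:
implied_identifications = mistaken_matches / error_rate
print(int(implied_identifications))  # 237500
```

At that volume, even a rate that sounds tiny produces errors on a scale that matters for criminal justice, which is the article's central point.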

3 comments:

Anonymous said...

Simon Cole's work is important in alerting us to the fallibility of the forensic sciences but it should not lead us to dismiss their importance as a major crime prevention and detection tool. The difficulty with fingerprints is the uncertainty over the error rate.

Some experts will claim a 0% rate provided the correct procedures are followed, but this is a deceptive and dangerous conclusion, given that every identification involves a degree of human subjectivity and in many cases relies on technological accuracy. If you add to this the psychological, emotional and cultural influences that play on experts as they go about their work, your confidence in a 0% error rate quickly evaporates.

I believe that more work is required to create internationally recognised standards and the checks and balances necessary to ensure that forensic analysis is objective and accurate.

For further information see: www.shirleymckie.com

Anonymous said...

This article was very interesting because we have recently learned that fingerprints are considered to be an individual characteristic; however, Cole's work suggests otherwise. There was a good amount of detail in the review, with examples that support the validity of Cole's work. I knew that fingerprints could be similar, but I did not know that they could be mistaken for one another.

Matt said...

This is a very well written article. It was an interesting topic to talk about. This shows that forensic scientists do make mistakes. I'm curious as to whether or not they would have realized their mistake if the lab accident hadn't forced them to go back and re-evaluate. Maybe there should be more ways to test fingerprints to make sure this doesn't happen again.