Pittsburgh Post-Gazette

Complex DNA still developing

By Paula Reed Ward


In a 2013 study, 108 crime labs examined a four-person DNA mixture obtained from a ski mask in a mock bank robbery.

Seventy-three got the answer wrong, improperly including in their results a suspect whose DNA was not there.

In 2010, researchers sent DNA data from a real gang rape case in Georgia to 17 expert analysts who worked in the same lab. Only one of those analysts returned the same results as the original crime lab, whose analysis had resulted in a man’s conviction. Four others said the results were inconclusive, and 12 said they would exclude the suspect in question.

A 2005 study got similar results.

Experts across the world agree that, while DNA analysis of a single-source sample is considered the gold standard of forensic science — both reliable and replicable — the same is far from true for DNA mixtures involving more than one person, especially when the sample is small or degraded.

“The reality is, it’s done a thousand different ways across the country,” said Greg Hampikian, a co-author on the 2010 study. “It’s a huge, terrible, awful, disgusting problem. It has undoubtedly convicted innocent people.

“I think we’re going to see a lot of overturned convictions because of mixtures.”

Crime labs have, for years, been able to reliably identify an individual from a DNA sample containing just one person’s genetic material. But mixed samples are much more difficult. Most labs use a statistical approach called Combined Probability of Inclusion. A newer technology, probabilistic genotyping, uses sophisticated computer software to analyze mixed samples.
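In form, the CPI calculation is simple: at each tested locus, the population frequencies of every allele detected in the mixture are summed and squared, and those per-locus figures are multiplied together. The sketch below walks through that arithmetic with invented allele frequencies; it is only an illustration, not the procedure of any particular lab, which would draw on population databases and lab-specific interpretation thresholds.

```python
# Illustrative sketch of the Combined Probability of Inclusion (CPI) statistic.
# The loci and allele frequencies below are invented for demonstration only.

# Alleles observed in the mixture at each locus, with assumed population frequencies.
mixture = {
    "D8S1179": {"12": 0.15, "13": 0.30, "14": 0.20},
    "D21S11":  {"28": 0.16, "30": 0.25},
    "TH01":    {"6": 0.23, "7": 0.19, "9.3": 0.30},
}

cpi = 1.0
for locus, allele_freqs in mixture.items():
    # Chance that both alleles of a random person fall among those seen at this locus.
    prob_inclusion_here = sum(allele_freqs.values()) ** 2
    cpi *= prob_inclusion_here

print(f"Combined probability of inclusion: {cpi:.4f}")
print(f"Combined probability of exclusion: {1 - cpi:.4f}")
```

The smaller the resulting number, the less likely it is that a random person would be included in the mixture purely by chance.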

The poor performance of the labs on mixed samples in the studies was shocking, Mr. Hampikian said, but showed the weakness of the traditional CPI type of analysis, which is done differently at different labs and depends on individual analysts. The newer computer modeling technology has great promise, experts agree, but also has not reached the level of consistency and replicability that exists for single-sample DNA tests.

There are about eight different probabilistic genotyping programs on the market now, with two, Oakland-based TrueAllele and STRmix out of New Zealand, appearing to be the most popular.

TrueAllele is actively being used in only five labs in the United States, with four more expected to be online before the end of the year, while STRmix is actively being used by nine crime labs across the country, including the FBI. There are approximately 250 crime labs in the U.S.

A strength of the new technology is that the programs are able to identify the genotype in the sample before even looking at a suspect’s profile, said Mike Coble, a research biologist at the National Institute of Standards and Technology who led the 2013 study. That way, he said, the possibility of human bias is removed from the interpretation process.

In the traditional CPI model, once the data has been interpreted by the analyst, a comparison to the suspect’s DNA profile is made.
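By contrast, the probabilistic genotyping programs typically report their result as a likelihood ratio, a number weighing how well the evidence fits two competing explanations: that the suspect contributed to the mixture, or that an unknown person did. The toy sketch below shows only the final form of that comparison, with made-up probabilities; the real software models peak heights, degradation and allele drop-out across an entire profile to arrive at such figures.

```python
# Toy illustration of the likelihood-ratio form reported by probabilistic
# genotyping software. The probabilities here are invented for demonstration.

# P(observed mixture data | the suspect is one of the contributors)
p_data_if_suspect_contributed = 8e-4

# P(observed mixture data | an unknown, unrelated person contributed instead)
p_data_if_unknown_contributed = 2e-7

likelihood_ratio = p_data_if_suspect_contributed / p_data_if_unknown_contributed
print(f"Likelihood ratio: {likelihood_ratio:,.0f}")
# Values far above 1 are reported as support for the contributor hypothesis;
# values below 1 favor the alternative.
```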

But the probabilistic genotyping programs are also not standardized, and the same mixture can produce a completely different answer depending on which program it is run through.

Mr. Hampikian likened it to a bingo card.

“There are a lot of combinations that can be winners,” he said.

David Balding, a professor of statistical genetics at the University of Melbourne, recounted a presentation at a 2014 conference in Phoenix where a study showed that TrueAllele and STRmix were both used to examine a lab-created sample and returned “significantly different results.”

Although Mr. Balding doesn’t believe that’s cause for major concern, he does think it stresses the need for more testing and maturity in the field.

“There is a misconception that DNA is the holy grail, and it shouldn’t be looked at that way, especially in complex mixtures,” said Amy Jeanguenat, the CEO of Mindgen, a forensic consulting firm.

Mr. Coble agreed. “DNA has such an aura about it, a jury can be quickly convinced,” he said. “That’s where education for the attorneys and judges comes in — and for the public to understand the issues and not just have that rubber stamp.”

Itiel Dror, a cognitive neuroscientist at University College London who conducted the 2010 study with Mr. Hampikian, said the lack of replicability in mixture interpretation is troubling.

“If you give them the same evidence twice, they often reach different conclusions,” he said. “It raises huge concerns.”

He acknowledged that technology is always evolving, but said, “If you send someone to jail, or they execute someone, and [then] the science develops 10 years later, that’s a big problem.”

Mr. Dror noted, as have others, the many types of forensic science that have been scrutinized in recent years, such as fingerprint analysis, and discredited, such as bite mark analysis, hair analysis and some types of arson investigation.

University of Pittsburgh law professor David Harris said those things should never have been characterized as “science.”

“Making a hypothesis, testing it, analyzing the results and replicating it. That’s the way science is done.”

Those earlier investigative tools, he continued, developed out of police field work.

“There’s nothing inherently wrong in that, but it’s not a scientifically derived method,” Mr. Harris said.

As for probabilistic genotyping, Mr. Coble said the programs can perform better than the old method but still should be viewed with caution.

In the 2013 study, he noted, “there were labs that simply said, ‘It’s too complex,’ and they just walked away. I think that’s, maybe, the best answer.”

