This is one of my favorite pet peeves: a smart person comes out wearing a lab coat[1] and holding a clipboard. They make a claim, and one of two things happens.
First, news reporters with little understanding of the subject matter try to explain it to the masses, and the results are disastrous. (Not to mention academia's drive to publish papers that confirm the hypothesis.) Little needs to be said here.
The second is that I frequently see my peers latch onto any PDF they can download that supports their position. "Look, look! The data shows I'm right! And this man with a PhD supports my conclusion! (Of course I haven't read the report, but that's not important. What is important is my pet issue!)" Prosecutors using sloppy lab work to get convictions and computer algorithms deciding bail bonds are just end results.
We need to put the brakes on the empiricism train. It's useful, but only to a certain point. Reason eventually needs to take over. If your argument lives and dies by the next PDF or Machine Learning model, you need to find a new argument, or bolster your argument with reason.

1 - https://xkcd.com/699
(Meta argument: I'm reading a report about a paper someone published, and latching onto the report because it supports my conclusion that people will believe anything a "scientist" says! I'm victim to my own problems!)
> We need to put the brakes on the empiricism train. It's useful, but only to a certain point.
We need to foster a deeper understanding of how empiricism actually works, what its limitations are, and the difficult and often lengthy process through which varying degrees of certainty are obtained and adjusted over time.
I also think it is important to spread the value of grounded skepticism (including of one's own knowledge) and the power of freely admitting mistakes.
Obviously, none of this is easy. But it is worthwhile and I think small improvements will have a large impact through ripple effects over time.
We shouldn't throw the baby out with the bathwater, in regards to empiricism.
Few people who haven’t done master’s- or PhD-level research really understand that there are only degrees of “knowing” something.
Teaching the scientific method early on, and really getting to the heart of what it means to stay unsure about things, is something the education system could do a better job of, but it’s no small task.
Cherrypicking and confirmation bias aren't problems with empiricism, they're well-known flaws in people's practice of empiricism. In fact, the "book of empiricism" says not to do them. You just have to apply the same standards when collecting studies for meta-analysis as you do when collecting data for studies.
Your experience seems to involve hanging out with people who regularly cite academic articles. Is that accurate?
There are many people outside of that world who either downplay, or reflexively reject, any claim that comes from peer reviewed research, if they personally disagree with the outcome.
These folks won’t even consider academic peer reviewed articles as a legitimate source of information.
This viewpoint is disturbingly far reaching in my experience, in part due to poor reporting of science. I’ve had friends and family question if researchers of any discipline know much of anything because of frustration with dietary research reporting: “one month eggs are good for you, one month eggs are bad. Which one is it?!” They end up rejecting researchers and scientists of all types simply because of a paper here or there which got traction in mainstream media but was perhaps reported in a misleading fashion... and that does get to your point of how important quality journalism is.
I feel far more distressed by the attitude of “my ignorance is equal to their experience” than people pointing to a PDF of research and saying, “here’s the data,” which seems to be a piece of what you are criticizing. Am I understanding that correctly?
If people can read the study, they can question specific methodologies, rather than coming up with fun straw men such as, “well these researchers in this field need this to be the case, because otherwise they wouldn’t have a job.” I’ve heard that argument used on climate researchers and dietary researchers at a minimum, but those trends can extend to people’s thoughts where they start to think, “well those researchers are up to no good. I bet they have the cure for cancer and they just don’t want to release it.”
I’m all for ensuring people appeal to reason and not just take the hottest headline (be it in popular press or academic papers), but there is a problem with a large portion of the population that outright rejects many research papers altogether.
> Firstly, news reporters with little understanding of the subject matter try to explain it to the masses, and the results are disastrous.
Do you have a better alternative?
That's why people like Carl Sagan, Bill Nye, and Neil deGrasse Tyson exist. Science communicators. They are the better alternative.
Neil Tyson was calling for a branch of the government to declare what truth is [1]. He's part of the problem I'm describing above.
1 - https://twitter.com/neiltyson/status/1031556958153666561?lan...
That's a twisty comment, and unrelated to the core problem of communicating information that you described above.
People have opinions, and everyone needs to correct for their opinions. I prefer it when people are open about their opinions, because everyone omits or rewords statements to suit their purposes.
Yeah, I would not put him in the same category as Carl Sagan. They had very different takes on policies.
There's a lot of context that needs to be added to understand that tweet. I do not believe Tyson is calling for a Ministry of Truth.
Who exactly do you think communicates what they say to the masses? Reporters. Without them it would never reach a wide audience.
When a random reporter is told to check out a scientific paper and tries to summarize it, they take a complex topic and attempt to express it as best they're able.
Those scientific emissaries who are skilled both in their scientific expertise and in public speaking can express those complex topics more correctly and more approachably... If a reporter then copy-pastes their speech into a newspaper, or reads through the speech and pulls out the things they consider important, they're working from a better starting point: the important information has already been distilled and expressed by an expert.
Science communication is a really specific skill. There are some very good communicators out there, but it's not the same as journalism in general.
Since Sessions ended an inquiry into obvious junk science being used in courtrooms (which itself seemed to be moving pretty slowly given the scale of the problem), my prediction is that nobody will do anything whatsoever about this finding.
> Since Sessions ended an inquiry into obvious junk science being used in courtrooms
(I'm not from the US!) Is this the stuff like "arson detectives" working with completely unsubstantiated theories, often enough sending people to jail even if they were innocent?
I think it's things like hair analysis and bite mark matching, even fingerprint matching.
If you have enough money to pay for experts to shoot this nonsense down you are fine, but if you don't have the money you are in serious trouble. Some years ago I was part of a jury in a trial about assault with a deadly weapon. The prosecution had an expert witness who talked about lighting conditions during sunset. I do a lot of photography, so I know a little about this topic myself. I thought to myself "WTF is this guy talking about? This is plain wrong". But as a jury member you can speak up only during deliberation, and the defendant was a poor guy whose lawyer didn't challenge this nonsense.
Thank God the case was so flimsy that we voted "Not guilty" anyway but it's really easy to get convicted based on junk science. Unless you have the money to challenge that.
It's a travesty that juries don't have a voice in a trial. It benefits no one.
They do. It's called a verdict.
Having the jury ask questions can really only hurt the defendant. If a juror is so confused that they feel the need to ask a question, they should just count that point towards "reasonable doubt" and move on.
In the (not at all unusual) event of incompetent counsel, jurors asking clarifying questions might draw out important information.
That's more like a lay judge system.
Cross examination of an expert is pretty important.
Judges and juries are not supposed to allow fringe opinions as experts and even the best experts may hold non-consensus positions on some matters.
The fix for that system would be a "Mechanical Turk" of an expert and taking the majority vote or, if there's no consensus, dropping the matter.
The important part here would be complete blinding of each expert.
Wait, that sounds like an expert jury. :)
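The proposed voting scheme is simple enough to sketch. This is a toy, with the consensus threshold and tie-handling chosen by me as assumptions, not anything from the comment above:

```python
from collections import Counter

# Sketch of the proposed blinded-panel rule: each expert reports a finding
# independently (never seeing the others' answers), and the court accepts a
# finding only if a clear majority agrees. The 2/3 threshold is an assumption.
def panel_verdict(findings, threshold=2 / 3):
    answer, votes = Counter(findings).most_common(1)[0]
    if votes / len(findings) >= threshold:
        return answer
    return None  # no consensus: drop the matter

print(panel_verdict(["arson", "arson", "arson", "accident"]))   # arson
print(panel_verdict(["arson", "accident", "arson", "accident"]))  # None
```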
Arson analysis is definitely in there too. He's right about that. There was a PBS documentary about how the investigators would be talking nonsense about "feeling the fire," and they convicted an innocent man of killing his entire family.
This was the documentary I was thinking of: https://www.pbs.org/wgbh/pages/frontline/death-by-fire/
Actually, it's turning out that arson detectives are on the level of hair and bite mark analysts -- completely and utterly made up.
Every defense lawyer will be waving this news article over their head...
Does it matter? If it's accepted science, then courts are loath to just start throwing it out, even when there is very good evidence it's junk.
If you check out the article, the authors of the study inserted a disclaimer that will make this very difficult.
OK, but then what?
Link to open access publication: https://www.fsigenetics.com/article/S1872-4973(18)30248-5/fu...
Link to more detailed article: https://www.forensicmag.com/news/2018/08/nist-publishes-land...
I can't understand why the OP article didn't at least cite the paper.
I guess the quote works as a kind of content-based addressing, though.
>> The good news is that there are methods to reanalyze old DNA mixture data using computer programs that can help analysts correct errors, without any new lab testing. In fact, one lesson from the study is that while only seven of the 108 labs in the study properly excluded the innocent profile, one of them used such a program (TrueAllele by Cybergenetics).
>> (...)
>> In fact, we have shown that this is possible. Working with Cybergenetics analysts and Innocence Network organizations in four states, our Boise State University laboratory has re-examined a few select cases and already persuaded courts to overturn a conviction in New Mexico, two in Indiana and two in Montana. We have also helped identify a new suspect in a 23-year-old murder.
This leaves a bad taste. Have I just read an advert for a proprietary piece of software, masquerading as an article?
It's shocking to me that something so critical to the criminal justice system is not more tightly regulated nor monitored continuously for sensitivity/specificity.
Given courts have ruled drug dogs don’t need to be tested for accuracy or to make sure they’re not influenced by their handlers this isn’t very surprising.
The more I learn about the system, the more it seems like a scam.
Speaking of which Serial season 3 just started which covers this kind of stuff, more in the small than a grand ‘Thing X is the problem we want solved that will make things much better’.
I don't quite follow. Why would drug dogs need to be tested for accuracy? Is the indication of a dog accepted somewhere as evidence?
I have been assuming that dogs give false positives every now and then, and there's a search for some actual evidence based on that indication, which everyone knows can be a false positive.
Dogs are routinely used to bypass the 4th amendment against unlawful search and seizure. They are a "probable cause" machine, since all that needs to happen is that the dog gets excited. Without the dog, it could be argued that those pieces of evidence were obtained without probable cause for a search, in which case they would be discarded.
So you're a cop and you want to search someone, but don't have the legal basis to do so? Go get the dog and get him excited.
When the dog's accuracy can't be tested, then a search based on the dog giving an indication of "There are drugs here" is no different than a search based on a magic 8-ball giving the same indication.
Every time this technique leads to a search that is used to secure a conviction, it erodes your rights, as a non-criminal, against arbitrary search and seizure.
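There's also a base-rate problem hiding in here. A toy calculation (all numbers invented for illustration, not real dog statistics) shows that even a fairly accurate dog produces mostly false alerts when few of the people searched actually carry drugs:

```python
# Toy base-rate calculation with made-up numbers, not real dog statistics.
def alert_ppv(sensitivity, false_positive_rate, prevalence):
    """Probability that drugs are actually present, given the dog alerted."""
    true_alerts = sensitivity * prevalence
    false_alerts = false_positive_rate * (1 - prevalence)
    return true_alerts / (true_alerts + false_alerts)

# A dog that catches 90% of drug carriers but also alerts on 10% of clean
# cars, in a population where 1 car in 50 carries drugs:
ppv = alert_ppv(sensitivity=0.9, false_positive_rate=0.1, prevalence=0.02)
print(f"P(drugs | alert) = {ppv:.2f}")  # ~0.16: most alerts are false
```

Which is exactly why the accuracy numbers would need to be measured and disclosed.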
There's a difference in the way people see this here in Europe, because here a "probable cause" is not needed for searching for drugs. Particularly at the border controls, everyone is subject to searches.
(The same applies to, e.g., DUI testing on the roads; the police have the right to breath-test any driver, without anyone needing to look like they're driving badly.)
Any arbitrary seizure, if only based on dog indication, is of course a horrible miscarriage of justice.
Borders are civil liberty free zones, by design. As are warzones, places that have declared martial law, and prisons.
DUI testing is different. In many jurisdictions you do not have to submit to it, if you are fine with losing your driving privileges - which is not a criminal penalty.
I'm not super pleased about it, but there is typically a clear line - some voluntary behavior voids your liberties. Involuntary behavior generally should not.
A drug dog indicating is considered probable cause for a search.
It is awful, but keep in mind, this is a system where "eyewitness testimony" was the gold standard.
There are many, many examples of DNA testing being unreliable because of HOW it is done (for example: https://www.theatlantic.com/magazine/archive/2016/06/a-reaso... ) and yet the general public still sees it as infallible.
It's extremely important for these kinds of studies to get more publicity, for the courts to start to understand the limitations, and most importantly for the labs that do the tests to start to be held accountable. The standards from a scientific standpoint are truly deplorable.
I know this doesn't help the folks in prison without any resources, but during a trial, can't a defendant ask for any DNA test to be rerun by a private lab at their expense?
Even if that is an option, and the results come back favorable to the defendant, they still don't have a great situation. Now there will be two experts coming in and saying different things to the jury. The details that might explain those differences are likely beyond what can be adequately explained to a jury during a trial.
But the defendant themselves paid one of the experts, so...
Decades ago, I served on a jury where we had an expert witness for the defence and an expert witness for the prosecution. We, the jury, decided to ignore both expert witnesses, as they each had a different opinion - please note the word "opinion".
I do not know how evidence provided by "expert" witnesses is treated under US jurisdictions, but at the time we were told by the presiding judge to treat it as opinion by an "expert" and not as direct evidence.
The article gets the timeline of Jackson's case completely wrong:
https://www.innocenceproject.org/cases/dwayne-jackson/
This more detailed account states the exoneration happened years after the plea-bargain release; discovering the eff-up did not cause him to be released.
So the real impact on this poor man's life was significantly higher. Imagine trying to get anything worthwhile to happen in life with a criminal conviction...
If we can't trust witness testimonies, lie detectors, and now DNA results, what are we supposed to use?
First, please don't mix in lie detectors here. They are plain quack science, and an "interrogation technique" (read: intimidation method) at best.
Second, we have always known witness testimonies to be unreliable. That's why perjury is a crime! We wouldn't need it to be a crime if nobody was giving false testimonies. And that's assuming people don't make mistakes, which they do.
We still use testimonies.
A court case, after all, is all about probabilities. Very often we just don't know the truth 100%. It's just that compounding evidence, ideally, convinces us that it is very unlikely that a particular scenario didn't happen.
E.g.: someone could have mistaken another person for you when they said they saw you hitting the old pawnbroker lady with an axe, BUT given that you were caught with an axe AND blood was dripping from it AND that blood matched the old lady's AND you wrote an article about the merits of butchering old ladies in the local newspaper - ALL those things could have been mere coincidences and mistakes, HOWEVER they establish that it's highly unlikely that you didn't commit the crime.
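That compounding intuition can be made precise with likelihood ratios. A minimal sketch, assuming (unrealistically) that the pieces of evidence are independent, with all numbers invented for illustration:

```python
# Sketch of compounding evidence via Bayes' rule, assuming independence
# between the items. Every number below is invented for illustration.
def posterior_odds(prior_odds, likelihood_ratios):
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr  # each LR = P(evidence | guilty) / P(evidence | innocent)
    return odds

# Eyewitness (could be mistaken), the axe, the blood match, the article:
odds = posterior_odds(prior_odds=1 / 1000,
                      likelihood_ratios=[10, 5, 50, 4])
prob = odds / (1 + odds)
print(f"posterior odds {odds:.0f}:1, P(guilty) ~ {prob:.2f}")  # 10:1, ~0.91
```

Each item alone is weak (the eyewitness only shifts the odds tenfold against a 1000:1 prior), but the product is what does the convicting.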
Same with DNA evidence. We just need to re-adjust our expectations of how foolproof it is for it to work effectively in a court system. As long as we know that the labs are not 100% reliable, we'll be fine.
And that's what the authors are trying to do here.
The next step would be punishments for quack/pseudo science being used as "expert" testimony, as well as actual experts being reckless in their work (as is the case with the DNA labs in the article) when that work is used to accuse someone of a crime.
The punishment doesn't need to be jail-time; even banning the person/lab from providing evidence again would be a good first step. But that's my view on how to solve this problem, and a subject of another discussion.
It should be made clear that most DNA evidence is quite reliable, provided basic standard procedures are employed. I.e. as long as you don't switch samples or otherwise mess up, the results are very robust.
The particular issue here is the mixed samples. The basic approach of DNA fingerprinting is to look at a suite of variable markers which, taken together, form a unique set of alleles that can positively identify an individual. It's easy to see how a mixed sample can pose a problem: if you have two contributors and 20 loci analyzed, that's over a million permutations to consider, assuming all loci are different for those samples. Making an affirmative match in this situation is much more difficult! Not only must your statistics be much stronger (to offset the permutation complexity), but you must make far more assumptions about the sample than before. Namely, you must estimate the number of contributors to - and the relative proportions of - a sample via various heuristics to arrive at a statistical metric of match reliability.
It's kinda amazing that it can be done at all, but for smaller numbers of contributors, it's not that bad. The classic example is a sample that contains DNA from both the perpetrator and the victim; such 2-contributor situations seem to be well understood at this point.
The main issue for reliability is when there are a large number of contributors, and particularly when very sensitive assays are used. Increasing sensitivity means you have to rely more upon amplification, and the exponential nature of PCR makes it very easy for minor contributions to be out-competed.
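A crude model of that out-competition effect (the per-cycle efficiencies are invented numbers, not measured PCR parameters): if the minor contributor's template amplifies even slightly less efficiently per cycle, its share of the final product collapses.

```python
# Crude PCR model with made-up efficiencies: each template's copy number
# multiplies by (1 + e) per cycle, where e is the per-cycle efficiency.
def final_fraction(minor_copies, major_copies, e_minor, e_major, cycles):
    minor = minor_copies * (1 + e_minor) ** cycles
    major = major_copies * (1 + e_major) ** cycles
    return minor / (minor + major)

# A minor contributor starting at 10% of the template, with a slightly
# lower per-cycle efficiency, after 30 cycles of amplification:
f = final_fraction(minor_copies=10, major_copies=90,
                   e_minor=0.85, e_major=0.95, cycles=30)
print(f"minor share after 30 cycles: {f:.4f}")  # ~0.02, down from 0.10
```

A 10-point efficiency gap, compounded exponentially over 30 cycles, shrinks the minor contributor from a tenth of the template to a couple of percent of the product.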
Basically, not all DNA evidence is equal, and it's important to distinguish between more and less reliable methods. Unfortunately, the vagaries of legal precedent often have an outsized influence on what is accepted in court. The apparent reluctance by NIST to report its findings is really what is most concerning here, not the tests per se.
Even if the actual labwork is flawless, there are still the issues of DNA contamination where your DNA is spread around on objects and persons you touch. And this problem just gets worse the more sensitive the detection methods are.
The problem is when you use those in isolation from other data points. You are supposed to doubt each individual piece of evidence, but put enough of them together to overcome the doubt.
Probably now, or very soon, AI tech will be able to convincingly merge you into a video as well. The justice system is going to get very interesting, to say the least.
"It is uncomfortable to read the study’s authors praising labs for their careful work when they get things right, but offering sophomoric excuses for them when they get things wrong."
But not surprising... they want to continue to work in the field, after all!
Some articles are so important that they don't belong behind paywalls. I believe this is one of them. The NYT's ombudsman should help develop a policy to determine when that should occur.
Every age has required a 'noble lie' in regards to catching criminals - that actually this time, we really can do it. Take this result, take the shakiness of forensic fire patterns, the shakiness of fingerprint matching, the shakiness of bite marks and confessions and more and it becomes apparent that our time isn't so different from those in the past.
This is one of my favorite pet peeves: a smart person comes out wearing a lab coat[1] and holding a clipboard. They make a claim, and one of two things happen.
Firstly, news reporters with little understanding of the subject matter try to explain it to the masses, and the results are disastrous. (Not to mention, the drive for academia to publish papers that prove the hypothesis.) Little needs to be said here.
The second is that I frequently see my peers latch onto any PDF they can download that supports their position. "Look, look! The data shows I'm right! And this man with a PhD supports my conclusion! (Of course I haven't read the report, but that's not important. What is important is my pet issue!)" Prosecutors using sloppy lab work to get convictions and computer algorithms deciding bail bonds are just end results.
We need to put the brakes on the empiricism train. It's useful, but only to a certain point. Reason eventually needs to take over. If your argument lives and dies by the next PDF or Machine Learning model, you need to find a new argument, or bolster your argument with reason.
1 - https://xkcd.com/699
(Meta argument: I'm reading a report about a paper someone published, and latching onto the report because it supports my conclusion that people will believe anything a "scientist" says! I'm victim to my own problems!)
> We need to put the brakes on the empiricism train. It's useful, but only to a certain point.
We need to foster a deeper understanding of how empiricism actually works, what its limitations are, and the difficult and often lengthy process through which varying degrees of certainty are obtained and adjusted over time.
I also think it is important to spread the value of grounded skepticism (including of one's own knowledge) and the power of freely admitting mistakes.
Obviously, none of this is easy. But it is worthwhile and I think small improvements will have a large impact through ripple effects over time.
We shouldn't throw the baby out with the bathwater, in regards to empiricism.
Few people who haven’t done a variety of masters or PhD level research really understand how there are only degrees of “knowing” something.
Teaching the scientific method early on, and really getting to the heart of what it means to stay unsure about things, is something the education system could do a better job of, but it’s no small task.
Cherrypicking and confirmation bias aren't problems with empiricism, they're well-known flaws in people's practice of empiricism. In fact, the "book of empiricism" says not to do them. You just have to apply the same standards when collecting studies for meta-analysis as you do when collecting data for studies.
Your experience seems to involve hanging out with people who regularly cite academic articles. Is that accurate?
There are many people outside of that world who either downplay, or reflexively reject, any claim that comes from peer reviewed research, if they personally disagree with the outcome.
These folks won’t even consider academic peer reviewed articles as a legitimate source of information.
This viewpoint is disturbingly far reaching in my experience, in part due to poor reporting of science. I’ve had friends and family question if researchers of any discipline know much of anything because of frustration with dietary research reporting: “one month eggs are good for you, one month eggs are bad. Which one is it?!” They end up rejecting researchers and scientists of all types simply because of a paper here or there which got traction in mainstream media but was perhaps reported in a misleading fashion... and that does get to your point of how important quality journalism is.
I feel far more distressed by the attitude of “my ignorance is equal to their experience” than people pointing to a PDF of research and saying, “here’s the data,” which seems to be a piece of what you are criticizing. Am I understanding that correctly?
If people can read the study, they can question specific methodologies, rather than coming up with fun straw men such as, “well these researchers in this field need this to be the case, because otherwise they wouldn’t have a job.” I’ve heard that argument used on climate researchers and dietary researchers at a minimum, but those trends can extend to people’s thoughts where they start to think, “well those researchers are up to no good. I bet they have the cure for cancer and they just don’t want to release it.”
I’m all for ensuring people appeal to reason and not just take the hottest headline (be it in popular press or academic papers), but there is a problem with a large portion of the population that outright rejects many research papers altogether.
> Firstly, news reporters with little understanding of the subject matter try to explain it to the masses, and the results are disastrous.
Do you have a better alternative?
That's why people like Carl Sagan, Bill Nye, and Neil deGrasse Tyson exist. Science communicators. They are the better alternative.
Neil Tyson was calling for a branch of the government to declare what truth is [1]. He's part of the problem I'm describing above.
1 - https://twitter.com/neiltyson/status/1031556958153666561?lan...
That's a twisty comment and unrelated to the core information communication you described above.
People have opinions and everyone needs to correct for their opinions. I prefer it when people are open about their opinionated because everyone omits or rewords statements to suit their purpose.
Yeah, I would not put him in the same category as Carl Sagan. They had very different takes on policies.
There's a lot of context that needs to be added to understand that tweet. I do not believe Tyson is calling for a Ministry of Truth.
Who exactly do you think communicates what they say to the masses? Reporters. Without them it would never reach a wide audience.
When a random reporter is told to check out a scientific paper and tries to summarize it they take a complex topic and attempt to express it as best as they're able.
Those scientific emissaries who are skilled at their scientific expertise and public speaking are able to express those complex topics more correctly and more approachably... If a report then copy-paste's their speech and publishes it in a newspaper or reads through the speech and tries to pull out things they consider important, they're working from a better starting point, the important information has already been distilled and expressed by an expert.
Science communication is a really specific skill. There are some very good ones out there but it's not the same as journalism in general.
Since Sessions ended an inquiry into obvious junk science being used in courtrooms (which itself seemed to be moving pretty slowly given the scale of the problem), my prediction is that nobody will do anything whatsoever about this finding.
> Since Sessions ended an inquiry into obvious junk science being used in courtrooms
(I'm not from the US!) Is this the stuff like "arson detectives" working with completely unsubstantiated theories, often enough sending people to jail even if they were innocent?
I think its things like hair analysis, bite mark matching even fingerprint matching.
If you have enough money to pay for experts to shoot this nonsense down you are fine but if you don't have the money you are in serious trouble. Some years ago I was part of a jury in a trial about assault with a deadly weapon . The prosecution had an expert witness who talked about lighting conditions during sunset. I do a lot of photography so I know about little about this topic myself. I thought to myself "WTF is this guy talking about? This is plain wrong". But as jury member you can speak up only during deliberation and the defendant was a poor guy whose lawyer didn't challenge this nonsense.
Thank God the case was so flimsy that we voted "Not guilty" anyway but it's really easy to get convicted based on junk science. Unless you have the money to challenge that.
It's a travesty that juries don't have a voice in a trial. It benefits no-one.
They do. It's called a verdict.
having the jury ask questions can really only hurt the defendant. if a juror is so confused that they feel the need to ask a question, they should just count that point towards "reasonable doubt" and move on.
In the (not at all unusual) event of incompetent counsel, jurors asking clarifying questions might draw out important information.
That's more like a lay judge system.
Cross examination of an expert is pretty important. Judges and juries are not supposed to allow fringe opinions as experts and even the best experts may hold non-consensus positions on some matters.
The fix for that system would be a "Mechanical Turk" of an expert and taking the majority vote or, if there's no consensus, dropping the matter. The important part here would be complete blinding of each expert.
Wait, that sounds like an expert jury. :)
Arson analysis is definitely in there too. He's right about that. There was a PBS documentary and they were talking about how the investigators would be taking nonsense about "feeling the fire" and they convicted an innocent man of killing his entire family.
This was the documentary I was thinking of: https://www.pbs.org/wgbh/pages/frontline/death-by-fire/
Actually it's turning out that arson detectives are on the level of hair and bite mark analysts -- completely and utterly made up.
Every defending lawyer will be waving this news article over their head ...
Does it matter? If it’s accepted science then courts are loathe to just start throwing it out even when there is very good evidence it’s junk.
If you check out the article, the authors of the study inserted a disclaimer that will make this very difficult.
OK, but then what?
Link to open access publication: https://www.fsigenetics.com/article/S1872-4973(18)30248-5/fu...
Link to more detailed article: https://www.forensicmag.com/news/2018/08/nist-publishes-land...
I can't understunderstand why the OP article didn't at least cite the paper.
I guess the quote works as a kind of content-based addressing, though.
>> The good news is that there are methods to reanalyze old DNA mixture data using computer programs that can help analysts correct errors, without any new lab testing. In fact, one lesson from the study is that while only seven of the 108 labs in the study properly excluded the innocent profile, one of them used such a program (TrueAllele by Cybergenetics).
>> (...)
>> In fact, we have shown that this is possible. Working with Cybergenetics analysts and Innocence Network organizations in four states, our Boise State University laboratory has re-examined a few select cases and already persuaded courts to overturn a conviction in New Mexico, two in Indiana and two in Montana. We have also helped identify a new suspect in a 23-year-old murder.
This leaves a bad taste. Have I just read an advert for a proprietary piece of software, masquerading as an article?
It's shocking to me that something so critical to the criminal justice system is not more tightly regulated nor monitored continuously for sensitivity/specificity.
Given courts have ruled drug dogs don’t need to be tested for accuracy or to make sure they’re not influenced by their handlers this isn’t very surprising.
Seems like the more I learn about the system the more of a scam it seems.
Speaking of which Serial season 3 just started which covers this kind of stuff, more in the small than a grand ‘Thing X is the problem we want solved that will make things much better’.
I don't quite follow. Why would drug dogs need to be tested for accuracy? Is the indication of a dog accepted somewhere as evidence?
I have been assuming that dogs give false positives every now and then, and there's a search for some actual evidence based on that indication, which everyone knows can be a false positive.
Dogs are routinely used to bypass the 4th amendment against unlawful search and seizure. They are a "probable cause" machine, since all that needs to happen is that the dog gets excited. Without the dog, it could be argued that those pieces of evidence were obtained without probable cause for a search, in which case they would be discarded.
So you're a cop and you want to search someone, but don't have the legal basis to do so? Go get the dog and get him excited.
When the dog's accuracy can't be tested, then a search based on the dog giving an indication of "There are drugs here" is no different then a search based on a magic 8-ball giving an indication of "There are drugs here".
Every time this technique leads to a search that is used to secure a conviction, it erodes your rights, as a non-criminal, against arbitrary search and seizure.
There's a difference in the way people see this here in Europe, because here a "probable cause" is not needed for searching for drugs. Particularly at the border controls, everyone is subject to searches.
(Same applies for e.g. DUI testing on the roads; the police has the right to take alcohol test of any driver, without anyone looking like they're driving badly.)
Any arbitrary seizure, if only based on dog indication, is of course a horrible miscarriage of justice.
Borders are civil liberty free zones, by design. As are warzones, places that have declared martial law, and prisons.
DUI testing is different. In many jurisdictions you do not have to submit to it, if you are fine with losing your driving privileges - which is not a criminal penalty.
I'm not super pleased about it, but there is typically a clear line - some voluntary behavior voids your liberties. Involuntary behavior generally should not.
A drug dog indicating is considered probable cause for a search.
It is awful, but keep in mind, this is a system where "eyewitness testimony" was the gold standard.
There are many, many examples of DNA testing being unreliable because of HOW it is done (for example: https://www.theatlantic.com/magazine/archive/2016/06/a-reaso... ) and yet the general public still sees it as infallible.
It's extremely important for these kinds of studies to get more publicity, for the courts to start to understand the limitations, and most importantly for the labs that do the tests to start to be held accountable. The standards from a scientific standpoint are truly deplorable.
I know this doesn't help the folks in prison without any resources, but during a trial, can't a defendant ask for any DNA test to be rerun by a private lab at their expense?
Even if that is an option, and the results come back favorable to the defendant, they still don't have a great situation. Now there will be two experts coming in and saying different things to the jury. The details that might explain those differences are likely beyond what can be adequately explained to a jury during a trial.
But the defendant themselves paid one of the experts, so...
Decades ago, I served on a jury where we had an expert witness for defence and an expert witness for prosecution. We, the jury, made a decision to ignore both expert witnesses as they each had a different opinion, please note the word "opinion".
I do not know how evidence provided by "expert" witnesses is treated under US jurisdictions, but at the time we were told by the presiding judge to treat it as the opinion of an "expert" and not as direct evidence.
The article gets the timeline of Jackson's case completely wrong:
https://www.innocenceproject.org/cases/dwayne-jackson/
This more detailed account states the exoneration happened years after the plea-bargain release; the discovery of the mix-up is not what got him released.
So the real impact on this poor man's life was significantly higher. Imagine trying to get anything worthwhile to happen in life with a criminal conviction...
If we can't trust witness testimonies, lie-detectors, and now DNA results, what are we supposed to use?
First, please don't mix in lie detectors here. They are plain quack science, and an "interrogation technique" (read: intimidation method) at best.
Second, we have always known witness testimonies to be unreliable. That's why perjury is a crime! We wouldn't need it to be a crime if nobody was giving false testimonies. And that's assuming people don't make mistakes, which they do.
We still use testimonies.
A court case, after all, is all about probabilities. Very often we just don't know the truth with 100% certainty. Ideally, compounding evidence convinces us that it is very unlikely that a particular scenario didn't happen.
E.g.: someone could have mistaken another person for you when they said they saw you hitting the old pawnbroker lady with an axe, BUT given that you were caught with an axe AND blood was dripping from it AND that blood matched the old lady's AND you wrote an article about the merits of butchering old ladies in the local newspaper - ALL those things could have been mere coincidences and mistakes, HOWEVER they establish that it's highly unlikely that you didn't commit the crime.
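The "compounding evidence" point above can be put in rough numbers. The probabilities below are entirely made up for illustration, and the independence assumption is a big simplification, but the arithmetic shows why several individually-doubtable pieces of evidence can together make innocence very unlikely:

```python
# Back-of-envelope sketch: hypothetical chances that each piece of
# evidence is an innocent coincidence or mistake.
pieces = {
    "eyewitness misidentified you":  0.20,
    "axe found on you by chance":    0.05,
    "blood match is a lab error":    0.01,
    "newspaper article unrelated":   0.10,
}

# Probability that ALL of them are simultaneously coincidences,
# under the simplifying assumption that they are independent.
p_all_innocent = 1.0
for p in pieces.values():
    p_all_innocent *= p

print(f"P(every piece is a coincidence) = {p_all_innocent:.6f}")  # 0.000010
```

In other words, even if each item alone leaves 1-in-5 or 1-in-10 room for doubt, all of them being mistakes at once would be a 1-in-100,000 event under these toy numbers.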
Same with DNA evidence. We just need to re-adjust our expectations of how foolproof it is for it to work effectively in a court system. As long as we know that the labs are not 100% reliable, we'll be fine.
And that's what the authors are trying to do here.
The next step would be punishments for quack/pseudo science being used as "expert" testimony, as well as actual experts being reckless in their work (as is the case with the DNA labs in the article) when that work is used to accuse someone of a crime.
The punishment doesn't need to be jail-time; even banning the person/lab from providing evidence again would be a good first step. But that's my view on how to solve this problem, and a subject of another discussion.
It should be made clear that most DNA evidence is quite reliable, provided basic standard procedures are employed. I.e. as long as you don't switch samples or otherwise mess up, the results are very robust.
The particular issue here is the mixed samples. The basic approach of DNA fingerprinting is to look at a suite of variable markers which, taken together, form a unique set of alleles that can positively identify an individual. It's easy to see how a mixed sample can pose a problem: if you have two contributors and 20 loci analyzed, that's over a million permutations to consider, assuming all loci are different for those samples. Making an affirmative match in this situation is much more difficult! Not only must your statistics be much stronger (to offset the permutation complexity), but you must make far more assumptions about the sample than before. Namely, you must estimate the number of contributors to - and the relative proportions of - a sample via various heuristics to arrive at a statistical metric of match reliability.
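The "over a million permutations" figure above can be sanity-checked with a one-liner. Assumption for this sketch: with two contributors whose genotypes differ at every locus, the analyst faces roughly two possible allele attributions per locus, and the choices multiply across loci:

```python
# Combinatorial blow-up in mixture interpretation (simplified model):
# ~2 possible attributions per locus, compounded over 20 loci.
loci = 20
attributions_per_locus = 2

combinations = attributions_per_locus ** loci
print(combinations)  # 1048576 -> "over a million"
```

Real mixture-deconvolution software weighs these possibilities statistically rather than enumerating them, but the exponential growth is why adding contributors makes interpretation so much harder.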
It's kinda amazing that it can be done at all, but for smaller numbers of contributors, it's not that bad. The classic example is a sample that contains DNA from both the perpetrator and the victim; such 2-contributor situations seem to be well understood at this point.
The main issue for reliability is when there are a large number of contributors, and particularly when very sensitive assays are used. Increasing sensitivity means you have to rely more upon amplification, and the exponential nature of PCR makes it very easy for minor contributions to be out-competed.
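The out-competition effect can be illustrated with a toy PCR model (all numbers here are assumed for illustration, not measured efficiencies): each cycle multiplies each template by (1 + efficiency), so even a small per-cycle efficiency gap compounds exponentially over a typical ~30-cycle run.

```python
# Toy PCR model: copies grow by (1 + efficiency) each cycle.
def amplify(copies: float, efficiency: float, cycles: int) -> float:
    for _ in range(cycles):
        copies *= (1 + efficiency)
    return copies

# Hypothetical inputs: major contributor amplifies slightly more
# efficiently than the minor contributor.
major = amplify(1000, 0.95, 30)
minor = amplify(10, 0.90, 30)

# The starting 100x ratio more than doubles after 30 cycles.
print(f"ratio before: {1000 / 10:.0f}x, after: {major / minor:.0f}x")
```

This deterministic sketch actually understates the problem: at very low template counts, stochastic dropout in early cycles can erase a minor contributor entirely.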
Basically, not all DNA evidence is equal, and it's important to distinguish between more and less reliable methods. Unfortunately, the vagaries of legal precedent often have an outsize influence on what is accepted in court. The apparent reluctance by the NIST to report their findings is really what is most concerning here, not the tests per se.
Even if the actual labwork is flawless, there are still the issues of DNA contamination where your DNA is spread around on objects and persons you touch. And this problem just gets worse the more sensitive the detection methods are.
The problem isn't in theory, the problem is in practice.
The problem is when you use those in isolation from other data points. You are supposed to doubt each individual piece of evidence, but put enough of them together to overcome the doubt.
Probably now, or very soon, AI tech will be able to convincingly merge you into a video as well. The justice system is going to get very interesting in the future, to say the least.
Pervasive surveillance, clearly. We didn't want to, but we had to, nothing else was reliable...
Skilled detective work?
"It is uncomfortable to read the study’s authors praising labs for their careful work when they get things right, but offering sophomoric excuses for them when they get things wrong."
But not surprising.... they want to continue to work in the field, after all !
Isn't this clickbait?
"The Dangers of Testing DNA Mixtures" should be the title.
And justice for all. Technology can be horrifically negative when in the wrong hands.
How IBM got its start...
Some articles are so important that they don't belong behind paywalls. I believe this is one of them. The NYT's ombudsman should help develop a policy to determine when that should occur.
Isn't that why we have scihub?
Every age has required a 'noble lie' in regards to catching criminals - that actually this time, we really can do it. Take this result, take the shakiness of forensic fire patterns, the shakiness of fingerprint matching, the shakiness of bite marks and confessions and more and it becomes apparent that our time isn't so different from those in the past.