VIRTUAL FIELD TRIPS: Since kids can't go out on field trips these days, @NatGeoMuseum is putting on a virtual field trip series. Their first guest speaker is Carter Clinton, a @HowardU researcher who uses soil to study the DNA of enslaved people. @nbcwashington
— Aimee Cho (@AimeeCho4) October 21, 2020
The same DNA analysis used to find the alleged Golden State Killer has led to the arrest of a second alleged murderer. It’ll likely lead to more.
Paul Holes was on the verge of retirement, having never completed his decades-long mission to catch the Golden State Killer. Then he had an idea: Upload DNA evidence to a genealogy website.
On today’s episode:
• Paul Holes, an investigator in California who helped to crack the case.
• A spate of murders and rapes across California in the 1970s and 1980s went unsolved for decades. Then, last week, law enforcement officials arrested Joseph James DeAngelo, 72, a former police officer.
• Investigators submitted DNA collected at a crime scene to the genealogy website GEDmatch, through which they were able to track down distant relatives of the suspect. The method has raised concerns about privacy and ethics.
The subtleties will emerge as this type of DNA evidence is used more frequently for lower-level crimes, while at the same time the technology becomes increasingly cheap to carry out.
She was a mysterious serial killer known as "The Woman Without a Face," and detectives across Europe spent more than 15 years doing their utmost to bring her to justice for at least six brutal murders and a string of break-ins. Yesterday, however, they were forced to admit that she probably didn't exist.
The clue that led investigators this week to the door of the suspected Golden State Killer came from an unexpected source: GEDmatch.com — an amateur genealogy website that’s something like the Wikipedia of DNA.
Powerful tools are now available to anyone who wants to look for a DNA match, which has troubling privacy implications.
DNA as a data storage medium has several advantages, including far greater data density than electronic media. We propose that schemes for data storage in the DNA of living organisms may benefit from studying the reconstruction problem, which is applicable whenever multiple reads of noisy data are available. This strategy is uniquely suited to the medium, which inherently replicates stored data, with mutations producing multiple distinct noisy copies. We consider noise introduced solely by uniform tandem-duplication, and utilize the relation to constant-weight integer codes in the Manhattan metric. By bounding the intersection of the cross-polytope with hyperplanes, we prove the existence of reconstruction codes with capacity greater than that of known error-correcting codes, a capacity we can determine analytically for any set of parameters.
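To make the noise model concrete, here is a minimal sketch (not the paper's codes) of uniform tandem duplication: a substring of fixed length k is copied in place adjacent to itself, and each "read" of the stored sequence accumulates several such duplications independently. Function names and parameters are hypothetical illustrations, not from the source.

```python
import random


def tandem_duplicate(seq: str, k: int, rng: random.Random) -> str:
    """Apply one uniform tandem duplication of length k:
    a random length-k window of seq is repeated in place."""
    if len(seq) < k:
        return seq
    i = rng.randrange(len(seq) - k + 1)  # start of the duplicated window
    return seq[:i + k] + seq[i:i + k] + seq[i + k:]


def noisy_reads(seq: str, n_reads: int, t: int, k: int, seed: int = 0) -> list[str]:
    """Produce n_reads copies of seq, each hit by t independent tandem
    duplications -- mimicking how a living medium replicates stored data
    while mutations create multiple distinct noisy copies."""
    rng = random.Random(seed)
    reads = []
    for _ in range(n_reads):
        r = seq
        for _ in range(t):
            r = tandem_duplicate(r, k, rng)
        reads.append(r)
    return reads


# Each read grows by exactly t * k symbols, so length alone reveals
# how many duplications occurred -- one reason this noise channel is
# more structured than arbitrary insertions.
reads = noisy_reads("ACGTACGT", n_reads=3, t=2, k=3)
```

The reconstruction problem then asks: given several such reads of the same codeword, how large can a code be while still allowing the decoder to recover the original sequence?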