👓 The Coming Wave of Murders Solved by Genealogy | The Atlantic

Read The Coming Wave of Murders Solved by Genealogy (The Atlantic)
The same DNA analysis used to find the alleged Golden State Killer has led to the arrest of a second alleged murderer. It’ll likely lead to more.

I can see this going to the Supreme Court sooner rather than later on privacy grounds. I can’t help but recall the words of Jed Bartlet in season one of The West Wing, when he said that privacy would be one of the most pressing issues before the Supreme Court in the coming century.


🎧 ‘The Daily’: The Hunt for the Golden State Killer | New York Times

Listened ‘The Daily’: The Hunt for the Golden State Killer by Michael Barbaro from nytimes.com

Paul Holes was on the verge of retirement, having never completed his decades-long mission to catch the Golden State Killer. Then he had an idea: Upload DNA evidence to a genealogy website.

On today’s episode:

• Paul Holes, an investigator in California who helped to crack the case.

Background reading:

• A spate of murders and rapes across California in the 1970s and 1980s went unsolved for decades. Then, last week, law enforcement officials arrested Joseph James DeAngelo, 72, a former police officer.

• Investigators submitted DNA collected at a crime scene to the genealogy website GEDmatch, through which they were able to track down distant relatives of the suspect. The method has raised concerns about privacy and ethics.
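The matching step described above can be sketched in miniature. This is a toy illustration with hypothetical data, not GEDmatch’s actual algorithm: real services compare shared segment lengths measured in centimorgans, while here we simply count agreeing markers across a small panel to flag profiles likely to be relatives of an uploaded sample.

```python
def shared_fraction(profile_a, profile_b):
    """Fraction of marker positions where two DNA profiles agree."""
    matches = sum(1 for a, b in zip(profile_a, profile_b) if a == b)
    return matches / len(profile_a)

def likely_relatives(query, database, threshold=0.75):
    """Return names of database profiles sharing at least `threshold`
    of marker positions with the query profile."""
    return [name for name, profile in database.items()
            if shared_fraction(query, profile) >= threshold]

# Hypothetical marker panels (single letters standing in for SNP alleles).
database = {
    "distant_cousin": "AGTCCGTAGGTCCATA",
    "unrelated":      "TTACGGAACCGTTGAC",
}
crime_scene = "AGTCCGTAGGTCAATA"  # one mismatch vs. the cousin's panel

print(likely_relatives(crime_scene, database))  # → ['distant_cousin']
```

The point of the sketch is that the suspect himself need not be in the database: a partial match to a relative is enough to narrow the family tree, which is exactly what raises the privacy concerns the article describes.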

A stunning story with some ingenious detective work. I worry about the potential privacy problems lurking in the future, though one idea raised here is that the technique can actually help protect the privacy of individuals who are wrongly and maliciously accused, and thus save a lot of time and money.

The subtleties will arise when we begin using this type of DNA evidence more frequently for lower-level crimes, even as the technology becomes increasingly cheap to carry out.


👓 DNA blunder creates phantom serial killer | The Independent

Read DNA blunder creates phantom serial killer (The Independent)
She was a mysterious serial killer known as “The Woman Without a Face,” and detectives across Europe spent more than 15 years doing their utmost to bring her to justice for at least six brutal murders and a string of break-ins. Yesterday, however, they were forced to admit that she probably didn’t exist.

Interesting to be reading this article after just having read two articles about the DNA-related discovery of the Golden State Killer.


👓 The first step in finding Golden State Killer suspect: Finding his great-great-great-grandparents on genealogy site | LA Times

Read The first step in finding Golden State Killer suspect: Finding his great-great-great-grandparents on genealogy site (latimes.com)
The clue that led investigators this week to the door of the suspected Golden State Killer came from an unexpected source: GEDmatch.com — an amateur genealogy website that’s something like the Wikipedia of DNA.

👓 How a Genealogy Website Led to the Alleged Golden State Killer | The Atlantic

Read How a Genealogy Website Led to the Alleged Golden State Killer (The Atlantic)
Powerful tools are now available to anyone who wants to look for a DNA match, which has troubling privacy implications.

I find the mechanics of privacy in this case extremely similar to Facebook’s leak of data via Cambridge Analytica. Something crucial to your personal identity can be accidentally leaked, or made discoverable to others, by the actions of your closest family members.


🔖 [1801.06022] Reconstruction Codes for DNA Sequences with Uniform Tandem-Duplication Errors | arXiv

Bookmarked Reconstruction Codes for DNA Sequences with Uniform Tandem-Duplication Errors by Yonatan Yehezkeally and Moshe Schwartz (arxiv.org)
DNA as a data storage medium has several advantages, including far greater data density compared to electronic media. We propose that schemes for data storage in the DNA of living organisms may benefit from studying the reconstruction problem, which is applicable whenever multiple reads of noisy data are available. This strategy is uniquely suited to the medium, which inherently replicates stored data in multiple distinct ways, caused by mutations. We consider noise introduced solely by uniform tandem-duplication, and utilize the relation to constant-weight integer codes in the Manhattan metric. By bounding the intersection of the cross-polytope with hyperplanes, we prove the existence of reconstruction codes with greater capacity than known error-correcting codes, which we can determine analytically for any set of parameters.
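The reconstruction problem the abstract refers to, recovering a stored sequence when multiple noisy reads of it are available, can be illustrated with a much simpler error model than the paper’s. This sketch assumes plain substitution errors and equal-length reads (the paper instead treats tandem-duplication errors, which change the read length, so column-wise voting would not apply there):

```python
from collections import Counter

def reconstruct(reads):
    """Recover a sequence from multiple equal-length noisy reads by
    taking the most common symbol at each position (majority vote)."""
    return "".join(Counter(column).most_common(1)[0][0]
                   for column in zip(*reads))

# Three noisy reads of the stored strand "ACGTACGT", each with one
# substitution error at a different position.
reads = ["ACCTACGT", "ACGTACAT", "TCGTACGT"]

print(reconstruct(reads))  # → ACGTACGT
```

Even this toy version shows why the reconstruction setting can beat a single-read error-correcting code: errors that would be unrecoverable from any one read cancel out across independent copies, which is the redundancy living DNA provides for free.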