Big secrets come in small packages. Take cells, like the ones shedding off the inside of your cheeks and into your saliva right now. These days it’s not that hard to crack them open, shake out the DNA coiled inside, and read the genetic code they contain. Those strings of As, Cs, Ts, and Gs can tell you any number of things you might want to know—the location of your ancestral homelands, say, or which cancer drug is going to give you the best shot at beating your diagnosis. You could also discover things you wish you hadn’t: What if your dad is actually some stranger from a sperm bank? What if you have a disease-causing mutation you could pass on to your own kids? And if you leave your genetic code lying around a crime scene, cops can trace it back to you.
Even 25 years ago, many of these things were unknowable. Today, obtaining such information can cost less than a Netflix subscription. For that you can thank your fellow US taxpayers and the $3 billion they pumped into the Human Genome Project during the decade best known for dial-up and Drew Barrymore. That Big Biology project turned Homo sapiens from a black box into a big fat book—262,000 pages long when printed letter by letter.
Ever since then, scientists have been trying to figure out what all the words mean. Some sections have proved more amenable to interpretation than others. Which is why, like the complicated chemistry and befuddling bioinformatics that power them, genetic tests can be difficult to understand, and so can the privacy risks that come with them. Still, genetic testing—whether it’s for genealogy research, assessing disease risk, or solving crimes—is only going to get cheaper, more powerful, and more popular. There’s never been a better time to learn what you’re getting into.
The History of Genetic Testing
You can divide genetic testing into two eras: B.H.G.P. and A.H.G.P., the defining event between them being the announcement of the first draft of the human genome, in 2000. People have known for centuries that traits—be they the curve of a nose or a bleeding disorder—tend to run in families, passing from parents to children through some hereditary substance. But technologies capable of detecting and interpreting that substance, now known to be DNA, evolved much more recently.
By most accounts, the prehistoric period of genetic testing begins in the 1950s with the discovery that an extra copy of chromosome 21 causes Down syndrome. Scientists developed methods for staining chromosomes so they could be sorted and counted, a test called karyotyping. Combined with the ability to collect fetal cells from a pregnant woman’s amniotic fluid, these early advances led to the first prenatal genetic screens. Such tests provided diagnoses of genetic disorders caused by big biological screw-ups: too many chromosomes, too few, or chunks of them in the wrong places.
As these clinical tests became more common, scientists were also busy drilling deeper into the substance of DNA, whose chemical structure had been deciphered only in 1953 by James Watson, Francis Crick, and Rosalind Franklin. Over the next few decades, scientists would come to understand that its helix-shaped pattern of paired bases—adenine, thymine, cytosine, and guanine—functioned like letters, spelling out three-letter words that a cell decodes into amino acids, the building blocks of proteins. They would also begin to realize that most of the human genome—about 98 percent—doesn’t actually code for proteins. In the ’70s, the term “junk DNA” was popularized to describe these seemingly nonfunctional stretches.
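To make the letters-into-words idea concrete, here is a toy Python sketch of the decoding step. The codon table below is a tiny, hand-picked subset of the real 64-entry table, and in a living cell the decoding runs through messenger RNA and ribosomes rather than directly off DNA; this is just the lookup logic.

```python
# Toy decoder: read DNA three letters at a time and look up each "word."
# CODON_TABLE is a small illustrative subset of the standard genetic code.
CODON_TABLE = {
    "ATG": "Met",  # methionine, also the start signal
    "TGG": "Trp",  # tryptophan
    "GGA": "Gly",  # glycine
    "TAA": None,   # stop signal
}

def translate(dna: str) -> list[str]:
    """Decode a DNA string into a list of amino acid names."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i + 3])
        if amino_acid is None:  # stop codon (or a codon not in our toy table)
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTGGGGATAA"))  # ['Met', 'Trp', 'Gly']
```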
Not long after, in 1984, a British geneticist named Alec Jeffreys stumbled upon a use for all that so-called junk DNA: crime-fighting. In these regions of the genome, the DNA molecule tends to repeat itself, like it’s stuttering over the same word again and again. Scientists can capture and count these stutters, known as “short tandem repeats,” or STRs. And because the combination of repeat counts a person carries across several such locations is effectively unique, it can be used to build a personally identifiable profile, or DNA fingerprint.
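To get a feel for how stutter-counting becomes identification, here is a rough Python sketch. The locus names and repeat units are invented for illustration, and real forensic typing measures repeat counts with fragment-length analysis on specialized instruments, not by string matching; the principle, though, is the same: count repeats at agreed-upon spots and compare the resulting profile.

```python
# Sketch of STR profiling: for each repeat unit, find the longest
# consecutive run in a DNA string. The counts across several loci
# together form a crude "fingerprint."
def longest_run(dna: str, repeat: str) -> int:
    """Longest number of back-to-back copies of `repeat` found in `dna`."""
    best = 0
    for start in range(len(dna)):
        count = 0
        while dna[start + count * len(repeat):].startswith(repeat):
            count += 1
        best = max(best, count)
    return best

# Hypothetical loci and repeat units, made up for this example.
LOCI = {"locus_A": "AGAT", "locus_B": "TTTTC"}

def str_profile(dna: str) -> dict[str, int]:
    """Count the longest repeat run at each locus."""
    return {name: longest_run(dna, unit) for name, unit in LOCI.items()}

sample = "GGAGATAGATAGATAGATCCTTTTCTTTTCAA"
print(str_profile(sample))  # {'locus_A': 4, 'locus_B': 2}
```

Real forensic profiles work the same way at scale, recording repeat counts at a standardized set of core loci.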
In 1987, this technique cracked its first police investigation, identifying Colin Pitchfork as the man who had raped and murdered two young women in the UK. That same year, Tommie Lee Andrews, who had raped and stabbed a woman in Florida, became the first person in the US to be convicted on DNA evidence. Since then, forensic DNA testing has put millions of criminals behind bars. In 1994, Congress passed the DNA Identification Act, giving the US Federal Bureau of Investigation authority to maintain a national database of genetic profiles collected from criminal offenders. As of September 2019, this database, known as CODIS, contained DNA from nearly 14 million people convicted of crimes, as well as 3.7 million arrestees and 973,000 samples gathered at crime scenes.
Throughout the ’80s and ’90s, while cops were rushing to use DNA to catch rapists and murderers, geneticists were slowly doing detective work of their own. By linking health records, family pedigrees, disease registries, and STR locations and lengths, scientific sleuths painstakingly began to map traits onto chromosomes, eventually identifying the genes responsible for a number of inherited conditions, including Huntington’s disease, cystic fibrosis, and sickle-cell anemia. These single-gene, or monogenic, conditions are basically binary: if you have the mutation, you’re almost certain to develop the disease. And once the sequences of these faulty genes were revealed, it wasn’t too hard to test for their presence. All you had to do was design a probe: a single strand of DNA attached to a signal molecule that sends out a fluorescent burst or some other chemical flare when it finds its matching sequence.
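If it helps, here is a loose Python sketch of that matching logic. The sequences are hypothetical, and a real probe relies on hybridization chemistry, binding the strand complementary to itself, rather than a text search; the sketch just shows the pairing logic.

```python
# Sketch of probe logic: a probe physically binds wherever its
# reverse complement appears in the target DNA.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(seq: str) -> str:
    """The sequence a strand pairs with, read in the conventional direction."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def find_probe_site(target: str, probe: str) -> int:
    """Index where the probe would bind the target, or -1 if it wouldn't."""
    return target.find(reverse_complement(probe))

# Hypothetical stretch of a faulty gene, and a probe designed against it.
mutant_site = "CCTGAGGAG"
probe = reverse_complement(mutant_site)  # the probe is the site's complement

print(find_probe_site("ATTCCTGAGGAGAAG", probe))  # 3: the probe finds a match
```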