Behavioural Genetics: Past, Present, Future.
Written by Sasha Ivanov.
The Past
The end of the nineteenth century was a propitious time for social science. The geniuses of the era (Sir Francis Galton, Karl Pearson – who refused a knighthood due to his socialist convictions – and, a generation later, Sir Ronald Fisher) were diligently exploring human variation, creating the tools (p-values, regression, correlation) necessary to quantify and analyze it effectively.
Concurrently, a mature scientific psychology emerged from the cocoon of philosophical speculation. Wilhelm Wundt, dubbed the father of experimental psychology, founded his laboratory in 1879. Wundt and other early psychologists were awed and aided by the paradigm-altering work of Charles Darwin, whose “Origin of Species” was published in 1859 and was followed by the equally compelling “Descent of Man” in 1871. Man was no longer considered a divine artifact, shaped by an omnipotent deity, but an evolved animal like all other animals. Biology was the proper lens through which to understand humans.
Francis Galton applied Darwin’s theory to the social realm. Galton initiated the study of individual differences, that is, the study of physical and behavioral variation among humans. His disciples, Karl Pearson and R.A. Fisher, formalized Galton’s studies of inheritance into the hard scientific field of quantitative genetics, more or less inventing modern statistics along the way. The rigorous and scientific study of man had hit its stride.
Alas, it wasn’t going to last.
The excesses of the eugenics movement (forced sterilizations etc.) led to a backlash. Although Marx was fond of Darwin, the idea of human genetic variation troubled the Left. Communism was largely premised upon the idea that human nature was a product of social forces and that humans were relatively similar.
The Left inherited an intellectual tradition that views people as plastic creatures greatly influenced by the environment (i.e., by nurture in Galton’s terminology). This ideology culminated in Stalin’s Soviet Union, where mainstream genetics was replaced by the dubious theories of Trofim Lysenko, a Ukrainian agronomist who believed that crop yields could be maximized, not through selective breeding, but by environmental pressures that would render the plants stronger (in accordance with the Lamarckian inheritance of acquired traits). Ukrainian farmers were forced to implement these practices on their newly collectivized farms, leading to the disastrous famines of the 1930s. Mao Zedong recommended similar policies in China, leading to the Great Chinese Famine.
Russian geneticists who dissented from the orthodoxy (such as the eminent Nikolai Vavilov) were “canceled” by being thrown into a Siberian prison and left to die. But the situation in the West wasn’t great either. Various intellectual movements (Boasian anthropology, Freudian psychology) assailed biological thinking in social science. Post-WWII, quantitative genetics research on humans was stifled, and the field never received the recognition it deserved. Barbara Burks, a great woman of science who conducted the first adoption study of IQ, committed suicide in 1943 after a protracted depression that may have been triggered by the termination of her program of study.
By 1950, a new dominant paradigm had arisen in the social sciences, which scholars have called the standard social science model (SSSM). This model contended that humans were relatively interchangeable, flexible, rational units who strived to maximize their utility under preferences strongly shaped by culture. Instincts and predispositions were minimized.
While mainstream science was floundering, some obscure researchers at rather unprestigious institutions carried the flame of knowledge. Stemming from R.A. Fisher’s disciples at the University of Birmingham, the small field of behavioral genetics was born. Behavioral geneticists developed mathematical models to measure genetic influence on human behavior, but for decades few scholars paid attention.
The Present
While twin and adoption studies were being neglected, the field of molecular genetics was moving in parallel. Genetic variants linked to rare Mendelian disorders (such as Huntington’s chorea) had been identified by the 1980s, but reasonable knowledge of the biological basis of complex polygenic traits (characteristics influenced by a multitude of genetic and environmental factors, such as personality traits), was still far off.
By the turn of the millennium, things had changed. The entire human genome was finally decoded. Despite President Clinton’s reassuring platitudes during the announcement ceremony (“we are all 99% the same”), the Human Genome Project shed light on human genetic variation as never before. The early 2000s saw an avalanche of findings (“scientists found the gene for depression”, “gene for aggression finally discovered”), virtually all of which turned out to be false and premised on flawed assumptions. This should serve as a cautionary tale against blindly trusting “The Science” without an extensive and prolonged evaluation of its conclusions.
But eventually, the labors bore fruit. Statistical genetics is back with a bang in the era of genome-wide association studies (GWAS), starting around 2010. To date, thousands of genetic variants related to human behavior (ranging from schizophrenia and autism to food taste and fear of heights) have been identified. The weighted sum of the trait-related variants that an individual carries is that individual’s polygenic score for a given trait (e.g., intelligence or liability to schizophrenia). For the first time, it is possible to directly measure someone’s genetic propensity to be smart or dull, bold or cautious, introverted or extroverted, prone to psychotic symptoms or not.
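The arithmetic behind a polygenic score is simple to sketch. The snippet below is a toy illustration only: the variant names and effect sizes are invented, and real scoring pipelines aggregate hundreds of thousands of variants with various statistical corrections.

```python
# Toy polygenic score: the weighted sum of an individual's allele counts,
# with each variant's GWAS effect size (beta) as the weight.
# All variant IDs and effect sizes below are hypothetical.

gwas_effects = {       # effect size per extra copy of the trait-associated allele
    "rsA": 0.12,
    "rsB": -0.05,
    "rsC": 0.30,
}

genotype = {"rsA": 2, "rsB": 1, "rsC": 0}   # allele counts (0, 1, or 2)

def polygenic_score(effects, counts):
    """Weighted sum of allele counts over the variants included in the score."""
    return sum(beta * counts.get(rsid, 0) for rsid, beta in effects.items())

# 0.12*2 + (-0.05)*1 + 0.30*0 = 0.19
print(round(polygenic_score(gwas_effects, genotype), 2))
```

A score like this only ranks an individual relative to the reference population the GWAS was run in; the raw number has no meaning on its own.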
The Future
Psychiatry in flux
The genomics revolution will radically transform psychiatry. The days of blaming “refrigerator mothers” for their offspring’s illnesses are gone, but the nosological framework of psychiatry still uses the 19th-century classification of disorders based on symptomatology. For example, according to the Diagnostic and Statistical Manual of Mental Disorders (DSM – the Holy Scripture of clinical psychologists), bipolar disorder and depression are grouped together as “mood disorders” because they are both characterized by mood swings. GWAS data now reveals that bipolar disorder largely shares the same genetic basis as schizophrenia (rather unsurprisingly, given the high comorbidity between the two).
These developments will eventually force psychiatrists to toss the DSM in the dustbin and come up with a new classification system that is based on causes instead of symptoms. As currently defined, psychological disorders are simply constructs that have been built on clinicians’ observations of symptoms. The underlying biological realities of mental illnesses do not map neatly onto the DSM’s categories. Instead of thinking of bipolar disorder as a distinct, fixed category, future clinicians will likely perceive it as a manifestation of a continuous spectrum of genetic liability to psychosis – one that is shared with schizophrenia.
The implications of the new classification will be greater for more common disorders, such as ADHD or depression. An alliance of pharmaceutical companies, pundits, and psychiatrists has convinced millions, often on the basis of unsubstantiated theories, that these conditions are “diseases” that can and must be “cured” with the constant use of substances. But genomic evidence suggests that ADHD and depression are extremes of normal spectra of behavior (externalizing and internalizing, respectively). The use of substances to alleviate symptoms may be advised when these conditions become detrimental. But it might be wise to reconsider defining depression as a discrete category of clinical significance.
Eugenics reloaded?
Another breakthrough of GWAS is that the identified genes can explain within-family variance in behavior. Among siblings raised under the same roof, the one who carries the most IQ-related variants will tend to be brighter and, as a result, to attain greater educational and professional status than his siblings. Which sibling inherits which genes is determined by pure luck, in accordance with Mendel’s First Law of Segregation.
But in-vitro fertilization (IVF) allows parents to choose which genetic variants their offspring will inherit from them. This is done by harvesting multiple eggs, fertilizing them, and selecting for implantation the embryo with the highest polygenic score for IQ or the lowest polygenic score for schizophrenia. There are already multiple companies offering preimplantation genetic testing for polygenic traits (PGT-P), although none has dared to explicitly select for intelligence or against common psychopathology (yet).
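The selection step itself is trivially simple, which is part of why the debate is about ethics rather than feasibility. A hypothetical sketch, with invented embryo IDs and scores:

```python
# Hypothetical PGT-P-style selection: given polygenic scores for a batch of
# IVF embryos, pick the embryo maximizing the parents' chosen criterion.
# All IDs and scores are invented for illustration.

embryos = [
    {"id": "E1", "iq_pgs": 0.4, "scz_pgs": 1.1},
    {"id": "E2", "iq_pgs": 1.2, "scz_pgs": 0.9},
    {"id": "E3", "iq_pgs": 0.8, "scz_pgs": -0.3},
]

# Selecting on a single trait: highest IQ score...
best_iq = max(embryos, key=lambda e: e["iq_pgs"])

# ...or lowest schizophrenia liability.
lowest_scz = min(embryos, key=lambda e: e["scz_pgs"])

print(best_iq["id"], lowest_scz["id"])  # E2 E3
```

Note that the two criteria pick different embryos here, which is exactly the trade-off parents would face in practice.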
In the near future, genetic testing will be much more fine-grained and predictive. Currently, genetic testing services are based mostly on genotyping chips, which only record a small part of the genome. This technology will soon be replaced by whole genome sequencing (WGS), which captures all 3 billion genetic sites of an individual’s DNA. Dominic Cummings, chief advisor to Boris Johnson during the latter’s ill-fated government, had plans for every newborn in the UK to be sequenced, with expenses covered by Britain’s National Health Service. Although Cummings was ousted from Downing St., we can expect his vision to be implemented in the future. Genetic predispositions will be known and stored in records available to clinicians. But WGS can also be used for much more potent embryo selection, not to mention gene editing, which has already been performed in humans.
These developments have provoked antipathy from “ethicists” who are ringing alarm bells about the resurgence of eugenics. Of course, PGT-P is only the logical extension of the already-existing PGT. In the developed world, embryos are routinely screened for genetic conditions; because of this, Down syndrome has been virtually eliminated in certain Scandinavian countries. Still, the situation with complex traits is a bit more, well, complex. Some technical arguments have been put forward against PGT-P. The chief concern is pleiotropy: selecting for a certain trait can (and will) inadvertently affect other traits. For example, recent studies suggest that IQ is genetically correlated with autism. Therefore, by selecting for smarter children, parents may end up with children on the autism spectrum.
However, most of these problems are tractable. New statistical genetic methods enable scientists to discern the genetic structure of even highly correlated traits. The most salient problem with PGT-P and other kinds of genetic enhancement is that they run counter to the regnant egalitarian dogma. At a time when most mainstream institutions lean left, genetic enhancement is seen as “problematic.” This was exemplified in the case of the deaf lesbian couple who deliberately selected for congenitally deaf children. Some disability advocacy groups oppose any kind of screening, even for debilitating disorders such as cystic fibrosis, because they view these conditions as integral to their identity. According to these groups, if, e.g., cystic fibrosis is eradicated, then their community will be erased.
It is likely that PGT-P for physical health traits will become commonplace in the future. But the same cannot be said for enhancing intelligence or athleticism, as that runs completely against prevailing dogmas. Steve Hsu, the founder of the first PGT-P startup (who lost his academic job as a result), has often compared PGT-P to IVF, claiming that it will eventually be normalized and accepted by the public. This might be optimistic. IVF is easier to sell to the Left because it facilitates childbirth for infertile couples and older women. In contrast, PGT-P is about avoiding deleterious genes and striving for excellence – an ancient ideal of virtue that is far from today’s sensibilities.
Education and IQ
“In the competition between nature and nurture, nature proves the stronger of the two,” wrote Francis Galton, and his view is worth attending to since he did more than anybody to promote the study of the relative contributions of nature and nurture to human variation.
The genomics revolution will likely put the final nail in the coffin of the nature-nurture debate. GWAS results provide exciting new ways to measure heritability with molecular data. These methods do not require special samples of twins or adoptees but still arrive at heritability estimates similar to those of the old twin studies. As the evidence for genetic influence on behavior becomes irrefutable, old-school radical environmentalism will be dropped. Although there are still some stragglers out there espousing 1970s-style genetic denialism, future leftists will likely follow the path laid out by Paige Harden of an uneasy synthesis between leftism and genetic realism.
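The twin-study benchmark that the newer molecular estimates are compared against is, in its simplest form, Falconer’s classic formula: heritability is twice the difference between the monozygotic and dizygotic twin correlations. A minimal sketch, with illustrative (not real) correlations:

```python
# Falconer's twin-study estimator. Under the simplest ACE model, identical
# twins share all their genes and fraternal twins share half on average, so:
#   h^2 (heritability)        = 2 * (r_MZ - r_DZ)
#   c^2 (shared environment)  = 2 * r_DZ - r_MZ
# The correlations below are illustrative numbers, not real data.

def falconer_h2(r_mz, r_dz):
    """Heritability as twice the MZ-DZ correlation gap."""
    return 2 * (r_mz - r_dz)

r_mz, r_dz = 0.75, 0.45          # hypothetical twin correlations for a trait
h2 = falconer_h2(r_mz, r_dz)
c2 = 2 * r_dz - r_mz             # shared-environment share under the same model
print(round(h2, 2), round(c2, 2))  # 0.6 0.15
```

Molecular methods reach comparable numbers without any twins, which is why the convergence of the two approaches is so hard for denialists to explain away.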
The 1970s science wars were centered on the heritability of IQ, which modern progressives like Harden now accept. Indeed, intelligence being the best-measured behavioral trait, it is the phenotype for which GWAS has given us the most information. With thousands of variants detected, polygenic scores for intelligence can powerfully predict individual outcomes. How is this information to be used?
For one, it will become patently clear that policies such as “No Child Left Behind” (the U.S. policy that required schools to demonstrate that all pupils performed above a certain level) are flawed and probably pointless (a fact that has been recognized even by Marxist scholars). Education and rearing can make people smarter only up to a point. Public intellectuals are slowly beginning to recognize that the education system does not significantly improve human capital. What makes Harvard the top school in the world is not some magic recipe for instilling knowledge into young people’s minds. Harvard students are the best because they have been selected to be so.
Prominent behavior geneticists have made a case for “precision education”, using polygenic scores to come up with personalized “education plans” for each child, an advanced form of tracking. In general, the education system could well do with more awareness of genetics and individual differences. Pushing everyone to be college-educated ruins both colleges and the employment prospects of their students. Perhaps the answer lies in a more German-like system, with most students ending up in trade schools where they are taught employable skills without aspiring to a Bachelor’s degree.
These suggestions are not likely to be implemented in the near future. Instead, we are witnessing the abolition of standardized testing, the scrapping of special programs for gifted children, and an aversion to any sort of standards. Schools across the Western world have abandoned grading outright or have effectively abolished it by giving the same grade to every pupil. In a sense, this is the logical conclusion of “No Child Left Behind”. That policy was abandoned because not every pupil could reach the set standards – the next step is to abolish the standards.
Conclusion
More than 40 years after the Sociobiology Wars, it seems as though the leftists have decisively won. Biological denialism permeates academia and society for cultural reasons that are related to our increased disconnect from reality (fewer and fewer people live in any proximity to nature or grow up in large families, making it difficult to appreciate biological facts). Simultaneously, scientific developments make it harder to ignore the reality of genetic influence. The era of GWAS has pushed the formerly obscure field of behavioral genetics into the mainstream.
Will behavior genetics play a central role in the raging culture war? Leftist critics fear that disseminating this knowledge will lead to the “justification of inequality”: the rich are rich because they are genetically gifted, while the poor are not (as some have indeed claimed). But it is not necessarily so. Findings from behavioral genetics may also be used to alleviate psychopathology (through a re-examination of diagnosis and drug targets), enhance newborn health (PGT-P), design more efficient policies (precision education), and even examine social inequality. Many scholars agree that increased genetic influence is an index of meritocracy and equality of opportunity (when environments are equalized, variance in social outcomes can only be explained by genetics). Instead of viewing heritability as a bogeyman, perhaps we should embrace genetic diversity and learn to live with it.
Sasha Ivanov is a behavior geneticist.
"But, genomic evidence suggests that ADHD and depression are extremes of normal spectra of behavior (externalizing and internalizing, respectively). The use of substances to alleviate symptoms may be advised when these conditions become detrimental. But it might be wise to reconsider defining depression as a discrete category of clinical significance."
This is almost certainly correct, but with a caveat. When we talk about "depression", we can mean many things, as there are many varieties of human unhappiness. I agree that what most people call ordinary depression is basically on the spectrum of ordinary human misery, and most "depressed" people these days are using medical language to describe fairly normal, albeit painful and unpleasant, life problems. For many of them, prescription medication is marginally helpful at best.
But severe depression – or melancholic depression, or whatever you want to call the depression that renders someone totally non-functional for weeks or months on end, sometimes leading to catatonia, psychotic depression, etc. – is clearly an illness.
Brief example: I treated an older gentleman who was admitted to the psychiatric unit for depression. He had no history of psychiatric problems at all. Over the past year or so he had been becoming increasingly depressed for no apparent reason. By the time he was admitted, he thought he was actively dying, or possibly already dead, because he didn't think he had a pulse. He was convinced he had murdered members of his family and committed all sorts of heinous crimes. None of this was true, of course. We treated him with electroconvulsive therapy (ECT), and he rapidly improved and was discharged to get back to his life.
It will also be welcome when psychiatry is developed enough to start looking at diseases from a causal perspective rather than a symptom-syndrome one. But for all its inadequacies, the DSM isn't a totally useless book. Paraphrasing from psychiatrist Kenneth Kendler, here is one example of how to usefully think about the DSM in clinical practice:
If someone comes to the ER with crushing left sided chest pain, certain EKG changes, and elevated troponin levels, we can tell them they are having a heart attack. Of course "what a heart attack is" is not those EKG changes, chest pain, and changes in blood levels. Those are indexes of the illness such that, if they are present, we can say the illness is present.
The DSM works in a similar way. If you ask "what is major depression", we wouldn't say it is literally low mood, sleep and appetite changes, low energy, etc. Those are indexes of depression just like the EKG in a heart attack.
The DSM remains agnostic on causes, which is probably a wise thing ultimately since we don't know what the causes are. But eventually we will, and the DSM will either incorporate this or will cease to be relevant.
Great stuff, thanks for posting.
Parents want the best for their children. I think many progressives will be totally willing to use PGT-P if they are already undergoing IVF.