The media bollocksed up a science story

I think people have seen on this site the way that the media has distorted the scientific data on epigenetics, with newspapers and popular books implying that adaptive environmental modification of an organism can be inherited, which constitutes a revolution in the way we think about evolution. Well, were that true it would be a revolution, but we have no evidence of such environmental modification being inherited over more than two generations, much less for any of that modification being adaptive.

This kind of distortion is pervasive in much of the press, though there are notable exceptions (the New York Times, for instance, almost always reports science accurately and responsibly).

Below is a new study from Nature Communications in which the authors looked for associations within a large group of UK residents (over 300,000) between genetic variation among individuals (“single nucleotide polymorphisms,” or SNPs) and their level of “general cognitive function” (a summary statistic of cognitive ability derived from different tests; “g” is an example of such a statistic). The paper reports that in this sample 148 independent regions of the genome were associated with cognitive function; that is, variation at each of those DNA sites was significantly correlated with cognitive ability. While some of the regions they found had previously been associated with conditions related to cognition (Alzheimer’s and Parkinson’s diseases, schizophrenia, autism), others were associated with traits having no clear relationship to cognition (body mass index, eyesight, height, weight, lung cancer). The study identifies not specific genes associated with cognition but regions of the genome, even though those regions were independent. The authors thus did not identify specific “cognition genes,” but simply chromosomal regions that might contain genes whose variation affects cognitive ability.

Further, the fact that a gene’s variation affects cognition doesn’t mean that the gene played a role in the evolution of human cognition; it means only that there’s genetic variation in that region that can affect cognition. Any gene that affects brain growth, for example, could mutate to a deleterious form that impedes cognition by screwing up development in ways large or small, yet might not have been important in the evolution of our unique cognitive abilities. Variation in such genes could nevertheless be turned up by studies like this one.

Indeed, these genes could be detected because they were variable, whereas genes adaptively affecting human cognition in our ancestors would be expected to sweep to fixation, eliminating the variation that reduces cognitive function. What we have are simply genes that contribute to the “heritability” of cognition: the proportion of variation in cognitive ability among individuals in the population due to variation in genes rather than other factors, like variation in the environment.  This heritability is around 50%, meaning that about half of the variation we see in the cognitive ability of humans is due to “heritable” variation in their genes (variation capable of being passed on to offspring), and the other half to variation in their environments and other factors.
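Heritability in this variance-partitioning sense is easy to illustrate with a toy simulation (my own sketch, not from the paper): if the genetic and environmental contributions to a trait have equal variance, the genetic share of the total variance, i.e. the heritability, comes out near 0.5.

```python
import random

random.seed(1)

# Toy model: phenotype = genetic value + independent environmental noise.
# Both contributions are drawn with variance ~1, so heritability should be ~0.5.
n = 100_000
genetic = [random.gauss(0, 1) for _ in range(n)]
environment = [random.gauss(0, 1) for _ in range(n)]
phenotype = [g + e for g, e in zip(genetic, environment)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Heritability = genetic variance / total phenotypic variance.
h2 = variance(genetic) / variance(phenotype)
print(f"estimated heritability: {h2:.2f}")  # close to 0.5
```

Note that this says nothing about any single individual; heritability is a statement about variation across a population, in a particular environment.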

So what we have here is simply an association study, and the meaning of the associations is unclear. That doesn’t mean the paper isn’t important or interesting: among other things, it could help us zero in on genes affecting, say, Parkinson’s disease, which in turn might help find a cure. Nevertheless, that’s not why the press picked this up; see below.

Click on the title to go to the paper.


First, I’ve put the list of authors below to show you how many people participated in doing the science in this study one way or the other. The number of authors per paper is growing over time, partly because work like this is intensely collaborative, requiring many labs, but also because competition for scientific status is fierce, and these days some people tend to put their names on papers when they’ve hardly done any of the work. I have no idea who did what here, but it doesn’t matter: just look at the list of authors. Author #8 is Stuart Ritchie of the University of Edinburgh, who pushed back against the misrepresentation of this paper by the press:

Here’s the series of tweets emitted by Ritchie showing how the Torygraph and other venues like the Guardian distorted the finding that some of the genetic variation affecting cognitive ability mapped to regions of the genome associated with better or worse eyesight (tweet stream here).

Remember that the genetic correlations are observed between variation in a region of the genome and cognition; it’s not strong evidence that “genes affecting eyesight” also have a pleiotropic (associated) effect on cognition. And, of course, the Guardian had to “virtue signal”, as Ritchie aptly puts it, about the dangers of studying the genetics of cognition because of its linkage to “race science”:

While being more intelligent may be linked to poor eyesight, it’s also connected with a lot of positive health benefits. Researchers found negative correlations between cognitive function and a number of health problems, including angina, lung cancer and depression.

Of course, it’s important to remember that these are all simply correlations not conclusive links. And it’s worth noting that what constitutes intelligence is subjective and can be difficult, if not impossible, to measure. Further, linking intelligence to DNA can quickly lead into bogus “race science”.

Yeah, right: I’m sure these authors are all racists who are going to promote bigotry!  The Guardian’s article on this piece and its virtue-signaling, like much of the Guardian itself, is trash.

What Ritchie demonstrates by looking at the popular reporting of his piece is how the press readily distorts reporting to favor what’s sensational, even if it’s blatantly wrong.

How to avoid this? It’s easy! Simply vet your story with scientists before it’s published, something any journalist can do but few can be arsed to. Just pick up the phone, call an author like Ritchie (or any geneticist who does association studies; there are many), and ask, “Is this right?”

Thank your lucky stars that Professor Ceiling Cat (Emeritus) is here to set you straight and point you to Ritchie’s tweets. Maybe I should charge for “sciencesplaining”!


h/t: Grania, Matthew

55 Comments

  1. Kev
    Posted June 3, 2018 at 9:53 am | Permalink

    Oi loik de Oirish spellin best: “bolloxed”.

    • Saul Sorrell-Till
      Posted June 3, 2018 at 10:05 am | Permalink

      Irish footballer Roy Keane once responded to criticism from his manager with the immortal line “stick it up your bollocks”.

      • Kev
        Posted June 3, 2018 at 5:46 pm | Permalink

        Are you familiar with “banjaxed”? Means much the same in this context but slightly more polite.

  2. Saul Sorrell-Till
    Posted June 3, 2018 at 10:01 am | Permalink

    I’ve read from various, apparently authoritative sources, some of whom I respect and trust, some of whom I respect and trust rather less, that the correlation between race and IQ is basically undeniable

    I don’t know whether they’re right. But even if they are my reaction is to wonder why anyone would actually care. Aside from perfectly legitimate scientific curiosity there seems to be no real reason to give a shit, unless you have some rather dubious ideas about how to order society.

    But more than that, even if the correlation holds, and I’m perfectly prepared to believe it does, how is it possible to take a concept like IQ seriously on this issue given the year on year increase in the average test scores? That does seem to suggest that there is something implicitly cultural about the slow creep upwards in IQ, and that certain cultures might be less conducive to getting good results in intelligence tests. Which in turn suggests that the differences between races in IQ might be more plausibly explained by cultural differences rather than genetic ones. After all, even when different races live in the same society there are still strong cultural differences.

    I don’t know if researchers have studied this but I’d like to know, preferably from someone without some political bone to pick.

    • Harrison
      Posted June 3, 2018 at 10:20 am | Permalink

      The proper response to the threat of groupish racism is individualism. Individuals are not simply the sum average of the group they belong to and trends are not destiny. Exceptional people can come from anywhere.

      • Saul Sorrell-Till
        Posted June 3, 2018 at 10:36 am | Permalink

        And it was/is a foundational belief of the Enlightenment, that individuals should not be owned by the community from which they come; that religions and countries and political belief systems, etc. didn’t have rights over their constituents. One of humanity’s best ideas, along with moussaka.

    • Kev
      Posted June 3, 2018 at 11:29 am | Permalink

Eysenck got himself into trouble in the seventies by trying to deal with the question of IQ and race.

      Measuring IQ depends on your criteria for IQ: if IQ psychologists are educated professionals (furthermore often of Jewish extraction like Eysenck), the tests that they put forward will tend to favour their own concept of intelligence (that of educated professionals, often of Jewish extraction).
      That was one of the issues that Eysenck had to face (he also got punched in the face and received death threats, so the subject was not merely “academic”).

      I know “intelligent” people who can’t even cook a meal.

      • mikeyc
        Posted June 3, 2018 at 11:56 am | Permalink

        It’s almost as if researchers on intelligence are not intelligent enough to account for these biases.

        I recommend a look into the cite Matt made below to see how IQ and IQ tests are actually used by people who study it.

      • Posted June 10, 2018 at 3:33 am | Permalink

        I guess these intelligent people have someone to cook for them, be it a family member or the cooks of some cafeteria or restaurant they frequent. Some types of inability are actually privilege.

    • Heather Hastie
      Posted June 3, 2018 at 2:21 pm | Permalink

      Re your last paragraph. There have been studies. One I vaguely remember relates to the questions. In an association question for example, the correct answer was cup + saucer. However, some races usually answered cup + table because they never used saucers and children in particular had no experience of cups and saucers. There were other examples given too.

      It’s why IQ tests now tend to rely on spatial awareness. However, that’s unfair too. As a group, it tends to favour males.

      • Adam M.
        Posted June 5, 2018 at 1:57 pm | Permalink

        Isn’t it only “unfair” if the goal is to produce equal outcomes on the test, or if you assume that groups are equally intelligent?

        If there are multiple aspects to intelligence (logical reasoning, geometrical/spatial, pattern matching, verbal/linguistic, etc.), they should all be included, but if one group does better than another on average does that really make it “unfair”? I’d think we’d want the test to be as comprehensive and accurate as possible, and then the outcomes are what they are.

        • Heather Hastie
          Posted June 5, 2018 at 3:52 pm | Permalink

          I agree they should all be included. I’m criticizing the ones that rely only on spatial ability.

  3. Paul Beard
    Posted June 3, 2018 at 10:02 am | Permalink

    Surely there must be a link between writing for the Gruniad and a loss of facility for critical thinking. Could it be — epigenetic?

    Unfortunately the Guardian has reached the point where readers can predict what it will say just by thinking of a subject.

    • Saul Sorrell-Till
      Posted June 3, 2018 at 10:20 am | Permalink

      Or reading a news item…

      The Malian Spiderman story – since it involved a centrist, vaguely Blairite French PM doing something good for an immigrant – was always going to be swiftly followed up with a Grauniad column explaining why Macron’s actions were actually ‘problematic’ and condescending, etc.

      As soon as you read a news story you know the overall rhetorical shape of any forthcoming article in The Grauniad that mentions it. It’s such a tedious, counterproductive approach.

      • Jonathan Wallace
        Posted June 4, 2018 at 8:52 am | Permalink

To be fair to the Guardian, its take on the study (I should stress I am only judging from the quoted excerpt) was far from being the most egregious.
        As to predicting what the Guardian will write on any given topic I’d say that that is again a fault that other papers are at least as guilty of. It’s none too hard to guess what the Daily Mail will have to say about many things (or the Express, Torygraph, Sun, etc).
        I used to like the Independent before it stopped having a printed edition as it employed columnists with a range of political perspectives including both right of centre (e.g. Dominic Lawson) and left (e.g. Owen Jones). It felt less like an echo chamber than many newspapers and it was stimulating to read views that you did not necessarily agree with.

  4. CHARLES A SAWICKI
    Posted June 3, 2018 at 10:48 am | Permalink

    Keep up the “sciencesplaining”!

  5. Posted June 3, 2018 at 11:40 am | Permalink

    Ritchie’s Intelligence: All That Matters is the best introduction to the science of intelligence.

  6. Mark Sturtevant
    Posted June 3, 2018 at 12:05 pm | Permalink

    At present I am a bit suspicious of the value of their central claim. In data mining, if you simply do an open-ended search for any correlation between DNA loci and cognitive function you will find it. But equally, I bet you could find genetic correlations between various DNA loci and having a mole on your right arm but not your left arm.

    • Mikeyc
      Posted June 3, 2018 at 12:46 pm | Permalink

      Mark – are you familiar with the techniques used? If so, where do you find fault? I’m a co-author on a GWAS (Genome Wide Association Study) paper linking obesity and metabolic traits to gene expression traits and can perhaps discuss this with you.

      One thing is very true – these are probabilistic claims and, as Dr PCCe said, they are only associating allele variation with variation in traits. They are not making causal claims (though that too can be estimated).

      • Mikeyc
        Posted June 3, 2018 at 12:56 pm | Permalink

Btw, just so it’s clear, I’m not challenging you. It might be an interesting and informative discussion for me to hear from someone skeptical of the techniques.

        • Mark Sturtevant
          Posted June 3, 2018 at 2:48 pm | Permalink

Thank you for the clarification. But it would be a rather one-sided discussion since I don’t understand how this study was conducted. You know probably better than I that papers are published that are basically data mining of the sort I mentioned.

    • Posted June 3, 2018 at 4:34 pm | Permalink

      You’re right about data mining, but this study is not an example. Three critical statistical features to help distinguish:

      (1) Data miners tend to look for correlations without any predetermined hypothesis in mind. The authors of this paper instead had a fixed set of hypotheses in mind (association or not between general cognitive function and SNPs). True, this generated an enormous number of hypotheses to test, so we should be very worried about the multiple testing issue.

(2) Data miners generally ignore (or downplay) the issue of multiple testing. That is, if you test enough hypotheses, you are eventually guaranteed to find statistical significance somewhere unless you adjust your threshold for significance accordingly. This study’s authors seem to be well aware of the issue and take several measures to combat the multiple testing problem. In part, they use several Bonferroni adjustments, which is one of the most conservative (stringent) methods available.

      (3) Data mining is notorious for “massaging” the dataset after performing hypothesis testing. This could look like excluding ostensible “outliers” because they destroy the correlation of interest, restricting to an ad hoc subgroup analysis, creating artificial dichotomizations (e.g. “intelligent” vs. “not”), etc. From my reading of the paper, there doesn’t appear to be any of this going on.
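The multiple-testing point in (2) can be made concrete with a back-of-the-envelope calculation (the numbers below are hypothetical round figures, not taken from the paper): testing on the order of a million SNPs at the conventional 0.05 level would produce tens of thousands of false positives even if nothing were truly associated, which is why a Bonferroni-style cutoff is used instead.

```python
# Why a GWAS needs a multiple-testing correction (hypothetical round numbers).
alpha = 0.05          # conventional per-test significance level
n_tests = 1_000_000   # rough order of magnitude of independent SNP tests

# If every test used alpha uncorrected, the expected number of false
# positives under the null (no true associations at all) would be:
expected_false_hits = alpha * n_tests  # about 50,000 spurious "hits"

# Bonferroni correction: divide alpha by the number of tests, so the chance
# of even one false positive across the whole scan stays near alpha.
bonferroni_threshold = alpha / n_tests  # 5e-8, the usual genome-wide cutoff

print(f"expected false hits, uncorrected: ~{expected_false_hits:.0f}")
print(f"Bonferroni threshold: {bonferroni_threshold:.1e}")
```

This is why the genome-wide significance threshold you see in GWAS papers is so much smaller than 0.05: it is essentially a Bonferroni adjustment for the enormous number of tests.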

      • Mark Sturtevant
        Posted June 3, 2018 at 5:05 pm | Permalink

        Ok, and thank you. This helps with the adjacent thread as well.

        • Posted June 3, 2018 at 6:31 pm | Permalink

          No problem! I always enjoy weighing in on statistical issues.

  7. Christopher
    Posted June 3, 2018 at 12:18 pm | Permalink

    Wait, so I don’t wear glasses but have numerous bouts of depression, and because of that, I’m not intelligent? Ok, makes sense. I’m sure glad these journalists have explained that clearly enough that my dumb little head could grasp it. But what I want to know is, if as I age my eyesight diminishes, will my brains grow by .29, or roughly 30%? I sure can’t wait to grow me some smarts. Then maybe I could write for the Guardian.

    • Heather Hastie
      Posted June 3, 2018 at 2:28 pm | Permalink

      The way the heading in one of those articles reads, you only have to put on glasses to become more intelligent. Needing them is irrelevant.

      Viz: ‘Study finds wearing glasses actually makes you smarter’.

      • Christopher
        Posted June 3, 2018 at 9:27 pm | Permalink

        Well then I’m going to go buy meself some reading glasses and make another attempt at Dr. Coyne’s book “Speciation”.

      • Zetopan
        Posted June 6, 2018 at 8:00 pm | Permalink

        “Study finds wearing glasses actually makes you smarter.”

        Even that great thinker Sylvester Stallone agrees with that! On a talk show many years ago Stallone was wearing “scarecrow” glasses (i.e. no lenses) and he was asked why. He stated that he liked wearing glasses. In the several years since that time I have noticed that fad arising with more and more people wearing scarecrow glasses.

        • Heather Hastie
          Posted June 7, 2018 at 3:27 pm | Permalink

          That’s just weird!

          I suppose it’s sad too. They must be deeply insecure about their intelligence.

          I’m more insecure about my looks, so I’ve worn contact lenses since I was 18 and started needing glasses.

      • Posted June 10, 2018 at 3:36 am | Permalink

        I shudder to think what would Sarah Palin be if she didn’t wear glasses.

        • Diana MacPherson
          Posted June 10, 2018 at 9:54 am | Permalink

          😹

        • Heather Hastie
          Posted June 10, 2018 at 3:10 pm | Permalink

          Ha!

  8. Posted June 3, 2018 at 12:42 pm | Permalink

    As it happens, I discussed the original publication at a local science journalists meeting because I’m interested in “general cognitive function” (who isn’t?) and there’s a bunch of Finns among the contributors: at least Ahola-Olli, Lahti, Palviainen, Kähönen, Lehtimäki, Loukola, Lyytikäinen, Rovio, Kaprio, Vuoksimaa, Palotie, Räikkönen and Rantakari.

    Funnily enough, I had glimpsed the Guardian article but until I read this WEIT post it didn’t even occur to me it was about the same study. Well, at least The Guardian gives a link to the original paper. Gail Davies isn’t mentioned, so let’s repeat her name and write it correctly: Gail Davies.

  9. Gareth
    Posted June 3, 2018 at 12:46 pm | Permalink

    My grandma needs reading glasses, she must be smarter than me.

    • Posted June 3, 2018 at 2:29 pm | Permalink

      I guess so. I began using reading glasses about the same time I grew smart enough to become a grandfather 🙂

  10. Diana MacPherson
    Posted June 3, 2018 at 1:11 pm | Permalink

It’s screw-ups like this that make people say stupid things like “I don’t trust science”

    • Gareth
      Posted June 3, 2018 at 1:15 pm | Permalink

      I only like it when it tells me what I want to hear.

    • Craw
      Posted June 3, 2018 at 8:10 pm | Permalink

      Actually, that’s a smart thing to say. The stupid thing is trusting anything else.

      • Diana MacPherson
        Posted June 3, 2018 at 9:36 pm | Permalink

        Sadly, it’s usually followed by “because they change their minds” or “because you hear about a study then never hear anything again”.

  11. Diana MacPherson
    Posted June 3, 2018 at 1:12 pm | Permalink

    Sub

  12. Taz
    Posted June 3, 2018 at 1:28 pm | Permalink

    I was going to post a comment about how I’m surprised “sciencesplaining” isn’t already a derogatory term, but I decided to google it first.

    Too late.

    • Christopher
      Posted June 3, 2018 at 2:25 pm | Permalink

      What about “godsplaining”, is that a word yet? You know, how people who’ve only read small parts of one book use it to tell you how you are wrong about everything if what you said wasn’t in their one book, or when you bring up a clear contradiction between one story and another in their book they ”godsplain” to you how you misunderstood the metaphor.

      • Heather Hastie
        Posted June 3, 2018 at 2:31 pm | Permalink

        It’s called theology.

Or just Lying For Jesus.

      • Mark Sturtevant
        Posted June 3, 2018 at 2:50 pm | Permalink

        I think that falls into the range of relitious ‘apologetics’.

        • Mark Sturtevant
          Posted June 3, 2018 at 2:50 pm | Permalink

          Religious.

  13. Peter Welch
    Posted June 3, 2018 at 2:20 pm | Permalink

    OK, OK, got it. I wear glasses, therefore I am smarter than folks who don’t. (Sound of Jerry’s head exploding.) 🙂

    • Craw
      Posted June 3, 2018 at 8:11 pm | Permalink

      No, this is wrong. You are only smarter when actually wearing the glasses. In the shower or wearing contacts your IQ is lower.

  14. Larry Winkler
    Posted June 3, 2018 at 5:26 pm | Permalink

    Coyne has stated something bigly wrong.

    “What we have are simply genes that contribute to the “heritability” of cognition: the proportion of variation in cognitive ability among individuals in the population due to variation in genes rather than other factors, like variation in the environment. This heritability is around 50%, meaning that about half of the variation we see in the cognitive ability of humans is due to “heritable” variation in their genes (variation capable of being passed on to offspring), and the other half to variation in their environments and other factors.”

    NO! Coyne seems to be stating that heritability is orthogonal to environment. This is false.

    Smallpox (or polio) has zero heritability in the US, because there is no smallpox (polio) in the US. If the environment changed, and smallpox (polio) was introduced, the heritability of smallpox (polio) would likely rocket.

    The same is true for cognitive measures, IQ, etc.

    Change the environment and heritability changes follow.

    • Posted June 3, 2018 at 5:32 pm | Permalink

      You don’t know what you’re talking about, I’m afraid. I obviously know that heritability depends on the environment in which it’s measured, but IN THAT ENVIRONMENT it is a valid measure of the proportion of variation that can be passed on. I left that out because it wasn’t relevant, and I also left out gene-environment interaction and dominance variation.

      I could have written many paragraphs on the meaning of heritability, but it wasn’t relevant to this post.

      And, by the way, I’m not called “Coyne” on this website.

    • Mikeyc
      Posted June 3, 2018 at 6:38 pm | Permalink

Dr PCCe adequately rebutted you, but I will point out that neither smallpox nor polio is heritable in any environment. They are infectious, not heritable, agents. You can be infected through your parents but you cannot inherit them. Unless you meant resistance to the viruses, in which case see the good Dr’s rebuttal to your claims.

  15. Helen Hollis
    Posted June 3, 2018 at 6:48 pm | Permalink

    I asked Dr. Coyne a question about some CPS material and he not only responded but explained his reasoning.
    Fast forward and the student in question won a certificate from both CPS and Apple for developing an app.
    Thank you again Dr. Coyne,
    for keeping a science loving kid in science

    • Mikeyc
      Posted June 3, 2018 at 7:22 pm | Permalink

      Bravo! To you and our host.

  16. Diana MacPherson
    Posted June 3, 2018 at 9:38 pm | Permalink

    Well that guy in the Twilight Zone episode didn’t feel so smart in the library, post apocalypse, when his eye glasses broke. 🙂

  17. Dale Franzwa
    Posted June 4, 2018 at 1:50 am | Permalink

    I wear eyeglasses, therefore I’m smart. = I’m smart, therefore I wear eyeglasses. Well, duh. I wear eyeglasses ‘cuz I don’t see so good without ’em. “Duh” again.

  18. bruce
    Posted June 4, 2018 at 11:34 am | Permalink

    https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect

