The Atlantic: Genes are overrated; science doesn’t progress towards truth. Me: Wrong on both counts

The Atlantic has a review of Siddhartha Mukherjee’s new book on genetics; the review is by Nathaniel Comfort, a professor at the Institute of the History of Medicine at Johns Hopkins, and carries the provocative title of “Genes are overrated.”

I haven’t yet read Mukherjee’s book, so I won’t comment on its content except to say that the reviews have been generally positive but mixed, as Comfort’s is. I want instead to concentrate briefly on Comfort’s attitude towards science and genes.

One of the criticisms Comfort levels at Mukherjee is that he holds a “whiggish” view of genetics; that is, he sees genetics’ history as being one of progressive understanding. To Comfort, that’s a misleading way of describing science, which, to him, doesn’t progress toward deeper understanding of reality—like building an edifice of understanding—but acts simply as a bulldozer, plowing under theories that are shown to be wrong. Some quotes (my emphasis):

The antidote to such Whig history is a Darwinian approach. Darwin’s great insight was that while species do change, they do not progress toward a predetermined goal: Organisms adapt to local conditions, using the tools available at the time. So too with science. What counts as an interesting or soluble scientific problem varies with time and place; today’s truth is tomorrow’s null hypothesis—and next year’s error.

. . . The point is not that this [a complex view of how genes work; see below] is the correct way to understand the genome. The point is that science is not a march toward truth. Rather, as the author John McPhee wrote in 1967, “science erases what was previously true.” Every generation of scientists mulches under yesterday’s facts to fertilize those of tomorrow.

“There is grandeur in this view of life,” insisted Darwin, despite its allowing no purpose, no goal, no chance of perfection. There is grandeur in a Darwinian view of science, too. The gene is not a Platonic ideal. It is a human idea, ever changing and always rooted in time and place. To echo Darwin himself, while this planet has gone cycling on according to the laws laid down by Copernicus, Kepler, and Newton, endless interpretations of heredity have been, and are being, evolved.

Comfort is correct that science never knows when it’s reached the absolute, never-to-be-changed truth: there is no bell that goes off in our heads saying “ding ding ding: you’re there, and need go no further.” And a truly Whiggish view of history—one that implies there’s an inevitable and unswerving path from error to truth, without any dead ends, mistakes, or roadblocks—is also a distortion, one that Matthew also criticized in his review of Mukherjee’s book in Nature.

But this doesn’t mean Comfort is right in arguing that everything we think we know will inevitably be demolished by future research. There are simply some things that are so unlikely to be falsified that we can see them not only as provisional truths, but as nearly absolute truths. A normal water molecule, for instance, has two hydrogen atoms and one oxygen atom. The Earth is about 4.6 billion years old, and life evolved on it, with all tetrapods descending from ancestral fish. Bodies attract each other with a force inversely proportional to the square of the distance between them. DNA is the purveyor of heredity, and in most organisms is a double helix. AIDS is caused by infection with a virus that attacks our immune system. You can all think of a gazillion more such “truths”—assertions that you’d bet your house on.

Yes, science refines our understanding, and some theories, like Newton’s laws, are found to be special cases of deeper theories, like quantum mechanics. But to say that science is not a march toward truth, but a simple erasure of the false, is not only simplistic, but even a bit tautological: if we keep eliminating what doesn’t stand up, and keep adumbrating new theories, we will usually arrive at a more correct understanding of nature. For example, smallpox was once thought to be due to the wrath of gods. That theory was plowed under by the view that it was spread from person to person, and then by the notion that one could prevent it via inoculation. That, in turn, led to the recognition that the disease was caused by a virus, and then to the preparation of effective vaccines using live, attenuated viruses. The result: we understand fully how to get rid of the disease, and it’s been eliminated from our planet. In what sense is this not due to progressive homing in on the truth? We can use the laws of physics to land probes on comets. In what sense is that not due to a better understanding of how bodies move and interact, and not just a dispelling of what is false?

I see this kind of postmodernism infecting a lot of scientific writing, and it’s misguided; no, it’s simply wrong. 

Comfort also errs, I think, in claiming (as Evelyn Fox Keller did in her 2000 book The Century of the Gene) that the gene is now pretty much a useless concept, both in definition and in action. (I critically reviewed that book in Nature; pdf available on request.) Comfort:

This handful of errors, drawn from a sackful of options, illustrates a larger point. The Whig interpretation of genetics is not merely ahistorical, it’s anti-scientific. If Copernicus displaced the Earth from the center of the universe and Darwin displaced humanity from the pinnacle of the organic world, a Whig history of the gene puts a kind of god back into our explanation of nature. It turns the gene into an eternal, essential thing awaiting elucidation by humans, instead of a living idea with ancestors, a development and maturation—and perhaps ultimately a death.

. . . Ironically, the more we study the genome, the more “the gene” recedes. A genome was initially defined as an organism’s complete set of genes. When I was in college, in the 1980s, humans had 100,000; today, only about 20,000 protein-coding genes are recognized. Those that remain are modular, repurposed, mixed and matched. They overlap and interleave. Some can be read forward or backward. The number of diseases understood to be caused by a single gene is shrinking; most genes’ effects on any given disease are small. Only about 1 percent of our genome encodes proteins. The rest is DNA dark matter. It is still incompletely understood, but some of it involves regulation of the genome itself. Some scientists who study non-protein-coding DNA are even moving away from the gene as a physical thing. They think of it as a “higher-order concept” or a “framework” that shifts with the needs of the cell. The old genome was a linear set of instructions, interspersed with junk; the new genome is a dynamic, three-dimensional body—as the geneticist Barbara McClintock called it, presciently, in 1983, a “sensitive organ of the cell.”

Yes, gene action is complicated, but the notion of a “gene” is not only not near death, but still extremely useful. Even if many diseases are caused by many different genes, they’re still genes, which I’ll define as “a segment of DNA that codes for a protein or an RNA molecule that regulates protein-coding genes.” In fact, there are many diseases and conditions—Landsteiner blood type, Rh type, Tay-Sachs disease, Huntington’s disease, sickle-cell anemia, color-blindness, and so on—that are caused by mutations in single genes, and can be effectively understood (and used in genetic counseling) by considering them as “single gene traits.” These are said to number over 10,000.
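The arithmetic behind counseling for such single-gene traits is just Mendelian segregation, which can be sketched in a few lines of code. (This is purely illustrative; the function and genotype labels are my own, not from the post or any book under review.)

```python
from itertools import product
from collections import Counter

def cross(parent1, parent2):
    """Return genotype probabilities for offspring of two parents.

    Each parent is a 2-character string of alleles, e.g. 'Aa'.
    Assumes simple Mendelian inheritance: each parent passes on one
    of its two alleles at random, independently of the other parent.
    """
    offspring = Counter(
        "".join(sorted(a + b))  # 'aA' and 'Aa' are the same genotype
        for a, b in product(parent1, parent2)
    )
    total = sum(offspring.values())
    return {genotype: n / total for genotype, n in offspring.items()}

# Two unaffected carriers of a recessive disease allele 'a'
# (the classic counseling situation for, e.g., Tay-Sachs):
probs = cross("Aa", "Aa")
print(probs)  # {'AA': 0.25, 'Aa': 0.5, 'aa': 0.25}
```

Under this simple model, two carriers have a 1-in-4 chance of an affected ("aa") child, which is exactly the kind of prediction that makes the single-gene concept useful in genetic counseling.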

I’ve put at the bottom a discussion from Matthew’s book, Life’s Greatest Secret, about the notion of “gene” and how it was questioned and then widely accepted.

And why the modern concept of a gene turns it into a “kind of god” baffles me. The notion of genes, and of DNA as the molecule that carries them, has been immensely useful, and “true” in the scientific sense. Does that make them into “gods”? Only to a postmodernist who resents the hegemony of scientific truth.

As for genes being a “higher order concept”, a “shifting framework” or a “three-dimensional body,” well, that’s not something that I, as a geneticist, am familiar with. Perhaps those concepts are adumbrated in the “science studies” departments—the same places where truths are seen as relative and privileged.

Let me add that most of Comfort’s review is okay, but then at the end he veers off into pomo la-la land. The usefulness of the idea of “genes” will survive: it survived Keller’s attack and will survive Comfort’s. But what I see as damaging is the notion that science doesn’t progress towards some kind of truth, or greater understanding of reality. It mystifies me how anyone familiar with the history of science can say that.

And if genes are overrated, it’s news to me. They are the bearers of heredity, the switches of development, and the coders of bodies. Without the notion of genes, and of the genetic code described so well in Matthew’s latest book, we’d be back in the days before 1900.

________

APPENDIX (!): Excerpts from Life’s Greatest Secret:

For much of the 1950s, scientists had felt uncomfortable about the word ‘gene’. In 1952, the Glasgow-based Italian geneticist Guido Pontecorvo highlighted the existence of four different definitions of the word that were regularly employed by scientists and which were sometimes mutually contradictory. A gene could refer to a self-replicating part of a chromosome, the smallest part of a chromosome that can show a mutation, the unit of physiological activity or, finally, the earliest definition of a gene – the unit of hereditary transmission. Pontecorvo questioned whether the gene could any longer be seen as a delimited part of a chromosome, and suggested instead that it was better seen as a process and that the word gene should therefore be used solely to describe the unit of physiological action.

. . . Although Pontecorvo’s suggestion was not taken up, scientists recognised the problem. The debate over words and concepts continued at the Johns Hopkins University symposium on ‘The Chemical Basis of Heredity’, which was held in June 1956. By this time it was generally accepted as a working hypothesis that all genes in all organisms were made of DNA and that the Watson–Crick double helix structure was also correct. Joshua Lederberg, a stickler for terminology, declared audaciously that ‘“gene” is no longer a useful term in exact discourse’. He would no doubt be surprised to learn that it is still being used, more than half a century later.

. . . The multiple roles of nucleic acids have expanded far beyond the initial definition of a gene as the fundamental unit of inheritance and show the inadequacy of Beadle and Tatum’s 1941 suggestion that each gene encodes an enzyme. As a consequence, some philosophers and scientists have suggested that we need a new definition of ‘gene’, and have come up with various complex alternatives. Most biologists have ignored these suggestions, just as they passed over the argument by Pontecorvo and Lederberg in the 1950s that the term ‘gene’ was obsolete.

In 2006, a group of scientists came up with a cumbersome definition of ‘gene’ that sought to cover most of the meanings: ‘A locatable region of genomic sequence, corresponding to a unit of inheritance, which is associated with regulatory regions, transcribed regions and/or other functional sequence regions’. In reality, definitions such as ‘a stretch of DNA that is transcribed into RNA’, or ‘a DNA segment that contributes to phenotype/function’, seem to work in most circumstances. There are exceptions, but biologists are used to exceptions, which are found in every area of the study of life. The chaotic varieties of elements in our genome resist simple definitions because they have evolved over billions of years and have been continually sieved by natural selection. This explains why nucleic acids and the cellular systems that are required for them to function do not have the same strictly definable nature as the fundamental units of physics or chemistry.

39 Comments

  1. merilee
    Posted May 22, 2016 at 11:57 am | Permalink

    sub

    • jimroberts
      Posted May 22, 2016 at 12:35 pm | Permalink

      sub

      • Posted May 22, 2016 at 2:43 pm | Permalink

        🚌

        • HaggisForBrains
          Posted May 23, 2016 at 5:25 am | Permalink

          That’s a little backward for you.

          • Posted May 23, 2016 at 5:54 am | Permalink

            Easier than:

            🌊 🛳

            Unfortunately, there’s no sandwich emoji. (Emojus?)

            /@

  2. Posted May 22, 2016 at 12:11 pm | Permalink

    “In fact, there are many diseases and conditions—Landsteiner blood type, Tay-Sachs disease, Huntington’s disease, sickle-cell anemia, tongue-rolling, color-blindness, and so on—that are caused by mutations in single genes…”

    I have to pick one tiny nit in an excellent post: tongue rolling is NOT a simple, single-gene trait. Sturtevant (1940) first suggested that it was partly genetic, with rolling dominant to non-rolling, but his data had several rolling offspring from two non-rolling parents (which would be impossible under his simple genetic model). Matlock (1952) found several pairs of identical twins in which one twin could roll and the other couldn’t, so it’s clearly influenced by environmental factors, not just genes. A couple of later papers also found identical twins that differed in rolling ability. Some papers have found rolling to be more common in older children, suggesting that it is a learned behavior.

    By 1965, Sturtevant was convinced by the twin data that tongue rolling was not a simple genetic character, and wrote that he was “embarrassed to see it listed in some current works as an established Mendelian case.”

    Unfortunately, tongue rolling and other human traits (attached earlobe, hitchhiker’s thumb, widow’s peak, etc.) are still used in classrooms to demonstrate Mendelian genetics, even though almost all of them are either determined by a mix of genetics and environment; are continuous characters, not dichotomous; or in some cases, show no evidence of genetic influence whatsoever.

    I’ve written about the visible human traits commonly used to teach genetics here: http://udel.edu/~mcdonald/mythintro.html .

    • Mark Sturtevant
      Posted May 22, 2016 at 12:37 pm | Permalink

      Very interesting! (And no, no relation).

    • Posted May 22, 2016 at 12:43 pm | Permalink

      Thanks, John; I stand corrected and will eliminate that. I left out attached earlobes because I knew that story wasn’t good any longer.

      • Posted May 22, 2016 at 1:56 pm | Permalink

        Yeah, the only common, visible human trait with simple Mendelian genetics is wet vs. dry earwax (wet is dominant), caused by a single amino acid substitution in the ABCC11 gene. All the other textbook examples of visible human genetic characters are either very rare (like dwarfism) or complete crap.

        It amazes me that biology teachers don’t get sued every time they tell a student that tongue rolling is dominant, and a tongue-rolling student finds out that neither parent can roll, and concludes that they’re a product of what human geneticists delicately call “unacknowledged non-paternity” (i.e., Mom had an affair). That’s one reason I use cat coat genetics to demonstrate simple Mendelian genetics; the characters really are genetic, and cats aren’t very litigious.

        • Posted May 26, 2016 at 3:55 pm | Permalink

          I think that a good Genetics 101 course can be compiled using only cat coat.

    • Diane G.
      Posted May 23, 2016 at 1:21 am | Permalink

      Fascinating, thanks!

  3. rwilsker
    Posted May 22, 2016 at 12:14 pm | Permalink

    A good antidote to this kind of nonsense about how science evolves is Isaac Asimov’s wonderful essay, “The Relativity of Wrong”.

    http://chem.tufts.edu/answersinscience/relativityofwrong.htm

    • merilee
      Posted May 22, 2016 at 12:22 pm | Permalink

      excellent!

    • rwilsker
      Posted May 22, 2016 at 1:10 pm | Permalink

      For people quickly reading my (sorry!) ambiguous comment: Asimov’s essay talks about how our scientific knowledge and ability to understand reality evolves and how nonsensical the idea that science is continually completely rewritten is.

    • Gordon Davisson
      Posted May 22, 2016 at 2:11 pm | Permalink

      Asimov gives the history of our knowledge of the shape of the Earth (flat -> spherical -> oblate spheroid -> lumpy oblate spheroid) as an example of science giving better and better approximations of reality. Appropriately, since he wrote it, we’ve found that the pattern of lumpiness he describes is only approximately right, and it’s actually more complicated than he knew.

      My favorite summary is from Piet Hein, the Danish poet/mathematician/scientist/pretty-much-everything-else:

      The road to wisdom? — Well, it’s plain
      and simple to express:
      Err
      and err
      and err again
      but less
      and less
      and less.

    • Smith Powell
      Posted May 22, 2016 at 3:50 pm | Permalink

      I was going to suggest that Comfort would profit from reading Isaac Asimov’s essay “The Relativity of Wrong” only to find that rwilsker beat me to it, so I second the suggestion.

    • HaggisForBrains
      Posted May 23, 2016 at 5:37 am | Permalink

      Asimov’s non-fiction essays are brilliant. I can recommend The Tragedy of the Moon collection.

      • Posted May 23, 2016 at 5:56 am | Permalink

        Concur. I had all of those essay collections in Coronet editions from the 1970s. I’m not sure I’ve still got them (they’d be in a box in the loft if I have).

        /@

  4. Mark Sturtevant
    Posted May 22, 2016 at 12:32 pm | Permalink

    That last paragraph quoted from Comfort was especially irksome to me, as so much of it describes a false history about our growing knowledge about our genome. He chooses the dramatic spin of things to fabricate a history that never really happened so to support his claims.
    The notion that our genome had 100,000 genes was known to be likely wrong when it was proposed. The finding that most of our genome is not ‘gene’ has also been known for a very long time, and no, the rest was never considered to be enigmatic ‘dark matter’ save to a few who chose to be ignorant because they had an agenda.

    It’s like hearing someone describe pseudohistory as if it were real history because they want to claim that we don’t know much about history.

    • Diana MacPherson
      Posted May 22, 2016 at 12:57 pm | Permalink

      I think he’s probably a crummy writer. Oh, the facts don’t match how I want to say something, so I’ll just ignore them.

    • Diane G.
      Posted May 23, 2016 at 1:24 am | Permalink

      Who knew you could strawman genes?

  5. Pascal Nelson
    Posted May 22, 2016 at 12:56 pm | Permalink

    Well said. Thank you for writing this.

  6. Diana MacPherson
    Posted May 22, 2016 at 12:56 pm | Permalink

    Articles like Comfort’s are examples of why I think there is a need to educate people about how science is done. This “how science is done” education is needed more than communicating scientific discoveries because this misunderstanding leads to bad conclusions that in turn spawn bad ideas. One example is denying global warming. People who don’t understand how science is done, don’t know who to believe. They can’t understand that it’s not a matter of opinion and someone can actually be absolutely wrong and that it’s not a matter of one party (the scientist) providing an opinion and another party providing his/her opinion….the way science is presented, people think it’s all about opinions and not facts. This is especially bad when new facts come to light and the scientific consensus shifts….to the uninformed, this is just evidence that science can’t be trusted with its “opinions”.

    And all that leads to anti-science and ultimately can lead us back to the Middle Ages.

    • Heather Hastie
      Posted May 22, 2016 at 1:50 pm | Permalink

      That’s similar to my take. I’m not going to pretend to understand the details of the science, because I don’t. But I find Comfort’s idea of how science works self-serving. He sounds similar to one of those people who insist they love science but when it comes to evolution, equally insist we were plonked here by a supernatural being.

  7. Posted May 22, 2016 at 12:58 pm | Permalink

    I think one could defend the statement that the gene is seriously overrated (by some people). Some people have a strong idea that genes themselves are the sole cause of everything about a body. Actually, genes interact with each other and with their environment. For people who understand genes, that’s not a surprise, but it makes sense to me that this realization would lead some people to swing from “genes are everything” to “genes are overrated.” I mean, they did overrate genes. Hopefully these people will eventually end up with both “genes are really really important” and “so are other things.”

  8. Gregory Kusnick
    Posted May 22, 2016 at 1:01 pm | Permalink

    How quaint of Comfort to think that “this planet has gone cycling on according to the laws laid down by Copernicus, Kepler, and Newton”. Any day now, the cycling theory will be erased and we’ll learn that planets walk to work, or take the bus. Indeed, the very notion of planets is overrated.

  9. Derek Freyberg
    Posted May 22, 2016 at 1:34 pm | Permalink

    I’m wading my way through (it’s a long and dense read) David Wootton’s “The Invention of Science”, subtitled “A new history of the scientific revolution”.
    A quote (pp. 543-4):
    “The important thing about the science of Galileo and Newton, Pascal and Boyle is that it was, in part, successful, and that it laid the foundation for future successes. They did not know what the future would hold; but they did have a clear sense of what they were trying to achieve. … The remarkable thing about science is that the process is not only cumulative but, it would seem (to make a distinction the dictionary does not recognize), accumulative. The past not only shapes the present; in science, gains made in the past are only given up ( … ) in order to be exchanged for greater gains made in the present.”
    Science may be Darwinian in the sense that bad ideas eventually die out, but it is not random.

  10. Posted May 22, 2016 at 1:41 pm | Permalink

    Comfort: “today’s truth is tomorrow’s null hypothesis—and next year’s error.”

    …Or next year’s provisional fact, which forms the basis of informed speculation, hypotheses, and ultimately scientific progress — something which never seems to get mentioned by those who argue in this manner.

    • Ken
      Posted May 22, 2016 at 2:08 pm | Permalink

      +1

      • Ben
        Posted May 22, 2016 at 8:14 pm | Permalink

        Sub

  11. Posted May 22, 2016 at 11:04 pm | Permalink

    What I find most disconcerting about Comfort’s most egregious nonsense isn’t that he sounds like Horgan’s first cousin (maybe we could look at their DNA panels to find out?), but that he would, at this point in a career writing about science, unleash such terribly errant generalizations that come across as far more splenetic than analytic. I, myself, make no claim at all to be an historian of scientific movements, but such encyclopedic knowledge is not necessary in order to confidently conclude that science has, does, and will continue to establish some consistently reliable truths, among them the ones provided by Jerry in his post. It is also quite easy to conclude that science does NOT assume that, BEFORE INVESTIGATION, the universe is neatly ordered, waiting for our discovery. What this means is that science arrives at its currently accepted truths in two general ways: by direct identification of a phenomenon that proves, through repeated investigation, to answer a question (and that phenomenon is sometimes discovered not as a result of purposive planning, but by “happy accident”); and by methodically eliminating that which is not, as Jerry also mentioned.

    What is doubly confounding is the tired duckspeaking of those archetypal postmodernist tropes in Comfort’s non-criticism of the scientific project overall. Okay. I’ll take a “risk” here: I LIKE certain aspects of postmodernism, quite a bit, and I hope I can efficiently offer a significantly different—i.e., far more palatable, if not thought-provoking—understanding of it.

    It is true that in Lyotard’s fire across the bough, postmodernism is defined as an “incredulity towards all metanarratives.” However, though this is potentially a problem from the start (more on that later), I don’t see this as a source of conflict whatsoever with scientific truths, epistemologically, methodologically, culturally, or in terms of political economy. The conflicts arise not with a more accurate and honest engagement of both scientific knowledge and postmodern theory, but with the insistence on the part of the individual writer to commit the fundamental logical error alluded to above, with regard to the “incredulity towards all metanarratives.” Indeed, if this statement is taken out of context and applied to the massive tract of human experience involved with knowledge production (let’s call that “science”), it immediately becomes that thing towards which it just professed its sneering dismissal—an overarching explanation of a class of phenomena that transcends race/class/gender/geographic/economic differences. (In case you were wondering, and I feel rather confident you weren’t, this epistemological bait-and-switch is a linguistic example of the “reification” of the idea of incredulity towards metanarratives (reification = good), but then “fetishized” when Comfort applies it to all scientific knowledge, and thus both the statement and the category of phenomena to which it is attached become primarily the interplay of power relations.) By the same token, when Derrida claimed that we had arrived at “the end of philosophy,” he was still offering an ideology critique, and was thus participating in the continuation of philosophical discourse. My point is that it is Comfort’s shallow, dishonest, and in this case opportunistic grafting of Lyotard’s idea onto genetics and science in general which is the offender here.

    Science has established a large number of truths (well, large for us homo sapiens, the johnny-come-lately species on Earth) which have undeniably served both us and the planet very well. I, too, sense in Comfort’s piece thinly veiled hostility towards a perceived–but utterly bogus–“exploitative hegemony” that is inextricably linked to scientific truths. Um, wait just a sec there: last time I checked, it was the uniformity and face validity of the scientific method that enabled access to the fruits of its disciplined, repeated practice in MULTIPLE cultures and races, classes, and all gender identifications. And Comfort should feel quite compelled to agree that such is a fundamental tenet of the godless Pomo “paradise.”

    But the most ridiculous aspect of Comfort’s equation “science = modernist power broker = liar! liar! liar!” is that his claim is based on an assumption of how scientific truths, once established, continue to “legitimate” themselves. He has boxed himself in above, and the only claims he can make without further contradicting himself are that: 1) scientific truths have often been used for exploitative ends, often with terrible ugliness for many, many people; 2) scientific “truth” is fundamentally hypocritical and dishonest once it makes ANY kind of truth claim (small or capital “t”), because all truth is subject to revision, and thus any enduring scientific truth is maintained only through hegemonic means.

    I wish I had the linguistic equivalent of a crock pot right now, so I could slow cook the nonsense out of this massive crock of s___ assessment. It is the hallmark of the scientific project to reconsider even the most strongly held ideas if and when sufficient contradictory evidence emerges. How this leads to either irrelevance or a specious conception of social control is completely unclear (because it is completely false).

    Even more telling is that the biggest scientific revolution ever (in my humble opinion of limited scope) is the establishment of the scientific method itself. At the same time, the scientific method was: 1) a previously unarticulated plan of both rigorous discipline and elegant simplicity, developed for the sake of discovering consistency of observed results, enabling the creation of a reliable record of replication (i.e. the establishment of scientific truth), while acknowledging that “failures” to confirm hypotheses aren’t necessarily failures and can be valuable in other ways; 2) much more subtly and expansively, the establishment of the SM marks the standardizing of a process of individual assessment of the outside world used for the sake of getting through a normal day. Somewhat absurdly put, if I am walking across the street, suddenly to see a big bus hurtling towards me at a frightening rate of speed, I can use the basic principles of the scientific method to conclude that if I don’t get out of the way and I try to best the bus in this competition head-on, I will lose by a significant amount. Seriously put, the establishment of the SM, as an abstracted and thus preservable, socially-accessible set of rules, also was a way to take disparate, individual “truths,” centralize them through subjecting them to a set of standards in the social sphere, and then re-disseminate reliable and valid results as scientific truths which can thus, at once, be built upon but also remain open to reconsideration upon presentation of superior information. Again, these patterns are so very PoMo, so I’m not sure why Comfort’s so uncomfy.

    Now then, Comfort would be spot on if he had directed his critique at the concentration of science away from the public sphere, the corrupting influence of politics, and the drive for “results” which “confirm” a previously disseminated corporate claim that leads to the fattening of a tax-sheltered bank account. In THAT way, I can see he would have been having dinner with Noam. (And with that, a nod to Filippo, as I am about to view the link you posted in response, so I can provide something resembling an adequate response to yours).

    • Diane G.
      Posted May 23, 2016 at 1:41 am | Permalink

      Whew, I’m going to have to revisit this to understand the case you make for postmodernism, but it’s certainly an authorial tour de force.

      Please pardon a nit pick:

      “fire across the bough”

      …not much of a sailor, are we?

      😉 (I hope I’m not just missing some intentional irony there!)

      • Posted May 24, 2016 at 6:37 pm | Permalink

        Hello Diane!

        My “authorial tour de force” notwithstanding, I am currently removing the last bits of egg from my face for the painfully awkward and careless phrase you were kind enough to point out, that I might learn proper maritime terminology. You were even kinder not to use my dreadful mixed metaphor as an occasion to roast me like a holiday ham by going all PoMo on the metaphor. You could have been uber cheeky, exploring the phrase with its “multiplicity of indeterminate meanings, at once obscured by the epistemic putrefaction of language-driven identity (and thank Nietzsche for that, eh?), yet brought into a comparative hyper-reality through its printed and/or digital reproduction; either way the textuality of the aforementioned multiplicity establishes an archaeology of etymology, and that itself creates a truly robust area of future investigation regarding what seems to be the answer to our binary madness of “meaning = episteme (late modernity, controlled by media)” vs “meaning qua meaning = ontology apart from the dead author (post-structuralism)”: meaning mediated not by refined, abstracted social rules or by a metaphysics of inference, but by (gasp!) agents who re-emerge into consciousness as the episteme of the self attains primacy through space-time–that is, meaning is now a production of onto-geography.”

        There, wasn’t that SPECIAL??? Lmao. Please don’t take this as a dismissal of my earlier post. I was quite sincere when I addressed Horgan, and it would be both a pleasure and a privilege to correspond with you at your convenience about anything of worth I might have had to offer regarding PoMo and the original article.

        The silliness above was offered merely in good humor (assuming one finds such things funny, which might be cause for concern if one does. Ahem. Moving on…)

        Is it too obvious that I was a Ph.D. student in Sociology for a while? I promise everyone here that I have since learned to effectively communicate in English. Really, I have!

        CC

  12. Matthew
    Posted May 23, 2016 at 8:21 am | Permalink

    It’s deeply disturbing to me that someone who has a doctorate and works at Johns Hopkins could be so wrong on such a basic level in his analogy between evolution by natural selection and the evolution of our understanding of science. It’s a distinction that a high school student could easily point out; i.e., that the progress of science is directed and that of natural selection is not. Science, by definition, has predetermined goals! To fail to acknowledge this is to have put your own philosophy so far ahead of facts as to be utterly blind to reality.

    General relativity didn’t plow under Newtonian mechanics, it explained Newton in terms of a more refined understanding of how the Universe operates. Quantum mechanics didn’t erase Boyle’s Law, it explained why the law works. And where a theory was simply false, greater understanding buried it forever, not just for the time being. Now that we understand oxidization, there will never be a resurgence of phlogiston theory. Scientific knowledge is a ratchet — it only moves in one direction.

    Meanwhile, natural selection, not knowing any better, will keep redesigning the same eye again and again. And postmodernist critiques of science, likewise, will keep making the same fallacious arguments.

    • Posted May 23, 2016 at 8:56 am | Permalink

      « Meanwhile, natural selection, not knowing any better, will keep redesigning the same eye again and again. »

      Yes. The fact that the cephalopod eye is better “designed” than the human (vertebrate) eye should give us pause. However, I have come across creationist websites that suggest that it *is* a good idea to have the optic nerves run in front of the rods and cones …

      /@

  13. Posted May 23, 2016 at 3:59 pm | Permalink

    I objected to Comfort’s claim that “Before Watson and Crick described the gene as a sequence of DNA…terms such as information would have been nonsensical.”

    I pointed out to him via Twitter that Schrodinger and von Neumann, among others, had described units of heredity (i.e., genes) in terms of information prior to the solution of DNA’s structure.

    Comfort’s response: “Um, yeah. Thanks, I’ve read them–as well as the secondary lit on them.”

    • Posted May 24, 2016 at 11:44 am | Permalink

      And there EFK is potentially involved again. She has a bizarre little book that is a pedestrian history for the first chapter and the third. The second, middle chapter is basically a psychoanalysis of Schrodinger’s motivation for writing _What is Life?_. Loopy stuff.

