Sometime in the mid-'90s, we published an article on Sigmund Freud (we can't remember exactly when, and our only copy of the printed piece is long gone, but the piece was reprinted in full here on S&F in August 2004) wherein we recounted an episode of the Charlie Rose show on which the now-famous Harvard cognitive neuroscientist Steven Pinker expressed, openly, antagonistically, and with no punches pulled, his utter contempt for Freudian theories of mind. In response, we wrote, in part:
No wonder Dr. Pinker was so huffy. Perhaps I'd have been a bit huffy too were I a cognitive neuroscientist who had bet the whole farm, my new BMW and solid-gold Rolex on the hard science of this [then] new discipline [cognitive neuroscience], and had then heard someone academically credentialed, and therefore to be paid attention to [another of the show's guests, Dr. Peter J. Gay, Emeritus Professor of History at Yale University, cultural historian, Freud biographer, a graduate of Western New England Institute for Psychoanalysis, and the author of the introductory commentary for W. W. Norton & Co.'s paperback edition of The Standard Edition of Freud's complete works], give credence to the strange theories of some soft-science guy who 100 years ago worked on some of the very same problems I was now working on. As I said [i.e., above, in the original article], a good explanation [of Pinker's antagonism], but somewhat short of reasonable. But then, Dr. Pinker was himself being somewhat short of reasonable. More reasonable, it seems to me, would have been for him to have remembered the lesson of history that teaches that world-transforming discoveries about the nature of man and the cosmos were, by intuition, first adumbrated by poets, philosophers, and other thinkers of genius using the same sort of metaphorical language Freud was compelled to use in order to make his revolutionary theories comprehensible.
More reasonable, also, would it have been for him to have held it no more than prudent to acknowledge that it's never wise to give short shrift to the intuitions and insights of genius, and to have taken Freud's theories as a working guide in his new research, centering one small portion of that research on seeking out possible neurobiological analogues of such things as the unconscious, repression, Oedipal strivings, psychic determinism, libido, id, ego, superego — the whole psychoanalytic menagerie.

As regular readers of S&F have probably long ago surmised, I'm an informed (as laymen go) and convinced Freudian, and believe that, in the large, Freud got most of it right first time out of the box. And it now appears that, based on their hard-science researches, even cognitive neuroscientists will be forced to do a chagrined 180 in their thinking concerning Freud's insights into, and metaphorical explanations of, the human mind and how it functions. At least, if what Jessa Crispin (she of Bookslut fame) reports is right. Ms. Crispin, in a piece for Drexel University's The Smart Set, writes:
I think we are entering a new Freudian era. This struck me as I was recently reading some stories in The New York Times Science Section: Depressive disorders may have a beneficial mechanism behind them; dreams may be meaningful after all; and hysteria — now called conversion disorders, and by which they mean the physical expression of emotional trauma — may actually exist. [...] For decades, Freud has been slowly discredited until his name is more a punchline [sic] than a scientific reference. But the more science wades into the murky territory of the mind, the more we see that we have to look backward to move forward.

By linking the above Smart Set piece, we do not mean even so much as to imply that we consider Ms. Crispin in any way competent or qualified to pass comment on any of this. We link the piece merely, and solely, because we get the sense that the articles she refers to might be a first hint that what we suggested in the last-quoted graf of our '90s print piece may at last be beginning to take place in the hard-science world of cognitive neuroscience. But perhaps that's just wishful thinking on our part.