A true believer in unification [as was Kepler, Newton, Faraday, Einstein, Heisenberg, and Schrödinger], I spent my Ph.D. years, and many more, searching for a theory of Nature that reflected the belief that all is [ultimately] one. [...] Echoing the teachings of Pythagoras and Plato, this idea carries with it an implicit aesthetic judgment that such theories are more beautiful, and, as the poet John Keats wrote in 1819, that "beauty is truth." And yet, as we investigate the experimental evidence for unification ... we find very little hard data supporting [it]. [...] Slowly, my thoughts converged into an aesthetic based on imperfection rather than perfection. [...] [I]t's time for science to let go of the old aesthetic that espouses perfection as beauty and beauty as truth.

See now why we hate the guy? And why exactly do we hate him? Because we've the sneaking, sinking suspicion he may actually be right.
[M]ythology is no toy for children. Nor is it a matter of archaic, merely scholarly concern, of no moment to modern men of action. For its symbols (whether in the tangible form of images or in the abstract form of ideas) touch and release the deepest centers of motivation, moving literate and illiterate alike, moving mobs, moving civilizations. There is a real danger, therefore, in the incongruity of focus that has brought the latest findings of technological research into the foreground of modern life [i.e., c. 1959], joining the world in a single community, while leaving the anthropological and psychological discoveries from which a commensurable moral system might have been developed in the learned publications where they first appeared. For surely it is folly to preach to children who will be riding rockets to the moon a morality and cosmology based on concepts of the Good Society and of man's place in nature that were coined before the harnessing of the horse! And the world is now far too small, and men's stake in sanity too great, for any more of those old games of Chosen Folk (whether of Jehovah, Allah, Wotan, Manu, or the Devil) by which tribesmen were sustained against their enemies in the days when the serpent still could talk.

And this, from the Introduction to Part One of the book proper:
The artist's eye, as Thomas Mann has said, has a mythical slant upon life: therefore, the mythological realm — the world of the gods and demons, the carnival of their masks and curious game of "as if" in which the festival of the lived myth abrogates all the laws of time, letting the dead swim back to life, and the "once upon a time" become the very present — we must approach and first regard with the artist's eye. For, indeed, in the primitive world, where most of the clues to the origin of mythology must be sought, the gods and demons are not conceived in the way of hard and fast positive realities. [...] [In a living mythology] there [is] a shift of view from the logic of the normal secular sphere, where things are understood to be distinct from one another, to a theatrical or play sphere, where they are accepted for what they are experienced as being and the logic is of "make believe" — "as if." We all know the convention, surely! It is a primary, spontaneous device of childhood, a magical device, by which the world can be transformed from banality to magic in a trice. And its inevitability in childhood is one of those universal characteristics of man that unite us in one family. [...] [A] highly played game of "as if" frees our mind and spirit, on the one hand, from the presumption of theology, which pretends to know the laws of God, and, on the other, from the bondage of reason, whose laws do not apply beyond the horizon of human experience. [...] [In a highly played game of "as if",] the opaque weight of the world — both of life on earth and of death, heaven, and hell — is dissolved, and the spirit freed, not from anything, for there was nothing from which to be freed except a myth too solidly believed, but for something, something fresh and new, a spontaneous act. [...] 
[I]n the play of children, where, undaunted by the banal actualities of life's meager possibilities, the spontaneous impulse of the spirit to identify itself with something other than itself for the sheer delight of play, transubstantiates the world — in which, actually, after all, things are not quite as real or permanent, terrible, important, or logical as they seem.

If any era of Homo sapiens needed to learn and understand what Campbell has to teach us in the four volumes that constitute The Masks of God, and in this first volume in particular, it's our present postmodern era that needs it most, and most urgently. The four volumes are still in print, but, to our utter astonishment, not as a set in paperback: each volume must be purchased individually. All four volumes are available for immediate shipment from Barnes and Noble (but not, amazingly enough, from Amazon, which has Volume 4, Creative Mythology, available only as a used book from independent sellers). The four volumes are: Primitive Mythology (Vol. 1), Oriental Mythology (Vol. 2), Occidental Mythology (Vol. 3), and Creative Mythology (Vol. 4). We cannot recommend to your attention too highly these four volumes that together constitute The Masks of God. As far as we're concerned, they're required reading for everyone whose IQ is larger than his belt size.
We've just finished reading theoretical physicist Brian Greene's The Fabric Of The Cosmos, an explanatory text on the nature of space and time written for a lay public that covers some of the most esoteric and recondite physics imaginable, and came away from the book feeling as though we understood everything — perfectly. For a lay public, Greene is to theoretical physics what Leonard Bernstein was to music. He's, quite simply, a prodigy; a virtuoso explicator.
We then went on to our next physics book written for a lay public: The Black Hole War by theoretical physicist Leonard Susskind, which documents, per its subtitle, his "Battle With Stephen Hawking To Make The World Safe For Quantum Mechanics". The first few chapters cover much the same introductory material as the opening chapters of Greene's book, material necessary to understand what follows. And like Greene, Susskind uses everyday examples and metaphors in place of mathematical formulae in order to make the material comprehensible to a lay public. We were tempted simply to skip these introductory chapters and move straight on to the book's main argument since, thanks to Greene, we had a perfect grasp of this material, but in the end we decided not to, and proceeded to read them anyway.
Bad decision. By the time we finished reading Susskind's introductory chapters, we were thoroughly confused. What Greene made crystal clear, Susskind muddied beyond recognition — or understanding.
Well, OK, we exaggerate, but you get the idea. We'll still finish the Susskind book because we want to read its main argument, but we'll simply forget his introductory chapters, and depend on what we learned from Greene to give us the basic background we need in order to understand that argument.
Dr. Greene, we love you to pieces, and hope you live forever — or at least long enough to write more books for inquisitive mathematics- and physics-challenged dummies such as ourself. We don't know what we'd do without you.
In 2004, physicist John Smith, along with colleagues Joe Wolfe and Elodie Joliveau, published a study in the Journal of the Acoustical Society of America that revealed for the first time the physiological cause of the so-called "soprano problem", a curious phenomenon that causes sopranos to invariably mispronounce lyrics when singing powerfully in the top half of their range. Smith and Wolfe soon realized that "composers could actually avoid the problem completely by pairing words with notes at which the vowel sounds resonated naturally in a singer’s mouth."
Smith felt that through [his] obsession with perfection, Wagner might have come to understand the relationship between vowel sounds and pitch necessary to overcome the soprano problem.
So one evening in his garden while he was recovering from surgery, Smith took up a pen and paper and went through Götterdämmerung note-by-note, lyric-by-lyric, recording which notes were paired with which vowel sounds. In the early hours of the next morning he wrote a computer program to determine with statistical certainty whether Wagner had in fact used a vowel-pitch matching technique. Looking at the program’s first results, he was amazed. There was a clear relationship.
After Smith’s discovery, he and Wolfe began analyzing more of Wagner’s work. “It’s quite a tedious job, but sitting in the garden reading Wagner is not a bad way to spend your time,” Smith says. In all, Smith and Wolfe looked at four of the composer’s works, including Tristan und Isolde and three operas from Wagner’s magnum opus, Der Ring des Nibelungen. In each case they found a statistically significant correlation between the music and lyrics [i.e., between the notes and the vowel sounds]. For comparison they also looked at operas by Mozart, Rossini, and Strauss and determined that, in these compositions, no such correlation existed.
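Smith's own program isn't described in the article, but the flavor of such an analysis can be sketched. What follows is a minimal, purely illustrative Python sketch, not Smith and Wolfe's actual method: the vowel grouping, the toy "score", and the pitch threshold are all our own invented assumptions. It asks, via a permutation test, whether a score's high notes land on open vowels more often than random vowel-note pairings would produce.

```python
import random

# Hypothetical vowel classes: we assume open vowels resonate naturally
# at high pitches, closed vowels lower down (illustrative grouping only).
OPEN_VOWELS = {"a", "o"}

def open_vowel_rate(pairs):
    """Fraction of above-median notes sung on open vowels."""
    median = sorted(p for p, _ in pairs)[len(pairs) // 2]
    high = [v for p, v in pairs if p > median]
    return sum(1 for v in high if v in OPEN_VOWELS) / len(high)

def permutation_p_value(pairs, trials=10_000, seed=0):
    """Estimate how often a random vowel-note pairing matches high notes
    to open vowels at least as strongly as the real score does."""
    rng = random.Random(seed)
    observed = open_vowel_rate(pairs)
    pitches = [p for p, _ in pairs]
    vowels = [v for _, v in pairs]
    hits = 0
    for _ in range(trials):
        rng.shuffle(vowels)
        if open_vowel_rate(list(zip(pitches, vowels))) >= observed:
            hits += 1
    return hits / trials

# Invented toy "score": (MIDI pitch, vowel) pairs, with open vowels
# deliberately concentrated on the higher notes.
score = [(60, "i"), (62, "e"), (64, "i"), (65, "e"),
         (72, "a"), (74, "o"), (76, "a"), (77, "o")]
p = permutation_p_value(score)
print(f"p = {p:.3f}")  # a small p-value means the pairing is unlikely to be chance
```

Run over a real opera's note-vowel pairs, a small p-value would be the "statistically significant correlation" the article describes; for the Mozart, Rossini, and Strauss comparisons, one would expect p-values consistent with chance.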
You all know the pentatonic scale, right? What's that? You're no trained musician and so haven't a clue?
Watch and listen to this (the audience has NOT been specially selected):
Cultural conditioning; species-wide, genetically determined specialized brain circuits; or a combination of both?
We don't really know, but we'd put our money on that last.
(Our thanks to Sarah of Inside The Classics for the video link.)
Not satisfied to be a perfectly useless, entirely superfluous, and forever inept, bungling, and budget-draining creation of the Bush Administration, the U.S. Department of Homeland Security has decided to go into the therapeutic music business.
Does the brain naturally compose melodies to rival those by Mozart or Chopin? Researchers at the US Department of Homeland Security (DHS) think so. What's more, they suggest that piano renditions of an individual's cerebral music can help in dealing with insomnia and fatigue in the aftermath of a stressful experience.
The DHS researchers on the TechSolutions programme and in the Human Factors/Behavior Science Division hope to record the brain's natural activity during periods of calm or alertness. Human Bionics — a company specialising in neurotraining in Purcellville, Virginia — will convert the signal into an audible polyphonic melody. Individuals will be asked to listen to the tracks at various times during the day to either soothe the nerves or improve concentration levels.
(Our thanks to ArtsJournal for the link.)
As most people are aware, newspaper writers hardly ever write the headlines for their own pieces, that job falling to the writers' editors or to editorial specialists. A good headline captures the core argument of the piece in a single, attention-grabbing line. The headline for this piece sure grabbed our attention.
Who woulda guessed?
[Note: This post has been updated (1) as of 5:41 AM Eastern on 11 Feb. See below.]
Blogger and musician Osbert Parsley of This Blog Will Change the World has lodged several objections to Steven Pinker's comments as quoted in this S&F post, the first of which objections is his accusation that Pinker has misquoted Virginia Woolf. Writes Mr. Parsley:
As so often happens when non-artists try to pontificate on the arts, Pinker's comment betrays a lack of appreciation for the art works themselves. But he also gets his facts wrong: Pinker cites a comment by Virginia Woolf "In or about December 1910, human nature changed". A quick search for the source of this comment turned up the original essay. What Woolf actually says is the following:
. . . On or about December 1910, human character changed. . . The change was not sudden and definite. . . but a change there was, nevertheless.
Not only has Pinker significantly misquoted Woolf, but he's removed Woolf's careful qualifiers in order to make the statement more sensational than it actually is.
Well, Pinker does seem to have misquoted Woolf by substituting "human nature" for her "human character". But the misquote is trivial (although still unforgivable considering the source), for Woolf's use of "human character" in that quote means precisely "human nature." As Woolf explains,
I am not saying that one went out, as one might into a garden, and there saw that a rose had flowered, or that a hen had laid an egg. The change was not sudden and definite like that. But a change there was, nevertheless; and since one must be arbitrary, let us date it about the year 1910. [...] In life one can see the change, if I may use a homely illustration, in the character of one's cook. The Victorian cook lived like a leviathan in the lower depths, formidable, silent, obscure, inscrutable; the Georgian cook is a creature of sunshine and fresh air; in and out of the drawing room, now to borrow the Daily Herald, now to ask advice about a hat. Do you ask for more solemn instances of the power of the human race to change [emphasis ours]? Read the Agamemnon, and see whether, in process of time, your sympathies are not almost entirely with Clytemnestra. [...] All human relations have shifted — those between masters and servants, husbands and wives, parents and children. And when human relations change there is at the same time a change in religion, conduct, politics, and literature. Let us agree to place one of these changes about the year 1910.
As Pinker rightly asserted, Virginia Woolf was wrong. Human character — human nature — "did not change in 1910, or in any year thereafter." Evolution-determined human nature hasn't changed for millennia — or, rather, has changed only in those minuscule Darwinian increments that gradually become perceptible as change only after the passage of eons of geologic time.
Mr. Parsley has further objections, all of which seem to have resulted from a misunderstanding of both Woolf and Pinker; to wit:
As I read the [Woolf] essay, Woolf's intent was to propose a more realistic portrayal of character than that of 19th-century authors, who dwelt on the surface details of characters rather than exploring their deeper humanity — or in Pinker's terms, their human nature.
That was not the intent of Woolf's essay. That was the essay's example (which Mr. Parsley gets backwards, BTW), used to contrast literature's Edwardians (her designation for literature's pre-modernist authors) with its Georgians (her designation for literature's early modernist authors). The essay's intent was to fire a warning shot over the heads of the Georgians, so to speak, by admonishing them to take a lesson from the Edwardians, and to,
[C]ome down off [your] plinths and pedestals, and describe beautifully if possible, truthfully at any rate, our Mrs. Brown [Woolf's ad hoc, made up, paradigmatic character in literature]. [S]he is an old lady of unlimited capacity and infinite variety; capable of appearing in any place; wearing any dress; saying anything and doing heaven knows what. But the things she says and the things she does and her eyes and her nose and her speech and her silence have an overwhelming fascination, for she is, of course, the spirit we live by, life itself.
But [Woolf cautions readers of the Georgian modernists] do not expect just at present [i.e., since the advent of modernism in December 1910 or thereabouts] a complete and satisfactory presentment of her. Tolerate the spasmodic, the obscure, the fragmentary, the failure [Woolf advises readers of Georgian literature]. Your help is invoked in a good cause. For I will make one final and surpassingly rash prediction — we are trembling on the verge of one of the great ages of English literature. But it can only be reached if we are determined never, never to desert Mrs. Brown.
As regards Mr. Parsley's misunderstanding of Pinker, Mr. Parsley seems to think Pinker's "denial of human nature" refers to the content of works of art, when what Pinker is saying is that the theories underlying modernist and postmodernist art exhibit a "militant denial" of evolution-determined human nature, and embrace instead the delusional (and dead wrong) blank slate theory of mind. That's the key to, and the overarching concept of, the chapter on the arts in The Blank Slate, as it is, across various instances and domains, of the entire book.
Finally, as to Mr. Parsley's assertion that Pinker,
[L]acks the familiarity with modernist art works to judge them as aesthetic objects, or to make accurate comments about the movement....
Not to put too fine a point on it, we can only say: Mr. Parsley, you're as wrong as wrong can be.
Oh, and as to Mr. Parsley's sneering at Pinker's assertion that music is "auditory cheesecake," Pinker's quite right, you know — in evolutionary terms, that is, as music plays no conceivable survival or reproductive role whatsoever in the process of natural selection, and is therefore a nonadaptive byproduct of evolution and ipso facto "cheesecake," or, to use Stephen Jay Gould's more technical term, a mere "spandrel". (There's some new thinking afoot lately that might serve to make an end run around natural selection and confer adaptive status on music and other arts via natural selection's sister process, sexual selection, but that's not pertinent here.)
Update (5:41 AM Eastern on 11 Feb): Osbert Parsley responds to the above. Our response to his response is in the comments section of his post.
[Note: This post has been updated (1) as of 11:39 PM Eastern on 9 Feb. See below.]
In his brilliant 2002 book The Blank Slate, renowned cognitive scientist Steven Pinker vivisectionally dismembers the delusional theory of mind that dominated 20th-century thinking about all human affairs and behaviors: the theory that declares the human mind a blank slate at birth, its contents written almost exclusively by experience and by the culture(s) to which it's exposed. Dr. Pinker includes a chapter on the arts that is an equally vivisectional dismemberment of the pernicious conceits underlying modernism and postmodernism in the arts, and it's must reading for all those with an interest in that domain of human endeavor.
Pinker begins the chapter thus:
The arts are in trouble. I didn't say it; they did: the critics, scholars, and (as we now say) content providers who make their living in the arts and humanities.
In this chapter I will diagnose the malaise of the arts and humanities and offer some suggestions for revitalizing them. They didn't ask me, but by their own accounts they need all the help they can get, and I believe that part of the answer lies within the theme of this book.
Indeed it does. A little further on in the chapter, Pinker continues,
In three circumscribed areas the arts really do have something to be depressed about. One is the traditions of elite art that descended from prestigious European genres, such as the music performed by symphony orchestras, the art shown in major galleries and museums, and the ballet performed by major companies.
The second is the guild of critics and cultural gatekeepers, who have seen their influence dwindle.
And the third, of course, is the groves of academe, where the foibles of the humanities departments have been fodder for satirical novels and the subject of endless fretting and analyzing.
[Y]ou can probably guess where I will seek a diagnosis for these three ailing endeavors. The giveaway may be found in a famous [c. 1920s] statement from Virginia Woolf: "In or about December 1910, human nature changed." She was referring to the new philosophy of modernism that would dominate the elite arts and criticism for much of the twentieth century, and whose denial of human nature was carried over with a vengeance to postmodernism, which seized control in its later decades. The point of this chapter is that the elite arts, criticism, and scholarship are in trouble because Woolf was wrong. Human nature did not change in 1910, or in any year thereafter.
And once again, indeed it did not. As Pinker goes on to say,
Regardless of what lies behind our [evolution-determined] instincts for art, those instincts bestow it with a transcendence of time, place, and culture. Though people can argue about whether the glass is half full or half empty, a universal human aesthetic really can be discerned beneath the variation across cultures.
And yet once again, indeed it can, a detailed and jargonless treatment of the matter being the subject of a just-published book by the philosopher of aesthetics Denis Dutton (he of Arts & Letters Daily fame) titled The Art Instinct, which is also must reading for all those with an interest in the arts.
Pinker then goes on to dissect the blank slate conditioned, leftist socio-political and ethical agenda-driven theoretical underpinnings of both modernism and postmodernism which hold beauty in art in deepest contempt, and clamor loudly for an art that will "diagnose, and cure, the sickness unto death of modern humankind" and, later, in postmodernism, clamor additionally for an art that will strip away all "elitist" claims to truth and privileged position — dogmatic, delusional doctrines that result in art that, to even be appreciated, requires "a support team of critics and theoreticians" who do not "simply evaluate and interpret art ... but suppl[y] [it] with its rationale." Pinker concludes the chapter by declaring,
The dominant theories of elite art and criticism in the twentieth century grew out of a militant denial of human nature. One legacy is ugly, baffling, and insulting art. [...] And they're surprised [i.e., the they of the opening graf] that people are staying away in droves?
We're certainly not.
Once again, must reading for all those with an interest in the arts.
Update (11:39 PM Eastern on 9 Feb): Objections to the above and our response can be read here.
Regular readers of S&F know of our abiding interest in and fascination with all matters cosmological. Well, we've just read an intriguing article in Scientific American that proposes that the big bang theory of the creation of the universe really ought to be thought of as the big bounce rather than the big bang. To wit:
Einstein’s general theory of relativity says that the universe began with the big bang singularity, a moment when all the matter we see was concentrated at a single point of infinite density. But the theory does not capture the fine, quantum structure of spacetime, which limits how tightly matter can be concentrated and how strong gravity can become. To figure out what really happened, physicists need a quantum theory of gravity.
According to one candidate for such a theory, loop quantum gravity, space is subdivided into “atoms” of volume and has a finite capacity to store matter and energy, thereby preventing true singularities from existing.
If so, time may have extended before the bang. The prebang universe may have undergone a catastrophic implosion that reached a point of maximum density and then reversed. In short, a big crunch may have led to a big bounce and then to the big bang.
We don't know how that new theory makes you feel, but it makes us feel hugely comforted, as it does away with the almost-impossible-to-fathom and maddening-to-contemplate notion that the big bang singularity came into being ex nihilo.
(Our thanks to 3 Quarks Daily for the link.)
The self-study of psychoanalysis has been something of an on-and-off preoccupation of mine that began when I was in my teens and continues to this day. And let me be very clear right at the outset: when I speak of psychoanalysis I'm speaking exclusively of the writings and theories of its creator, Sigmund Freud, and not those of his followers and successors, not the simplifications and bastardizations of psychoanalysis that call themselves psychoanalysis today, and not the myriad other psychologies and psychotherapies that go by any number of other names but still refer informally to their disciplines as psychoanalysis.
My study began not at the beginning, so to speak, but at a little past the middle, also so to speak, with my reading of Freud's slender volume, The Ego and The Id (1923). So riveted was I by what I read that I went on to read everything of Freud's on which I could lay my hot little hands — everything, that is, except those writings intended by Freud to be introductory texts on psychoanalysis for the non-specialist: Five Lectures On Psycho-Analysis (1910), Introductory Lectures On Psycho-Analysis (1916-17), and New Introductory Lectures On Psycho-Analysis (1933). Why I skipped those elementary volumes (i.e., elementary compared with Freud's theoretical and metapsychological writings intended for professionals) is something of a mystery to me, as they were among the most readily available. But skip them I did, and for reasons equally mysterious have just this past week taken up reading the latter two, and found myself astonished anew by Freud's lucidity of explication of even the most difficult material, and by his disarming candor concerning what psychoanalysis understands about the structure and dynamics of the human psychic apparatus and what it doesn't yet understand, and which types of pathological disturbances of that apparatus psychoanalysis is competent to deal with therapeutically and which not.
Also astonishing in the light of the ready availability of these volumes is just how appallingly ignorant of the fundamental theoretical concepts of psychoanalysis are so many of today's most strident critics of psychoanalysis, including those with professional credentials who ought to know better (see this 2004 S&F post for more on this). It's almost as if (perhaps precisely as if) those critics had a professional or personal investment in showing the fundamental concepts of psychoanalysis to be the mere effusions of a sexually obsessed, imaginative 19th-century writer of speculative essays rather than the fruit of the rigorous investigations of a brilliant and relentlessly self-critical physician whose discoveries of the structure and dynamics of the human mind were light-years ahead of their time even though they had to be arrived at using the limited technical methods and processes at their discoverer's disposal, and be expressed by him metaphorically using the limited technical language of the time in order to be communicated at all.
But not to worry. Strident critics notwithstanding, Freud and the fundamental concepts of psychoanalysis will ultimately prevail, and those strident critics will be recognized for the ankle-nipping Lilliputians they so clearly are.
Here's something for y'all to think — and think hard — about.
North Carolina writer Eric Wilson thinks America's current addiction to happiness threatens the arts.
Wilson writes about it in his new book, Against Happiness ($20, Farrar, Straus & Giroux), which paints a disturbing portrait of what happens to art in a world filled with "happy types."
He predicts an America of vacant smiles and bland sameness. It's a place where poetry is a Hallmark card and where music is, well, Muzak.
"I fear we're creating a country where no one would aspire to write a novel like Moby-Dick again," said Wilson. "No one would even want to read it, because who needs Moby-Dick when you've got Dr. Phil?"
Dr. Thomas Svolos, an adjunct professor and the vice chairman of the department of psychiatry at the Creighton University School of Medicine, thinks Wilson may be on to something.
It's especially true because the psychiatric community has long known about the link between artistic genius and manic-depressive disorder. History is full of examples.
Composer Ludwig van Beethoven, painter Vincent van Gogh and writer Sylvia Plath all were famous depressives. And Springsteen was purportedly in a deep malaise when he entered a lonely room in Colts Neck, N.J., in 1981 to record "Nebraska."
"When you're melancholy, you tend to step back and examine your life," Svolos said. "That kind of questioning is essential for creativity."
But for happy types, life's deeper meaning may not be an active question. Wilson makes that point in his book, and Svolos thinks it points to an even broader cultural concern.
Before the 1950s, clinical depression was considered an extremely rare mental illness, affecting less than 5 percent of the population.
Now, Svolos said, depression is a catchall term.
"It applies to everything from a life-threatening mental illness to ordinary sadness or disappointment," he said. "It's all the same and often receives the same treatment."
And that treatment often involves the use of antidepressant drugs, which have been readily available since the 1990s.
"This overemphasis on drugs has become a knee-jerk reaction that's thrown our whole concept of happiness out of whack," Svolos said. "Happiness is now seen as a lack of suffering as opposed to accomplishing important societal goals, like creating art."
Sadly, all too true — except, that is, that last imbecile bit about the creation of art being an accomplishing of an "important societal goa[l]."
(Our thanks to ArtsJournal for the link.)
When I was a kid, I latched onto the fantasy that aspirin was an undiscovered miracle drug that could prevent or cure every ailment that flesh is heir to. It was a fantasy born of fear: fear of sickness, fear of hospitals, and most of all, fear of all diagnostic medical procedures.
Fantasy or not, aspirin has through the years been my constant and best medical friend, and the very first thing I try when feeling out of sorts no matter what the symptoms. And it usually works, no matter what the symptoms.
About 30 years ago, I began taking three 325 mg aspirin tablets first thing on waking every day, along with a large glass of water and my first cuppa of the day. No particular reason for my doing that. It just seemed the right and prudent thing to do; a remnant, I suppose, of my childhood fantasy. Then, some 20 or so years ago, came the medical news that aspirin was a proven prophylactic against heart attack, and today I read this in The New York Times:
Researchers studied more than 47,000 men for 18 years. After adjusting for age, smoking, diet, physical activity and other risk factors, they found that men who took more than two standard 325 mg aspirins a week reduced their risk for colon cancer by about 21 percent compared with those who took less. Men who took 6 to 14 a week reduced their risk by 28 percent, and those who took more than 14 pills a week had a 70 percent decreased risk.
Why am I not surprised.
As an exemplar of the colossal ignorance of current-day thinking, both lay and scientific, whenever it concerns Freudian psychoanalytic theory, this article that appeared several days ago in the Brit Telegraph describing the findings of a Harvard Medical School Centre for Sleep and Cognition study on dreams sponsored by the Telegraph would be hard to beat. Reports the article’s author, the Telegraph’s science editor, Roger Highfield,
Freud called our dreams the "royal road to the unconscious". His seductive idea was that their content is shaped by experiences early in life, creating the hope that psychoanalysis could use our dreams to reveal our childhood miseries, and thereby cure our inner torment.
Today, however, a study of dreams conducted for The Daily Telegraph by Harvard University has come to the inescapable conclusion that Freud put too much emphasis on our formative years.
Although dreams are bizarre and otherworldly, they are as likely to be moulded by mundane, humdrum and everyday activities as by life-changing events.
As part of this [study], we invited visitors to our website, telegraph.co.uk, to provide details of dreams that were fresh in their mind, so that they could be analysed by Dr Erin Wamsley, a colleague of Dr Stickgold.
Almost 300 people were prepared to fill in a detailed online questionnaire, and the responses were described as "of good quality", she says. The overall findings, she reveals, "do not fit neatly with the psychoanalytic/Freudian presumption that early life experiences are a primary source of dream content".
In fact, they are much more likely to be shaped by events of the past week than a childhood trauma. "Overall, mundane, unimportant events were as likely to be identified as more significant life events – a TV commercial they had seen, or something boring that a friend said to them," says Dr Wamsley.
Indeed, even among these recent events, we failed to dwell on the most interesting in our dreams. "Contrary to the folk-psychological belief that we dream only of the most important events in our lives, the memory sources identified by participants were not necessarily events of any significance to the dreamer," explains Dr Wamsley.
"One fifth of all memory sources were described as 'not at all important' to the dreamer, while approximately half, 47 per cent, were described as being less important than an average waking event."
A classic example of a humdrum experience invading our sleep was the participant who dreamt of being at a school music lesson in which Art Garfunkel was a guest teacher, addressing his class with an Irish accent.
"He asked the class (who were all females whom I remember from years ago) to each individually sing Sound of Silence, but to make it as original and individual as possible. Though nervous, I also felt very giggly, too, mainly owing to the fact that Art Garfunkel was wearing loose white shorts (which he had borrowed from his wife), and every time he bent over, or uncrossed his legs, he exposed a mass of pubic hair."
While Freud would no doubt have seized on this as signalling a repressed childhood memory, the more prosaic explanation was that the dreamer had, earlier in the day, watched a Simon and Garfunkel video.
But this is precisely what Freud established in his seminal work; viz., that the material of the manifest dream content (i.e., that part of a dream that makes itself immediately perceptible to our consciousness) is always presented in terms of one or more innocuous experiences from the preceding 24-48 hours. That material, however, is never the psychologically significant content of the dream, which resides in what Freud termed the latent dream content; the manifest content is always the product of what Freud called the "dream work", the complex and intricate cloaking (distorting) mechanism that prevents the dream's raw, "dangerous" latent content from reaching our conscious mind directly.
Neither Mr. Highfield nor, apparently, Harvard's researcher, Dr. Wamsley, made any distinction whatsoever between a dream's manifest and latent content, an omission tantamount to a fundamental rewriting of what Freud wrote, one that ignores entirely a central pillar of Freud's argument. They then proceeded to disagree with and criticize the substance of that rewriting as if it were what Freud himself had actually written.
Why are we not surprised.
[Note: This post has been updated (1) as of 8:53 AM Eastern on 26 Oct. See below.]
No one who knows the brilliant molecular biologist and co-discoverer of the structure of DNA, James Watson, through his personal writings could have been the least surprised by the outrageous remarks that today led to his resignation as chancellor of the Cold Spring Harbor Laboratory, an institution he was largely responsible for building into the world-class research facility it has become. Through those writings — most notably the bestseller, The Double Helix, and its less interesting follow-up, Genes, Girls, and Gamow — one fast becomes aware that the man is and has always been something of a social misfit: an insufferable intellectual snob and a bourgeois bigot of the first water, albeit of the benign sort, with an unendearing childlike (childish) propensity to run off at the mouth on matters concerning his fellow human beings before first engaging his considerable brain.
Dr. Watson, who shared the 1962 Nobel Prize for describing the double-helix structure of DNA, and later headed the American government’s part in the international Human Genome Project, was quoted in The Times of London last week as suggesting that, overall, people of African descent are not as intelligent as people of European descent. In the ensuing uproar, he issued a statement apologizing “unreservedly” for the comments, adding “there is no scientific basis for such a belief.”
The man ought to be awarded the Nobel for back-pedaling.
Too bad. It’s not how we like to see and perceive our scientific heroes.
Update (8:53 AM Eastern on 26 Oct): More here.
Dr. Craig Venter, the bio-entrepreneur whose company, Celera Genomics, was involved in the race to decipher the human genetic code, has built a synthetic chromosome out of laboratory chemicals and is set to announce the creation of the first new artificial life form on Earth.
The announcement, which is expected within weeks and could come as early as Monday at the annual meeting of his scientific institute in San Diego, California, will herald a giant leap forward in the development of designer genomes. It is certain to provoke heated debate about the ethics of creating new species and could unlock the door to new energy sources and techniques to combat global warming.
Mr. [sic] Venter told the Guardian he thought this landmark would be "a very important philosophical step in the history of our species. We are going from reading our genetic code to the ability to write it. That gives us the hypothetical ability to do things never contemplated before".
The Guardian can reveal that a team of 20 top scientists assembled by Mr. [sic] Venter, led by the Nobel laureate Hamilton Smith, has already constructed a synthetic chromosome, a feat of virtuoso bio-engineering never previously achieved. Using lab-made chemicals, they have painstakingly stitched together a chromosome that is 381 genes long and contains 580,000 base pairs of genetic code.
The DNA sequence is based on the bacterium Mycoplasma genitalium which the team pared down to the bare essentials needed to support life, removing a fifth of its genetic make-up. The wholly synthetically reconstructed chromosome, which the team have christened Mycoplasma laboratorium, has been watermarked with inks for easy recognition.
It is then transplanted into a living bacterial cell and in the final stage of the process it is expected to take control of the cell and in effect become a new life form. The team of scientists has already successfully transplanted the genome of one type of bacterium into the cell of another, effectively changing the cell's species. Mr. [sic] Venter said he was "100% confident" the same technique would work for the artificially created chromosome.
RTWT here — if you have the courage.