My blog has moved!

My blog has relocated to the new address:



http://www.heyvalera.com/

June 27, 2010

Just Watched: Weeds



Intro



Andy's Plan



Masturbation lesson
(a little background: the kid's uncle is asked to teach his nephew how to masturbate, since the kid kept throwing "used" socks into the toilet)

June 19, 2010

Wired on Internet



During the winter of 2007, a UCLA professor of psychiatry named Gary Small recruited six volunteers—three experienced Web surfers and three novices—for a study on brain activity. He gave each a pair of goggles onto which Web pages could be projected. Then he slid his subjects, one by one, into the cylinder of a whole-brain magnetic resonance imager and told them to start searching the Internet. As they used a handheld keypad to Google various preselected topics—the nutritional benefits of chocolate, vacationing in the Galapagos Islands, buying a new car—the MRI scanned their brains for areas of high activation, indicated by increases in blood flow.

The two groups showed marked differences. Brain activity of the experienced surfers was far more extensive than that of the newbies, particularly in areas of the prefrontal cortex associated with problem-solving and decision-making. Small then had his subjects read normal blocks of text projected onto their goggles; in this case, scans revealed no significant difference in areas of brain activation between the two groups. The evidence suggested, then, that the distinctive neural pathways of experienced Web users had developed because of their Internet use.


The most remarkable result of the experiment emerged when Small repeated the tests six days later. In the interim, the novices had agreed to spend an hour a day online, searching the Internet. The new scans revealed that their brain activity had changed dramatically; it now resembled that of the veteran surfers. “Five hours on the Internet and the naive subjects had already rewired their brains,” Small wrote. He later repeated all the tests with 18 more volunteers and got the same results.

When first publicized, the findings were greeted with cheers. By keeping lots of brain cells buzzing, Google seemed to be making people smarter. But as Small was careful to point out, more brain activity is not necessarily better brain activity. The real revelation was how quickly and extensively Internet use reroutes people’s neural pathways. “The current explosion of digital technology not only is changing the way we live and communicate,” Small concluded, “but is rapidly and profoundly altering our brains.”

What kind of brain is the Web giving us? That question will no doubt be the subject of a great deal of research in the years ahead. Already, though, there is much we know or can surmise—and the news is quite disturbing. Dozens of studies by psychologists, neurobiologists, and educators point to the same conclusion: When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. Even as the Internet grants us easy access to vast amounts of information, it is turning us into shallower thinkers, literally changing the structure of our brain.

Back in the 1980s, when schools began investing heavily in computers, there was much enthusiasm about the apparent advantages of digital documents over paper ones. Many educators were convinced that introducing hyperlinks into text displayed on monitors would be a boon to learning. Hypertext would strengthen critical thinking, the argument went, by enabling students to switch easily between different viewpoints. Freed from the lockstep reading demanded by printed pages, readers would make all sorts of new intellectual connections between diverse works. The hyperlink would be a technology of liberation.

By the end of the decade, the enthusiasm was turning to skepticism. Research was painting a fuller, very different picture of the cognitive effects of hypertext. Navigating linked documents, it turned out, entails a lot of mental calisthenics—evaluating hyperlinks, deciding whether to click, adjusting to different formats—that are extraneous to the process of reading. Because it disrupts concentration, such activity weakens comprehension. A 1989 study showed that readers tended just to click around aimlessly when reading something that included hypertext links to other selected pieces of information. A 1990 experiment revealed that some “could not remember what they had and had not read.”

Even though the World Wide Web has made hypertext ubiquitous and presumably less startling and unfamiliar, the cognitive problems remain. Research continues to show that people who read linear text comprehend more, remember more, and learn more than those who read text peppered with links. In a 2001 study, two scholars in Canada asked 70 people to read “The Demon Lover,” a short story by Elizabeth Bowen. One group read it in a traditional linear-text format; they’d read a passage and click the word next to move ahead. A second group read a version in which they had to click on highlighted words in the text to move ahead. It took the hypertext readers longer to read the document, and they were seven times more likely to say they found it confusing. Another researcher, Erping Zhu, had people read a passage of digital prose but varied the number of links appearing in it. She then gave the readers a multiple-choice quiz and had them write a summary of what they had read. She found that comprehension declined as the number of links increased—whether or not people clicked on them. After all, whenever a link appears, your brain has to at least make the choice not to click, which is itself distracting.

A 2007 scholarly review of hypertext experiments concluded that jumping between digital documents impedes understanding. And if links are bad for concentration and comprehension, it shouldn’t be surprising that more recent research suggests that links surrounded by images, videos, and advertisements could be even worse.

In a study published in the journal Media Psychology, researchers had more than 100 volunteers watch a presentation about the country of Mali, played through a Web browser. Some watched a text-only version. Others watched a version that incorporated video. Afterward, the subjects were quizzed on the material. Compared to the multimedia viewers, the text-only viewers answered significantly more questions correctly; they also found the presentation to be more interesting, more educational, more understandable, and more enjoyable.

The depth of our intelligence hinges on our ability to transfer information from working memory, the scratch pad of consciousness, to long-term memory, the mind’s filing system. When facts and experiences enter our long-term memory, we are able to weave them into the complex ideas that give richness to our thought. But the passage from working memory to long-term memory also forms a bottleneck in our brain. Whereas long-term memory has an almost unlimited capacity, working memory can hold only a relatively small amount of information at a time. And that short-term storage is fragile: A break in our attention can sweep its contents from our mind.

Imagine filling a bathtub with a thimble; that’s the challenge involved in moving information from working memory into long-term memory. When we read a book, the information faucet provides a steady drip, which we can control by varying the pace of our reading. Through our single-minded concentration on the text, we can transfer much of the information, thimbleful by thimbleful, into long-term memory and forge the rich associations essential to the creation of knowledge and wisdom.

On the Net, we face many information faucets, all going full blast. Our little thimble overflows as we rush from tap to tap. We transfer only a small jumble of drops from different faucets, not a continuous, coherent stream.

Psychologists refer to the information flowing into our working memory as our cognitive load. When the load exceeds our mind’s ability to process and store it, we’re unable to retain the information or to draw connections with other memories. We can’t translate the new material into conceptual knowledge. Our ability to learn suffers, and our understanding remains weak. That’s why the extensive brain activity that Small discovered in Web searchers may be more a cause for concern than for celebration. It points to cognitive overload.

The Internet is an interruption system. It seizes our attention only to scramble it. There’s the problem of hypertext and the many different kinds of media coming at us simultaneously. There’s also the fact that numerous studies—including one that tracked eye movement, one that surveyed people, and even one that examined the habits displayed by users of two academic databases—show that we start to read faster and less thoroughly as soon as we go online. Plus, the Internet has a hundred ways of distracting us from our onscreen reading. Most email applications check automatically for new messages every five or 10 minutes, and people routinely click the Check for New Mail button even more frequently. Office workers often glance at their inbox 30 to 40 times an hour. Since each glance breaks our concentration and burdens our working memory, the cognitive penalty can be severe.

The penalty is amplified by what brain scientists call switching costs. Every time we shift our attention, the brain has to reorient itself, further taxing our mental resources. Many studies have shown that switching between just two tasks can add substantially to our cognitive load, impeding our thinking and increasing the likelihood that we’ll overlook or misinterpret important information. On the Internet, where we generally juggle several tasks, the switching costs pile ever higher.

The Net’s ability to monitor events and send out messages and notifications automatically is, of course, one of its great strengths as a communication technology. We rely on that capability to personalize the workings of the system, to program the vast database to respond to our particular needs, interests, and desires. We want to be interrupted, because each interruption—email, tweet, instant message, RSS headline—brings us a valuable piece of information. To turn off these alerts is to risk feeling out of touch or even socially isolated. The stream of new information also plays to our natural tendency to overemphasize the immediate. We crave the new even when we know it’s trivial.

And so we ask the Internet to keep interrupting us in ever more varied ways. We willingly accept the loss of concentration and focus, the fragmentation of our attention, and the thinning of our thoughts in return for the wealth of compelling, or at least diverting, information we receive. We rarely stop to think that it might actually make more sense just to tune it all out.

The mental consequences of our online info-crunching are not universally bad. Certain cognitive skills are strengthened by our use of computers and the Net. These tend to involve more primitive mental functions, such as hand-eye coordination, reflex response, and the processing of visual cues. One much-cited study of videogaming, published in Nature in 2003, revealed that after just 10 days of playing action games on computers, a group of young people had significantly boosted the speed with which they could shift their visual focus between various images and tasks.

It’s likely that Web browsing also strengthens brain functions related to fast-paced problem-solving, particularly when it requires spotting patterns in a welter of data. A British study of the way women search for medical information online indicated that an experienced Internet user can, at least in some cases, assess the trustworthiness and probable value of a Web page in a matter of seconds. The more we practice surfing and scanning, the more adept our brain becomes at those tasks. (Other academics, like Clay Shirky, maintain that the Web provides us with a valuable outlet for a growing “cognitive surplus”; see Cognitive Surplus: The Great Spare-Time Revolution.)

But it would be a serious mistake to look narrowly at such benefits and conclude that the Web is making us smarter. In a Science article published in early 2009, prominent developmental psychologist Patricia Greenfield reviewed more than 40 studies of the effects of various types of media on intelligence and learning ability. She concluded that “every medium develops some cognitive skills at the expense of others.” Our growing use of the Net and other screen-based technologies, she wrote, has led to the “widespread and sophisticated development of visual-spatial skills.” But those gains go hand in hand with a weakening of our capacity for the kind of “deep processing” that underpins “mindful knowledge acquisition, inductive analysis, critical thinking, imagination, and reflection.”

We know that the human brain is highly plastic; neurons and synapses change as circumstances change. When we adapt to a new cultural phenomenon, including the use of a new medium, we end up with a different brain, says Michael Merzenich, a pioneer of the field of neuroplasticity. That means our online habits continue to reverberate in the workings of our brain cells even when we’re not at a computer. We’re exercising the neural circuits devoted to skimming and multitasking while ignoring those used for reading and thinking deeply.

Last year, researchers at Stanford found signs that this shift may already be well under way. They gave a battery of cognitive tests to a group of heavy media multitaskers as well as a group of relatively light ones. They discovered that the heavy multitaskers were much more easily distracted, had significantly less control over their working memory, and were generally much less able to concentrate on a task. Intensive multitaskers are “suckers for irrelevancy,” says Clifford Nass, one of the professors who did the research. “Everything distracts them.” Merzenich offers an even bleaker assessment: As we multitask online, we are “training our brains to pay attention to the crap.”

There’s nothing wrong with absorbing information quickly and in bits and pieces. We’ve always skimmed newspapers more than we’ve read them, and we routinely run our eyes over books and magazines to get the gist of a piece of writing and decide whether it warrants more thorough reading. The ability to scan and browse is as important as the ability to read deeply and think attentively. The problem is that skimming is becoming our dominant mode of thought. Once a means to an end, a way to identify information for further study, it’s becoming an end in itself—our preferred method of both learning and analysis. Dazzled by the Net’s treasures, we are blind to the damage we may be doing to our intellectual lives and even our culture.

What we’re experiencing is, in a metaphorical sense, a reversal of the early trajectory of civilization: We are evolving from cultivators of personal knowledge into hunters and gatherers in the electronic data forest. In the process, we seem fated to sacrifice much of what makes our minds so interesting.

June 18, 2010

Saramago is dead



José Saramago Dies


By FERNANDA EBERSTADT

José Saramago, the Portuguese writer who won the Nobel Prize in Literature in 1998 with novels that combine surrealist experimentation and a kind of sardonic peasant pragmatism, has died at his home in Lanzarote in the Canary Islands, his publisher said on Friday. He was 87.
The publisher, Zeferino Coelho, told the Portuguese newspaper Publico that Mr. Saramago’s health had been deteriorating after a recent illness, but gave no other details, according to The Associated Press.
Mr. Saramago, a tall, commandingly austere man with a dry, schoolmasterly manner, gained international acclaim for novels like “Baltasar and Blimunda” and “Blindness.” (A film adaptation of “Blindness” by the Brazilian director Fernando Meirelles was released in 2008.)
Mr. Saramago was the first Portuguese-language writer to win the Nobel Prize, and more than two million copies of his books have been sold, Mr. Coelho said.
Mr. Saramago was known almost as much for his unfaltering Communism as for his fiction. In later years, Mr. Saramago used his status as a Nobel laureate to deliver lectures at international congresses around the world, accompanied by his wife, the Spanish journalist Pilar del Rio. He described globalization as the new totalitarianism and lamented contemporary democracy’s failure to stem the increasing powers of multinational corporations.
To many Americans, Mr. Saramago’s name is indissolubly associated with a statement he made while touring the West Bank in 2002, when he compared Israel’s treatment of Palestinians to the Holocaust.
As a professional novelist, Mr. Saramago was a late bloomer. (A first novel, published when he was 23, was followed by 30 years of silence.) He became a full-time writer only in his late 50s, after working variously as a garage mechanic, a welfare agency bureaucrat, a printing production manager, a proofreader, a translator and a newspaper columnist.
In 1975, a counter-coup overthrew Portugal’s Communist-led revolution of the previous year, and Mr. Saramago was fired from his job as deputy editor of the Lisbon newspaper Diário de Notícias. Overnight, along with other prominent leftists, he became virtually unemployable. “It was the best luck of my life,” he said in a 2007 interview. “It drove me to become a writer.”
His first major success was the rollicking love story “Baltasar and Blimunda.” Set in 18th-century Portugal, the novel portrays the misadventures of a trio of eccentrics threatened by the Inquisition: a heretic priest who constructs a flying machine and the two lovers who help him — a one-handed ex-soldier and a sorceress’s daughter who has X-ray vision.
The novel, published in an English translation in 1987, won Mr. Saramago a passionate international following. The critic Irving Howe, praising its union of “harsh realism” and “lyric fantasy,” described its author as “a voice of European skepticism, a connoisseur of ironies.”
“I think I hear in his prose echoes of Enlightenment sensibility, caustic and shrewd,” Mr. Howe wrote.
Asked in 2008 to assess Mr. Saramago’s achievement, the critic James Wood wrote: “Jose Saramago was both an avant-gardist and a traditionalist. His long blocks of unbroken prose, lacking conventional markers like paragraph breaks and quotation marks, could look forbidding and modernist; but his frequent habit of handing over the narration in his novels to a kind of ‘village chorus’ and what seem like peasant simplicities, allowed Saramago great flexibility.” On the one hand, Mr. Wood wrote, it allowed the writer to “revel in sheer storytelling,” while on the other, to “undermine, ironically, the very ‘truths’ and simplicities his apparently unsophisticated narrators traded in.”
Paradox was Mr. Saramago’s stock in trade. A militant atheist, he maintained that human history would have been a lot more peaceful if it weren’t for religion, yet his novels are intimately preoccupied with the question of God.
His novel “The Gospel According to Jesus Christ,” in which Jesus on the cross apologizes to mankind for God’s sins, was deemed “coruscatingly blasphemous” by some believers and deeply religious by others. When the Portuguese government, under pressure from the Catholic Church, blocked its entry for a European Literary Prize in 1992, Mr. Saramago chose to go into exile, setting up residence in the Canary Islands, a Spanish possession.
Mr. Saramago’s hard-scrabble origins did not seem to predestine him for a life of letters. Born in 1922 in the village of Azinhaga, 60 miles northeast of Lisbon, Mr. Saramago was largely raised by his maternal grandparents, while his parents sought work in the big city. In his Nobel acceptance speech, Mr. Saramago spoke admiringly of these grandparents, illiterate peasants who, in the winter, slept in the same bed as their piglets, yet who imparted to him a taste for fantasy and folklore, combined with a respect for nature.
One of Mr. Saramago’s last books — and one of his most touching — was a childhood memoir titled “Small Memories.” In it, he recounts the trauma of being transplanted from his grandparents’ rural shack to Lisbon, where his father had joined the police force. Several months later, his older brother Francisco, his only sibling, died of pneumonia.
Mr. Saramago loved to tell a story of how he came by his surname. His real family name was de Sousa. But when the 7-year-old boy showed up for his first day of school and was obliged to present his birth certificate, it was discovered that the clerk in his home village had registered him as José Saramago. “Saramago,” which means “wild radish,” a green that country people were obliged to eat in hard times, was the insulting nickname by which the novelist’s father was known.
“My father wasn’t very happy, but if that was his son’s official name, well, then he too had to take it,” he recounted in the 2007 interview. The family remained so poor, Mr. Saramago recalled in his memoirs, that every spring his mother pawned their blankets, hoping that she might be able to redeem them by the following winter. Despite being a good student, Mr. Saramago was obliged by his family’s financial straits to drop out of grammar school at 12 and switch to a vocational school, where he was trained as a car mechanic.
The most oppressive influence on Mr. Saramago, however, was one he rarely wrote about: the fascist regime that ruled Portugal from 1926 to 1974.
“The Year of the Death of Ricardo Reis,” regarded as his masterpiece, is his only novel to deal directly with the dictatorship of António de Oliveira Salazar. Set in 1936 in a Europe darkened by the ascendancies of Hitler, Mussolini, Franco and Salazar, the book tells the story of a doctor and poet living in Brazil who returns to fascist Lisbon when he hears of the death of his friend Fernando Pessoa, Portugal’s great modernist poet. What gives the book its dreamlike blend of historical reality and illusion is the fact that the title character’s name was actually one of the aliases the fantastically prolific Fernando Pessoa used to publish much of his verse. The novel, consisting of increasingly macabre encounters between the ghost of Pessoa and his fictional alter ego Reis, is a delicate meditation on identity and nothingness, poetry and power.
In his later years, Mr. Saramago’s fiction became more starkly allegorical. In novels like “Blindness,” in which an entire city is struck by a plague of sightlessness that reduces most of its citizens to barbarism, readers have found a powerful parable about the fragility of human civilization.
“Saramago for the last 25 years stood his own with any novelist of the Western world,” the critic Harold Bloom said in 2008. “He was the equal of Philip Roth, Günter Grass, Thomas Pynchon, and Don DeLillo. His genius was remarkably versatile — he was at once a great comic and a writer of shocking earnestness and grim poignancy. It is hard to believe he will not survive.”


June 16, 2010

LRB on God


The Atheists’ Picnic


Julian Bell



Conceiving God: The Cognitive Origin and Evolution of Religion by David Lewis-Williams Thames and Hudson, 320 pp, £18.95, March 2010, ISBN 978 0 500 05164 1


‘God created man.’ There are various ways you might read those words even without looking beyond the scriptures. Set them in the context of archaeology and a different reading altogether suggests itself. Primates turn recognisably human when factors beyond the reach of the senses leave traces in their behaviour. It is an intrusion of the invisible that sets homo sapiens apart from other species. This animal has the unique habit of making one thing stand for another: where prehistoric evidence of that habit shows up, we infer that the agents knew – in the way that we know, and in a way that other creatures seemingly do not – what it is like to contemplate and to relate physical objects, on some plane distinct from the objects themselves. ‘Mind’ is the obvious label for that not exactly material zone. It is not obvious, however, where mind cuts off from the larger bodilessness that we point towards when we speak of God. Perhaps objects interrelate in our individual minds because they are already interrelated within a communal mind. Perhaps understanding is located not in us alone, but in the world about us in the manner of water under the ground, of blood under the skin or the flame within fuel: perhaps it’s a stuff within objects, awaiting release. In which case, homo sapiens becomes sapient by dowsing for that flow – or, to switch metaphors, by seeking spark-points where that fire will catch to illuminate the world.
Engraved ochre block, c.75,000 BCE
Was it in some such matrix that the concepts ‘God’ and ‘man’ first arose? That seems to be the implication when, in the opening pages of Conceiving God: The Cognitive Origin and Evolution of Religion, the archaeologist David Lewis-Williams examines the earliest records of symbolic behaviour. It is unlikely we will ever pinpoint just when the human habit of investing objects with significance took hold. But for the moment, until fresh discoveries arrive, the most striking evidence we have comes from the Blombos cave on South Africa’s Cape coast, which was inhabited some 75,000 years ago. Another site along the same coast, Pinnacle Point, indicates that anatomically modern humans had already added an interest in collecting chunks of red ochre to their long-standing use of hand axes and of fire nearly a hundred thousand years before that. It was occupants of Blombos, however, who fashioned what Lewis-Williams calls ‘the world’s oldest objets d’art’, when they scored a linear design onto two small blocks of ochre, each the size of a box of cook’s matches, making their marks on one of the long narrow faces – on the striking panel, as it were. What was the purpose of the patterns? Perhaps, Lewis-Williams writes, they ‘refer to something inside the ochre’ – a pre-eminently symbolic substance, he suggests, in its intense inherent redness – ‘rather as the title on a book spine refers to what is inside the volume’. He pursues the speculation further: ‘We may have here an early hint of an important component of religious thought: immanence. Gods and supernatural powers can be inside statues, mountains, lakes, seas, nature itself, and of course people.’
We also have here – though the fact does not concern Lewis-Williams – our earliest evidence of systematic line-drawing. The ends of the blocks were flattened to make planes suitable for patterning. The closed off criss-cross hatchings rendered on one block roughly correspond to those on the other; the intervals and intersections of the scorings aim at regularity. Such patterning makes best sense as a conscious stab at perfection. Beyond each scratch that we can see, there lay an intended ideal geometry, luring the draughtsman on. A pure, complete and cohesive network of interrelations was evidently the endpoint. If we take these little chunks of red mineral seriously – if we accept that as 3D objects, they possessed a symbolic charge that set them apart from their environment and that they were in effect keys to unlock its meanings – then we might also consider that the scratches on the objects’ 2D planes had to do with thought as a weightless, abstract thread, one fit to weave the environment around it into unity. Religion, Lewis-Williams might have added, has always shuffled from one foot to the other: here it isolates some object, place or person and declares them special, there it proposes to embrace everything in a resolved coherence.
Lewis-Williams’s pen-picture of the Blombos cave, on a high cliff overlooking the Indian Ocean, is brisk and vivid – a testimony to skills honed over fifty years of lecturing. His career in archaeology began elsewhere in South Africa, in pursuit of the nation’s most haunting relics, the vast corpus of rock paintings left by the retreating San (or Bushmen) after farmers drove them from the land during the mid-19th century. As of the 1960s, this field of research was still being freshly harvested. It was Lewis-Williams’s senior colleague Patricia Vinnicombe who brought out the foundational study of San rock art, People of the Eland, in 1976. Her book, recently republished,[*] is a monumentally thorough documentation of exquisite, eloquent paintings and, moreover, a remarkable biographical document. Vinnicombe set out as a girl on horseback, trekking the Drakensberg hills above her own family’s farm: the figure-filled pictures she noticed on the rocks sent her poring through grim newspaper reports of dispossession and massacre from just a couple of generations back, with which many of the depicted scenes turned out uncannily to correspond. But her ever widening investigations of San life and thought shifted her interests towards a structuralist theorisation of the field: by the book’s conclusion, she was serenely contemplating a Lévi-Straussian cat’s cradle of cultural oppositions.
Lewis-Williams pursued a contrary route. He sifted the ethnographic evidence – picture tracings, testimonies from the last of the Drakensberg’s departed San, field reports on those still living in the Kalahari desert – until he had to his own satisfaction isolated its dominant theme: the fact that trances, induced by communal dancing, led the San to believe that they lived in a tiered cosmos. Whatever they saw happening at ground level related to what happened in the skies above and in the earth below, and their paintings were a means to connect with those spirit-ruled zones. Lewis-Williams wanted not so much to explore how that cosmology was expressed as to determine exactly what made these people think this way. The task required that he cross disciplines: it lay, in a word, in neurology. From the late 1980s, he became a frontman for an approach to the study of hunter-gatherer societies headlined by the words ‘entoptics’ and ‘shamanism’. Entoptics is what you see as you enter a trance: shapes generated by activity in your own stimulated brain, either nakedly geometric in appearance or dressed up in costumes from your culture’s mythic wardrobe, often followed by wholly hallucinatory scenes and stories. Lewis-Williams claimed to identify these little twists of neural scrap metal – zigzagged, chequered or cupular – in the rock art, not just of the San, but of other small-scale societies across time and space. The study of such ‘shamanic’ cultures, in which affairs typically coalesced around a cosmos-hopping thaumaturge, had major historical implications. For this was the way all our ancestors originally lived: the San were our nearest available witnesses to the distant and elusive palaeolithic age.
The Mind in the Cave (2002) put Lewis-Williams’s researches before the general public. The famous subterranean art shows of southern France and Spain, painted between 30,000 and 12,000 years ago, became the foreground of the discussion. When hunter-gatherers plunged inside the limestone flanks of the Dordogne and the Ariège, Lewis-Williams argued, they believed they were passing into a spirit zone. For them, the walls of the caves were ‘membranes’: running their eyes and hands over the passages’ bumps and hollows, they sensed the ghost-animals who dwelled within the rock. Painting them, ‘they reached out to their emotionally charged visions and tried to touch them, to hold them in place … They were not inventing images. They were merely touching what was already there.’ Lewis-Williams’s fetchingly empathetic interpretations of the art were underpinned by schemata of how minds operate. The ‘entoptic’ zigzags and chequers appearing on Lascaux’s walls alongside the visionary bison suggest that the cave-art complex was intended to sustain an acutely ‘intensified consciousness’, a state of mind which lifted upwards from the thought patterns of daily activity rather than falling down away from them in the manner of sleep and dreaming. At the same time, the actual construction of that palaeolithic Gesamtkunstwerk indicates considerable social organisation – the artists must have built platforms to paint the 13-foot-high ceiling of its ‘Hall of the Bulls’ – and hence must point us to a proto-politics, centred around a master of ceremonies. Lewis-Williams’s book bound the neurological and the sociological together into a single argument.
His feat was all the smarter for being highly self-conscious. In its course, the professor pushed his way forward through the history of palaeolithic studies, appraising and criticising predecessors, before borrowing from the philosopher Alison Wylie a name for his own method. He was ‘cabling’, he claimed: that is, ‘intertwining multiple strands of evidence’. Where – as so often in archaeology – one such strand was weak and incomplete, strands from associated fields of research could carry the weight instead, allowing the story to clamber up towards a plateau of coherence. What height, then, had Lewis-Williams arrived at? Not actually at ‘an origin of image-making’, as one of the book’s chapter titles brashly claimed: he never fully told us why one cave-goer should first bring paint to the rockface nor why a second should make sense of the resultant markings, let alone why they and the San rendered their spirit-animals with such marvellous lyrical naturalism. (Besides, 45,000 years separate the Blombos engravings from the earliest cave art, a gap from which we have next to no evidence of complex symbolism: our ignorance remains stupendous.)
And yet the integrated perspective gained by Lewis-Williams’s cabling of disciplines opened up plenty of interesting new questions. If at Lascaux we glimpse the beginnings of social stratification, how did that process develop, and how did it interact with the arrival of agriculture? Inside the Neolithic Mind (2005), which Lewis-Williams wrote with a colleague, David Pearce, carried the inquiry over to the megalithic sites: first to the astonishing ‘temple’ constructed around 9600 BCE at Göbekli Tepe in south-eastern Turkey, one of the major archaeological discoveries of our era; then on from the Near East to Stonehenge and Newgrange, and hence into the third millennium BCE. This successor volume, equally rich in bold conjectures, took care to justify and qualify the seemingly provocative reductionism of the neurological approach. Their argument, they pleaded, was ‘in no way deterministic’, for ‘all the stages and experiences of consciousness that we distinguish are mediated by culture.’
It turns out, now, that Lewis-Williams wants to carry his interpretation of history forward yet again – through the classical world and on into the present. It turns out that in the course of his long scientific career he has been assiduously going through the library of great thinkers and making critical notes. It turns out that in his mid-seventies he feels there is something he ought to let the world know. He wants to make it clear that the anthropological empathy colouring his earlier writings was merely a heuristic ploy. Let there be no doubt: those San trance-dancers, those cave-painters, those megalith-erectors – those countless subscribers, across the millennia, to the epiphenomena of an intensified consciousness – have all been in the grip of a grievous and fundamental error. The process of identifying that error started in 585 BCE, when Thales of Miletus combined observations and mathematics correctly to predict a solar eclipse. From then onwards, very haltingly at first and yet, from the 17th century, inexorably, scientific reason has been rolling back the brain’s wayward tendencies. For sure, those tendencies are innate, but then so is the appendix: from a modern Darwinian perspective, they’re similarly anomalous features of the organism. Surely by now those who would erect an ontology on them are looking pretty cornered. Sooner or later, they’ll be forced into saying it. Go on, it’s simple: ‘There is no God.’
And so the intrepid mountaineer hauls himself up over the final overhang – and collapses into a company of picnickers. Richard Dawkins, Daniel Dennett, Christopher Hitchens and Sam Harris motored up to his chosen summit a while ago; and here, sure enough, stepping forward to pat the newcomer on the back and welcome him along, who should it be but Philip Pullman? ‘Magnificent … a sane, courteous and devastating criticism of religion,’ reads his statutory puff on the dust jacket of the latest addition to the New Atheist library. Lewis-Williams, however, makes a hesitant arrival. The seasoned communicator of archaeologists’ excitements, the masterly critic of methodologies, he has too dry a temper for any popular cut and thrust. In his preface he expresses scepticism about the effectiveness of his fellow atheists’ tracts and claims that he’s joining their company ‘reluctantly’.
Yet within a few pages, he too feels obliged to hack and slash his way through Western history. Overshadowed by déjà vu and a weary sense of duty, his efforts make for stale, sour reading. The usual suspects are named: authoritarian Plato; ‘a man of the first century AD named Saul’ who ‘hailed from Tarsus’ (what age group is this pitched at?); the ‘wily’ Emperor Constantine; and Augustine and Aquinas, with ‘their obsessed, twisted minds’. At last, after the benighted Middle Ages (‘another country, another world, and a distasteful one at that’), glimmers of reason start to shine through (‘all in all, Newton was a man of mixed beliefs’) before Darwin sheds his sunlight. And yet even today, we find supposed men of reason – those theologians across campus – determined to go on blundering around in the dark!
Lewis-Williams sounds not just cross with most of history, but fed up with everything from ritual dances and cathedrals (‘they all come down to tinkering with the neurology of the brain’) to Eastern meditation (‘no more than consciousness fiddling’). He cares nothing for level-headedness or, pace Philip Pullman, for courtesy. ‘How ghastly,’ he yelps, gawping at the ritual bloodlettings of the Maya. In fact, he sounds pretty peeved with the universe itself. ‘The world and all that is in it is actually a higgledy-piggledy, wasteful mess … Evolutionary history is littered with wasteful, meaningless dead-ends. Perhaps human beings are another.’
Possibly – but it’s a poor way to win them over. No doubt Lewis-Williams should have taken lessons in rhetorical comportment from the bullish, sanguine Daniel Dennett, whose Breaking the Spell (2006) is by far the best mannered of the atheist tracts. ‘The world is sacred,’ Dennett paradoxically affirms, even while he tries to subsume sacredness within a story of Darwinian causation: yes, he urges his all too God-respecting fellow Americans, you could at once be ontologically virtuous and still have your epiphanic cakes and ale. Nonetheless, when it comes to content, the grumpy neuro-archaeologist may just have the edge over the jaunty philosopher. Dennett’s speculations as to how religion might have evolved among palaeolithic humans, drawing on the work of anthropologists such as Pascal Boyer and Scott Atran, hardly come together in a reliable ‘cabling’: rather, his narrative is a flimsy clutch of ‘what ifs’, patched up with this or that ‘plausible candidate for filling in the blank’, as he himself puts it. Religion must certainly have evolved, but it is anything but certain that Dennett’s book tells us how. In the middle three chapters of Conceiving God’s ten – the book’s most worthwhile section – Lewis-Williams gives the issues a tighter conceptualisation. God, he reckons, has been a persistent presence in human affairs because a trinity of interlocking categories sustains him. ‘Religious experience’, tied to certain brain states, generates socially arbitrated systems of ‘religious belief’, and these get expressed in concrete, hierarchical forms of ‘religious practice’ – which in turn foment new experiences. His familiarity with the neurology of trances and shamanic ‘inner journeys’ makes the chapter on ‘experience’ particularly incisive.
Does his conceptualisation here support his assertion that there is no God? Clearly, if it works, it adds a level to our preceding descriptions of religion. It doesn’t however cancel them. If you tell me, ‘The thermometer reads -7º centigrade,’ you aren’t superseding my remark that ‘it’s bitterly cold.’ When you say ‘I sense the presence of God’ and I say, ‘The right temporal lobe of your brain is aroused,’ we have no cause to argue. What, then, is the order of priority here? The pervasive coloration of affairs with intention and emotion that’s implicit in calling the cold ‘bitter’ – or for that matter in remarks on the cosmic weather, ranging from ‘God loves us’ to ‘the world is a wasteful mess’ – is fundamental to any narrative of our personal experience and will not be wished away. And without personal experience we have nothing: personal experience comes first. At the same time it constantly awaits correction. When the temperature drops to -12º C that ‘bitter’ may acquire a retrospective sweetness; equally, you may come to realise that your inward vision left you blind to your next-door neighbour’s distress. Against that perspectival flux, the abstract linear registration of the thermometer – and of science in general – seems to offer a necessary neutral constant.
To ring-fence events from scientific description, as the religious-minded often do, to insist that, here and there, spiritual weeds ‘miraculously’ burst through the world’s physical tarmac, is essentially incoherent. At root, there can only be one structure of causation – call it physical, call it spiritual, call it what you like – because that’s what causation is: it’s how all events are temporally related. It seems, however, that we have at least two indispensable ways to describe events and that these track each other somewhat inadequately. Maybe we can think of them as ‘levels’ – accepting, like the San, that we inhabit a tiered cosmos – but somehow we have to keep both in view. That, it seems to me, is one of the challenges taken on by theology: to attend to the personal, intention-suffused perspective that seizes on a given place (or person, or chunk of bright ochre, or archaeological discovery) and affirms it to be crucial, at the same time reconciling it with the contemplation of place-neutral, ideal, linear pattern, the type of vision promised by science.
I think theology enrages Lewis-Williams and his fellow New Atheists, partly no doubt because they would prefer their enemies to be stupid, but chiefly because it assails their own sentimental ring-fencing: they zealously cherish their apprehensions of a pure, intention-free Darwinian universal story. Such a vision of the cosmos can be beautiful and spiritually consoling – it is, after all, a prospect of selflessness – and they long for nothing to disrupt it. Threatened, claustrophobically, by any reminder that the totality of experience is stained through and through with feeling and that large swathes of human behaviour make no sense without that acknowledgment, the atheists fire barrages of angry, distracting flak – witness Lewis-Williams’s specious claim that ‘supernaturalism inevitably leads to oppressive government’ – but they also, more reasonably, ask their assailants to spell out what character any creator of such a world as this could have.
That question forms the flipside of the scientific question that Lewis-Williams tries to answer – ‘Why are human beings typically religious?’ – and probably neither of them will ever get a wholly satisfying reply. Suppose I am lying very ill in a hospital ward. The viruses in my body and those dragging my neighbour to an agonising death, not to mention the cold pasta bake and the Daily Mail on my bedside tray: all these, by any intelligible theology, are either as God wills, or ‘in’ his nature somehow – he’s answerable for them, anyway. Suppose that, praying to God, I gain the strength to make an unlikely recovery; and suppose I then thank God for his mercy. If I do so, I do not assert that my neighbour failed to pray hard enough, nor that the hospital should now drop the forms of medical care that – in the staff’s view – enabled me to live. I simply claim that this is the minimal necessary story into which my own experience fits. My recovery is my chunk of ochre, my place to start. If my prayer of thanks for it could be recast as a statement, it might not exactly be refutable, but it would be patently incomplete and I would have no idea how to perfect it. Nonetheless, all causes and events most certainly form a single fabric, and seeking to live, trying to retain a conscious purchase on that unity, I implicitly state both that I stand dependent to it and that it is precious. There is a way to sing those statements: ‘God created me, and God is good,’ the voices go; though you may prefer to stand aside from the song.




June 14, 2010

LRB on Sonnets



Toolkit for Tinkerers

Colin Burrow

The Art of the Sonnet by Stephen Burt and David Mikics. Harvard, 451 pp, £25.95, May 2010, ISBN 978 0 674 04814 0
Sonnets have no rival. They’ve been written about kingfishers, love, squirrels, the moon (too often), God, despair, more love, grief, exultation, time, decay, church bells beyond the stars heard, war, statues, castles, rivers, revolutionaries, architecture, madness, seascapes, letters, kisses, and more or less everything else from apocalypse to zoos. Since its invention in 13th-century Sicily the sonnet has been the most versatile and enduring of poetic forms. It has been pumped with inscape and instress by Gerard Manley Hopkins, filled with sentiment by Anna Seward, cut and pasted by Ted Berrigan (his 1964 Sonnets were apparently assembled with the help of the 1960s equivalent of a Pritt Stick), and worked into a tortuous frenzy by Michelangelo. Blank verse and the heroic couplet, the staples of English versification from the 16th to the 19th century, seem small-timers by comparison. Sestinas have come and gone. Ottava rima and rhyme royal had their day, but lost favour when readers ceased to want long poems which combined storytelling with epigrammatic cleverness. Even now, when set poetic forms are generally snarled or snored at, the sonnet is probably the only verse shape that almost all literate people would be able to identify, if only through having seen Shakespeare’s ‘Let me not to the marriage of true minds’ printed in the order of service at weddings. Most people of a certain age could recite a sonnet or two by Wilfred Owen, or Keats, or Shakespeare.

How did this half-page filler, a half-pint form of a mere 14 lines, come to be so successful? From a reader’s point of view the answer is obvious. A sonnet is short enough not to get lost in but long enough to encompass at least one thought and probably a counter-thought too. They’re teachable and learnable. From a poet’s perspective the sonnet is a dream of organised flexibility, offering both liberties and bounds. For large portions of English literary history the word ‘sonnet’ could be used to describe more or less any short poem, but even the narrow definition favoured by the OED (‘14 decasyllabic lines, with rhymes arranged according to one or other of certain definite schemes’) allows for ingenious transformation. Fourteen is a good number to divide up. It can yield three quatrains and a couplet (the so-called Shakespearean sonnet, actually first used in the late Henrician period), or an octave and a sestet (the Petrarchan form, actually found as early as Dante), or even a set of seven couplets. Within each of these variations there may be further variations: do the quatrains hide a couplet within them (abba) or do they make up a couplet of rhymes (abab)? Should ‘a’ and ‘b’ rhymes dominate the octave, or can ‘c’ and ‘d’ jostle their way in too? The sonnet is a toolkit for tinkerers, and its formal flexibility can be matched by shifts and tricks in argument: poets who liked to turn things upside down could begin with a six-line rhymed unit (as Shelley did to evoke the topsy-turvy world of ‘England in 1819’), or use quatrain to refute quatrain, or break up the quatrains into couplets in quizzical dialogue with each other – as Alison Brackenbury does in her mischievous sonnet of 2004 called ‘Homework. Write a Sonnet. About Love?’
When poets have written about the sonnet they have tended to represent it as a small orderly space. Samuel Daniel is often quoted as having said: ‘is it not most delightful to see much excellently ordered in a small room?’ But he preceded that remark by commenting ‘especially seeing our passions are often without measure’, and potentially measureless freedoms also seem to come into poets’ minds when they think of the sonnet. That’s probably true even of Wordsworth’s declaration that ‘Nuns fret not at their convent’s narrow room’ and that ‘’twas pastime to be bound/Within the Sonnet’s scanty plot of ground.’ As Leigh Hunt drily noted in The Book of the Sonnet (1867), ‘thousands of nuns, there is no doubt, have fretted horribly, and do fret.’ That surely was part of Wordsworth’s point: a sonnet is not just an orderly space, but one which contains a passion or a thought fretting to get out. And that’s why, throughout its history, the sonnet has appealed to people who think of themselves as innovators or modernists – as Petrarch, Sidney, Dante, Michelangelo and Shakespeare did, as well as more recent experimenters such as Hopkins, John Berryman or the sub-Prynnean Tony Lopez. Donne and Hopkins used sonnets as vehicles for religious anguish because it’s so easy to suggest that they’re buckling under pressure, that the spirit will not run true to the form, or to God. The sonnet has a structure that invites mild rebellion. Its formal restrictions suggest less the unfretful Mother Teresa than the Julie Andrews kind of nun, who might just want to rip off the wimple and sing.
Stephen Burt and David Mikics’s collection of 100 sonnets through the ages is heavily weighted towards poems from the 20th and 21st centuries, and also towards some occasionally groan-worthy American poems – though perhaps hearts less jaded than mine leap up at Emma Lazarus’s effusion on the Statue of Liberty (‘Give me your tired, your poor,/Your huddled masses yearning to breathe free’). Many of the more recent poems in the collection fret at the discipline of the sonnet form, and several transform it into a vehicle for poetic liberty. At the more extreme end is Elizabeth Bishop’s ‘Sonnet’, which, with 14 half-rhymed or unrhymed lines of six syllables or fewer, looks like the left-hand half of a sonnet cracked down the middle. It begins ‘Caught – the bubble/in the spirit level’ and goes on ‘Freed – the broken/thermometer’s mercury’. It asks us to wonder if being captured is actually preferable to being freed: mercury cannot measure without a thermometer around it, nor can spirits level without a vial of glass; but mercury freed – quicksilver – is fun. For Bishop the word ‘sonnet’ suggests the benefits and costs of enclosing the amorphous within a fragile container. Amy Clampitt’s ‘The Cormorant in Its Element’ is more obedient, but it stages a natural rebellion by diving from the octave to the sestet with a hyphen, as the
            cormorant astounding-
ly, in one sleek involuted arabesque, a vertical
turn on a dime, goes into that inimitable
vanishing-and-emerging-from-under-the-briny-deep act.
One reason sonnets have come to be thought of as the natural form of what we now call ‘lyric poetry’ is that they can imply stories, or gesture to wider truths which might be immanent in a simple daily action, like praying, grieving, cutting hay, watching birds or reading a letter. Sonnets have often been written in larger groups, or ‘sequences’ as they’re misleadingly called, which can make these larger ambitions apparent. Groups of sonnets do not form sequences in the sense that the numbers 3,6,9 do: they don’t follow one from another in a predictable order but work together to imply a personal history, or an argument. Individual sonnets within a sequence are not bound to fit in, and sometimes single poems or small groups of poems can suddenly hint at a different version of the story from that which seems to be related in the larger sequence – as the small number of sonnets about the supposed ‘Dark Lady’ in Shakespeare’s sequence appear to do.
This aspect of the sonnet began with Dante’s Vita Nuova of 1295, which mixes verse with prose in order to relate, part as allegory, part as fiction, his love for Beatrice and his sorrow at her death. Dante’s sonnets sometimes recapitulate the surrounding narrative and sometimes widen its emotional scope. Each is followed by a short critical analysis which explains how the poem divides into two or three sections. The sonnets are therefore ‘occasional’, but not in the simple sense of being occasioned by a particular moment. They’re lyric responses to the larger story from which they arise, but they’re also presented as works which might stand on their own: Dante gives each its own title as well as its own commentary. La Vita Nuova prompted Petrarch to write his Rime sparse, which in turn led scores of sonneteers throughout 16th-century Europe and beyond to adore their Lauras, Delias, Stellas and Cassandras. Dante made the sonnet a lyric form in which a whole situation, a life, perhaps even a civilisation could be embedded in 14 lines.
That legacy runs right through the sonnet tradition. It enables an individual sonnet to function as a synecdoche: a single sonnet is visibly small and partial, a mere sonetto, a shapely little suono or ‘sound’, but it is a part which may suggest a larger story. The Petrarchan tradition is often disparaged today (Burt and Mikics have little time for it), but Petrarch took the sonnet a step further than Dante, and not just by getting rid of prose narrative and making the ‘narrative’ links between the poems implicit rather than explicit. The Petrarchan lover represents being in love as an endless state of pining and yearning, but also hopes that he is part of a story (boy inches towards girl, or boy inches towards boy; girl dies, or boy leaves, or scorns the poet). That gives Petrarch’s sonnets a curious temporal status: they’re caught between telling a story and the endless process of loving. And the paradox of loving is that it can be at once a state of being and a particular act that testifies to that state – a declaration of passion, a statement of physical desire. That paradox shapes the Petrarchan tradition, in which the expression of love is an obsessively repeated act which strives to carry an eternity of loving within it. This has a number of consequences for the history of the sonnet. Most Petrarchan poets are afraid of being swamped by repetition, and of replicating Petrarch’s story. The worst do indeed repeat and self-replicate endlessly. But it also means that the Petrarchan love sonnet has a touch of what came to be thought of as the sublime: because it does not quite tell a story it seems always to be gesturing to something beyond itself, a love which is never either fully consummated or revealed, but which is grandly on the edges of vision, glimpsed only in parts, through individual and more or less defective sonnets.
The sonnet fell out of favour in England for around a century after Milton’s death in 1674. It’s probably fruitless to try to explain why this happened, since ‘causes’ in literary history are as chaotically multiple as those that underlie changes in fashion. People get bored of static intensity in short poems as they periodically tire of floral shirts. But there is something about the deliberate provisionality of the sonnet which makes it unimaginable that Alexander Pope should ever have written one, or that Ben Jonson (who wrote only six) would take them seriously. Milton’s abrasive political sonnets, prompting ‘the age to quit their clogs’, which used the form to make an urgent response to both personal and political events, may not have helped the status of the sonnet in the early 18th century either. The revival of the form in the final quarter of the 18th century, though, makes sense in ways that go beyond fashion. At around this time there was a new excitement about Shakespeare’s sonnets, which came to be read as confessional poems (Wordsworth was following a whole generation of commentators when he claimed that ‘With this key/Shakespeare unlock’d his heart’). There was also a growing interest in the sublime and in ruins. This was one of those uncanny moments in literary history when a later age both misreads what’s going on in earlier writing and recognises something in it that appears obvious once you have been taught to see it. Suddenly the sonnet seemed like the perfect vehicle for a small-scale personal meditation on bare ruined choirs, a modest form that could gesture towards sublime emotions.
In her Original Sonnets on Various Subjects (1799), Anna Seward quoted a ‘Mr White’ from the Gentleman’s Magazine in 1786 who said ‘the style of the sonnet should be nervous, and, where the subject will with propriety bear elevation, sublime.’ Mary Robinson (described by Coleridge as ‘a woman of undoubted genius’, but perhaps too full of dim memories of Gray and Milton quite to deserve that praise) duly described the moon as ‘sublimely still, and beautifully pale!’ In ‘On Bamborough Castle’, William Lisle Bowles praises a sublime ruin in lines that shake up Shakespeare, Milton and Wordsworth with a dash of Mary Robinson:
Ye holy Towers that shade the wave-worn steep,
Long may ye rear your aged brows sublime,
Though, hurrying silent by, relentless Time
Assail you, and the winter whirlwind’s sweep!
Bowles, creaking though he now sounds, was a big influence on the sonnets of the major Romantic poets, which were generally written to stand on their own rather than in sequences, and which were often inspired by Miltonic vehemence as well as by a belief that Shakespearean sonnets revealed personal emotion; the love theme tends to drop out or be transformed. So Coleridge’s ‘Work without Hope’ (1825) adapts the commonplace of the Petrarchan tradition that the year renews and birds and bees fall in love while the speaker remains alone and unloved, a theme on which the Earl of Surrey (who had been freshly edited in 1815), among others, had composed variations. Coleridge, though, is not a frustrated lover but a frustrated poet, yearning to produce a larger artwork which lies beyond the scope of the poem and beyond the capacity of the poet: ‘Bloom, o ye amaranths! bloom for whom ye may,/For me ye bloom not!’ The sonnet had arrived as an apologia for a non-existent longer work, or as a testament to broken spiritual energy: as Hopkins put it (perhaps echoing Coleridge, perhaps Surrey), ‘birds build – but not I build.’
Poets continued to build in sonnets’ pretty rooms. Shelley could thunder against his times in a revival of Milton’s political sonnets, but he also wrote the sublime and seemingly fragmentary ‘Ozymandias’, which takes a broken work of art as a miniature token of a larger story about tyranny:
‘My name is Ozymandias, king of kings:
Look on my works, ye Mighty, and despair!’
Nothing beside remains. Round the decay
Of that colossal wreck, boundless and bare
The lone and level sands stretch far away.
A sonnet without a sequence is a part without a whole, and that is one reason ‘Ozymandias’ is so powerful. We see only a part of an artwork, ‘vast and trunkless legs of stone’ and ‘a shatter’d visage’. Around those fragments lie deserts of ‘lone and level sands’. A part can reverberate with the force of a whole, and can convey nostalgia, fear or excitement about the absence of that whole. Keats works in a similar way when he describes the ‘dizzy pain’ elicited by the Elgin Marbles, ‘That mingles Grecian grandeur with the rude/Wasting of old Time – with a billowy main –/A sun – a shadow of a magnitude.’ ‘A shadow of a magnitude’ magnificently evokes a larger structure that isn’t there, and also suggests the curious power of the sonnet to evoke a larger lost form. Rilke’s ‘Archaic Torso of Apollo’, in the same vein, describes a headless statue that ‘holds fast and shines’ even though ‘We never knew his head and all the light/that ripened in his fabled eyes.’ The poem exploits the power that comes from seeing only a part of things; it ends with the abrupt order, ‘You must change your life,’ and the statue seems to generate a surplus of authority by being partly lost. Probably a poem which was not a sonnet could have done the same thing. But the post-Romantic sonnet, with its roots in a poetic of the sublime, and with its buried legacy of sequences that use single poems to articulate a larger story, is particularly well suited to create this kind of shock. A part has lost its whole, but gains from the loss.
This is not an easy or an entirely comfortable legacy, and 20th-century writers of sonnets have sometimes seemed to try too hard either to be like or to differentiate themselves from earlier exponents of the form. In this collection there are those like Tony Harrison who want to hector the sonnet into becoming anti-elitist; there are others like Forrest Gander who are perhaps too keen to see the form as just a set of rules to break. But there are some great recent poems here, several of which manage that distinctive sonnetish trick of describing a small occasion in a way that suggests an obscured larger history. Patrick Kavanagh’s ‘Epic’ turns a boundary dispute between Irish farmers into a Homeric encounter. Geoffrey Hill, the modern master of the sonnet as vehicle for embedded history, reflects on the idea of England in one of the sonnets from ‘An Apology for the Revival of Christian Architecture in England’ (not ‘Loss and Gain’, the one that Burt and Mikics include, but the even finer ‘The Laurel Axe’). The poem describes a single scene, but also alludes to the attitudes which enable one to notice that scene and which also perhaps distort or ‘romanticise’ it:
Platonic England, house of solitudes,
rests in its laurels and its injured stone,
replete with complex fortunes that are gone,
beset by dynasties of moods and clouds.
That quatrain weights every word with time (‘Platonic England’ is an allusion to Coleridge), but also with mythologies which inspire intimacy and mistrust. The line ‘replete with complex fortunes that are gone’ is dazzling, and it suggests why the sonnet is such a good form in which to explore English histories and church histories in particular. It’s not just that Donne and Herbert wrote sonnets; rather the sonnet has become a short space that can be filled with time, ‘replete with complex fortunes that are gone’. A ‘house of solitudes’ (and sonnets are often figured as protective enclosures, a pretty room, a cell) that ‘rests in its laurels’ is not just like a great house surrounded by a laurel hedge: the phrase suggests something resting on its laurels, near to ruin.
Because sonnets tend to imply so much and say so relatively little they have always generated commentary: Dante’s came complete with their author’s notes, Petrarch’s were repeatedly worked over by more or less pedantic editors, and Michelangelo’s were first printed accompanied by a thick layer of neo-Platonist commentary. Shakespeare’s have been picked over at great length by Edmund Malone, Helen Vendler, Stephen Booth and so many others, while the fearsome poet (though no sonneteer) J.H. Prynne has produced a whole volume of commentary on the single Sonnet 94 (‘They that have power to hurt’). Commentary on sonnets is particularly hard to write, because it can end up filling in the gaps which really need to remain gaps if the poem is to retain its power – to imply, as it were, that we need to see and know the whole of the Elgin Marbles in order to understand what Keats is on about when he enthuses over their fragments. Burt and Mikics write two or three pages about each of their poems, and mostly these are clear and patient guides to rhythm and form, allusions, their relations to the lives of their authors. Sometimes they sound a little like patient teachers doing the diligence (or mostly doing the diligence, since there are a couple of howlers: the course of English poetry might have been rather different if the Earl of Surrey had lived to become ‘a proud Elizabethan nobleman’, rather than being executed in 1547, 11 years before Elizabeth’s accession). Often, though (and particularly in the commentaries signed by Burt), they say just the right thing to make their readers turn back to the poems. Since the editors regard the sonnet as ‘a shape where strong emotion might make sense’, they tend to position each poem on an axis that runs fairly smoothly from formalism to autobiography. They are of their age in doing so, but it means that they don’t always recognise how the sonnet can function as a symbolic fragment which obliquely alludes to larger narratives.

June 11, 2010

Poem du jour

First Glance

by BEN WILKINSON

Like that, the sudden hell-bent flap
of a pigeon at the window -
as if livid, bothered

by my lifting some slim volume
from a shelf,
rather than half-trapped,

taking glass for air
and flailing against a trick
of the light as much as itself,

reminds me of that time I saw
what I thought was you
(before I truly knew you)

kissing someone else,
only to find you, minutes later,
strolling up the street I was traipsing down.

June 9, 2010

TLS on Death


My future me

by THOMAS NAGEL

a review of Mark Johnston's SURVIVING DEATH
416pp. Princeton University Press. £24.95 (US $35). 978 0 691 13012 5

If your existence depends on the life of a particular human being, you will vanish when that creature dies: the centre of consciousness that is now reading the TLS will be annihilated, and the universe will close over you. In Surviving Death, an ambitious and quixotic book, Mark Johnston shows a deep understanding of the natural fear of death and rejects a number of traditional religious and philosophical accounts of how we might survive it. He then offers his own explanation of how, even if one assumes a naturalistic world-view, surviving our own biological death might be theoretically possible. But the hope of survival he offers, apart from its philosophical implausibility, is one which neither the author nor his readers have a significant chance of achieving. So the book offers little comfort; but it is stimulating, written with skill and charm, and packed with illuminating philosophical reflection on the question of what we are, and what it is for us to persist over time - on the relations among selves, persons, human beings, bodies and souls.
What is it you care about, if you don't want to die? Not the survival of a particular organism, as such. The survival of that human animal concerns you because it is a condition of your continuing existence; if you could survive its death, then even though you might miss the old jalopy, the worst would be averted. But can we give sense, and perhaps even credence, to this possibility? Most of us can easily imagine waking up on the Day of Judgment, or being reincarnated as someone else, but perhaps that is just a trick of the imagination, a projection of the self that corresponds to no real possibility.
Johnston is moved in this inquiry not only by the pure wish to survive, but by another wish that seems to require survival for its fulfilment: the wish that goodness should be rewarded. Those who are good are not good for the sake of reward, but if great sacrifice in the name of the good is not rewarded, Johnston believes, the importance and even the rationality of goodness are threatened. He concludes, like Kant, that faith in the importance of goodness requires hope of reward in an afterlife. I have no sympathy for this view, because I believe that the reasons to be good are self-sufficient, even if they require sacrifice. But Johnston's conviction leads him to seek a demonstration that death is better for the good than for the bad, and that will be the key to his analysis of survival.
First, however, he has to dispose of the considerable array of alternative theories. These fall into three types: immortality of the soul, resurrection of the body, and psychological continuity. Johnston rejects each of them as a way we might survive death, for different reasons, and his arguments constitute an excellent tour of the territory of theological and philosophical theories of personal identity. He says that there is no evidence against the naturalistic view that nothing but a properly functioning brain is necessary for conscious mental life; in particular, psychical research has turned up no credible evidence for a detachable soul. He argues on subtle metaphysical grounds that even a body just like yours, reassembled by God at the Day of Judgment out of the same atoms that constituted your body shortly before your death, would not be the same body. And he maintains that, though we care about the continuation of our memories and personalities, such psychological continuity alone is not enough to guarantee that a future psychological replica will be you.
Though the issues are far from settled, Johnston makes a good case for the view that none of these three forms of survival is available. His weakest argument is that even if you had an immaterial soul, that would not justify your special concern for its future, since everyone else would also have such a soul - to which the reply is that your soul's future experiences are the only ones that would be yours. Suppose, however, that we concur with Johnston in setting aside these three types of account; what is the alternative?
To decide whether surviving death is possible, we need to know what would make a future experience mine. One of Johnston's important and plausible claims is that we cannot discover this by a priori reflection on what to say about various possible cases, because our concept of personal identity does not work that way. Instead, it operates by "offloading" the conditions of identity on to the real nature of certain actual persisting things - human animals, in our own case - which we reidentify only by their manifest properties. As he puts it: "The idea of offloading can be expressed by means of a motto, 'I don't know what the (non-trivial) sufficient conditions for identity over time are, but I do know a persisting object when I see one'". This phenomenon of offloading is familiar from the case of "natural kinds" like water or gold, whose real essences can't be discovered by a priori reflection on our concepts, but require empirical investigation.
We can offload the criteria for identity over time by referring to what metaphysicians call a substance, that is, "something whose present manifestation determines what it would be to have that very same thing again". Living things are the clearest examples of substances, in virtue of their active disposition to maintain themselves over time. To determine personal identity - identity of the self - we offload on to persisting human beings, a class of living things that we regard as possessing embodied minds. And now that we have learnt about the dependence of mental life more specifically on the operation of the brain, we add that if a brain could be kept alive without its body, it would continue to embody the same mind. This seems to imply that the true conditions of personal identity are determined by how mental life is generated in the brain, and it seems to rule out decisively any possibility that we might survive biological death. I think that is the correct conclusion. Johnston, however, believes he can escape it.
To do so, however, he must dismantle the ordinary idea of the persisting self, an idea he evokes vividly as follows: "The most immediate way in which I am given to myself is as the one at the center of this arena of presence and action". This is the subjective sense of "I", and it is this subjective I for whose interests he has an immediate, absorbing concern, and whose death he finds terrifying. "My sheer desire to survive may feed a desire that Johnston survive, but it is not itself a desire that Johnston survive. It is the desire that there will continue to be someone with the property of being me." The crux of Johnston's argument is that there is no such property - or none that could justify the special future-directed self-concern to which it is supposed to give guidance. The way the world is, independent of our attitudes, does not determine what it would be for this same arena of presence and action to exist at a later time.
Johnston denies (unconvincingly, I believe) that this subjective sameness can be secured by offloading on to the persistence conditions of the particular human being who occupies this arena of presence and action at the present moment. Instead, he claims that the self is a merely intentional object, whose identity is not an objective matter, but depends on what the subject takes it to be. Like the dagger that Macbeth imagines, its re-identification at different times is wholly determined by how the subject sees it. Johnston's relativism about personal identity is a radical inversion of the traditional dependence of your future-directed concerns on your belief about who will be you. He contends that personal identity is "response-dependent": it is the disposition of your future-directed concerns that determines who will be you, instead of the other way around.
To introduce this idea, Johnston imagines three tribes of human animals, whom he calls Hibernators, Teletransporters and Humans. The Hibernators fall into a deep sleep during the winter months, and they do not regard the person who will wake up in the spring in their body, with their memories and personality, as being numerically the same person as they are; they do not believe they survive dreamless sleep. The Teletransporters, on the other hand, are accustomed to superfast travel of the kind familiar from science fiction. They step into a machine that takes a complete reading of the microconstitution of their body, destroys the body and sends the information at the speed of light to a target machine at the destination, where a body physically and mentally indistinguishable from the original is produced from local materials. The Teletransporters believe they survive these trips, and unproblematically regard the person who will step out of the target machine as themselves. Finally, the Humans believe that they survive dreamless sleep and don't believe they would survive teleportation: they wouldn't get into one of those machines for a million pounds. In each case, the conviction is immediate and shows itself in unreflective patterns of special future-directed concern.
Johnston says that the Hibernators, Teletransporters and Humans are all right, each on their own terms. There is no objective fact that could make one belief right and the others wrong. Identity is response-dependent, and it is the disposition to identify with some future person deeply and consistently - to care about what happens to him or her in the first-person way - that constitutes the identity-determining disposition.
And here is the punchline: You can survive the biological death of the human being who is now at the centre of your arena of presence and action, if you develop a disposition of future-directed concern for all of the human beings who will exist after he is gone - if, in other words, you become someone who literally loves your posterity as yourself. But once the independent reality of the self is recognized as an illusion, this becomes the rational attitude to take: "If there is no persisting self worth caring about, the premium or excess that special self-concern expects and rejoices in cannot represent a reasonable demand or expectation . . . . One's own interests are not worth considering because they are one's own but simply because they are interests, and interests, wherever they arise and are legitimate, are equally worthy of consideration".
Therefore agape, the universal love that is the Christian ideal of goodness, brings with it its own reward, for those who can attain it. Persons are protean: a single person may be constituted by one human being, or by a series of human beings, or even by a huge crowd of human beings, "the onward rush of humanity", depending on which interests he is immediately disposed to incorporate into his practical outlook. Johnston's theory vindicates the importance of goodness by making absolute goodness the condition of continued life.
This form of survival through extreme selflessness would require a transformation that is out of the reach of almost everyone, and in any case not subject to the will. Johnston adds, though, that even if we cannot attain this perfect goodness, it is important to transcend natural selfishness and nepotism in a more familiar way, by recognizing that everyone's interests have the same importance as our own. Even if we cannot be truly good, we can become "good enough", not to survive death but to "face death down, to see through it to a pleasing future in which individual personalities flourish . . . . For the utterly selfish, however, the obliteration of their individual personalities is the obliteration of everything of real importance to them". But this is a familiar point, and does not cancel the absoluteness of one's own death.
To accept Johnston's theory that identity is relative, that persons are protean, and that we could survive death by coming to identify with future human beings, would require at least as large a dose of wishful thinking as belief in the immortality of the soul or the resurrection of the body. It seems far more likely that the world, in particular the facts about how the brain sustains the mind, determines what we are, even though those facts are still largely unknown. Johnston's scepticism about a purely mental substance as the carrier of personal identity is reasonable, but the familiar, and alas perishable human animal is harder to dislodge from its decisive control over our fate.

NYT on Marriage



What Brain Scans Can Tell Us About Marriage

By TARA PARKER-POPE

The sudden breakup of Al and Tipper Gore’s seemingly idyllic marriage was the latest and among the sharpest reminders that the only two people who know what’s going on in a marriage are the two people who are in it.
The truth is that most marriages, even our own, are something of a mystery to outsiders. Several years ago, a marriage researcher — Robert W. Levenson, director of the psychophysiology laboratory at the University of California, Berkeley — and his colleagues produced a video of 10 couples talking and bickering. Dr. Levenson knew at the time that five of the couples had been in troubled relationships and eventually divorced. He showed the video to 200 people, including pastors, marriage therapists and relationship scientists, asking them to spot the doomed marriages. They guessed wrong half the time. “People on the outside aren’t very good at telling how marriages are really working,” he said.
Even so, academic researchers have become increasingly fascinated with the inner workings of long-married couples, subjecting them to a battery of laboratory tests and even brain scans to unravel the mystery of lasting love. Bianca Acevedo, a postdoctoral researcher at the University of California, Santa Barbara, studies the neuroscience of relationships and began a search for long-married couples who were still madly in love. Through a phone survey, she collected data on 274 men and women in committed relationships, and used relationship scales to measure marital happiness and passionate love.
Dr. Acevedo expected to find only a small percentage of long-married couples still passionately in love. To her surprise, about 40 percent of them continued to register high on the romance scale. The remaining 60 percent weren’t necessarily unhappy. Many had high levels of relationship satisfaction and were still in love, just not so intensely.
In a separate study, 17 men and women who were passionately in love agreed to undergo scans to determine what lasting romantic love looks like in the brain. The subjects, who had been married an average of about 21 years, viewed a picture of their spouse. As a control, they also viewed photos of two friends.
Compared with the reaction when looking at others, seeing the spouse activated parts of the brain associated with romantic love, much as it did when couples who had just fallen in love took the same test. But in the older couples, researchers spotted something extra: parts of the brain associated with deep attachment were also activated, suggesting that contentment in marriage and passion in marriage aren’t mutually exclusive.
“They have the feelings of euphoria, but also the feelings of calm and security that we feel when we’re attached to somebody,” Dr. Acevedo said. “I think it’s wonderful news.”
So how do these older couples keep the fires burning? Beyond the brain scans, it was clear that these couples remained active in each other’s lives.
“They were still very much in love and engaged in the relationship,” Dr. Acevedo said. “That’s something that seems different from the Gores, who said they had grown apart.”
Indeed, if there is a lesson from the Gore breakup, it’s that with marriage, you’re never done working on it.
“It’s not that you have to be constantly scared about your relationship, but you do have to renew it,” said Stephanie Coontz, a marriage historian at Evergreen State College in Olympia, Wash. “I think the warning we should take from this is not that marriages are doomed, but that you can’t skate indefinitely and be doing different things and not really be paying attention to the marriage itself.”
Research from Stony Brook University in New York suggests that couples who regularly do new and different things together are happier than those who repeat the same old habits. The theory is that new experiences activate the dopamine system and mimic the brain chemistry of early romantic love.
In a new study, the Stony Brook scientists will have couples play either a mundane or an exciting video game together while their brains are being scanned. The goal is to see how sharing a new and challenging experience with a spouse changes the neural activation of the brain.
But for those of us without a brain scanner, there are simple ways to find out if your relationship is growing or vexed by boredom. Among the questions to ask yourself: How much does your partner provide a source of exciting experiences? How much has knowing your partner made you a better person? In the last month, how often did you feel that your marriage was in a rut?
If the answers aren’t exactly what you hoped for, take heart. From a statistical standpoint, your risk for divorce begins to fall once you’ve passed the 10-year mark. According to Betsey Stevenson, an economist at the University of Pennsylvania’s Wharton School, recent Census Bureau data show that only about 4 percent of recently ended marriages involved couples married for 40 years or more.
And it’s worth noting that the Gores married in 1970, at the beginning of a generation of couples that has consistently struggled with marriage more than any other group. Dr. Stevenson calls them the “greatest divorcing generation.”
Lost in the discussion about the Gore divorce is the inherent optimism that the decision represents. Professor Coontz recalls living next door to a couple in their 70s who disliked each other so much that during the summer, they sat outside in lawn chairs on the opposite sides of the house. “I think it’s good that people can go ahead and start over before they get to that level of anger and hostility,” she said.
Dr. Stevenson called the Gore breakup a “glass-half-full story.”
“They had 40 years of marriage, and they had what, by many dimensions, should be considered a successful marriage,” she said. “The fact that they both can look forward and see a promising future by not being married — it’s unfortunate that the answer is ‘yes,’ but it’s also somewhat a celebration about how much optimism they have for the rest of their lives.”

Tara Parker-Pope writes the Well column for The New York Times and is the author of “For Better: The Science of a Good Marriage,” which was released last month by Dutton.



Listening to Bajka




June 8, 2010

The Economist on Birds and Bees


Sex hormones

For the birds

What regulates the lengths of human fingers?


I’ll show you mine if you show me yours

From financial traders’ propensity to make risky decisions to badly behaved schoolboys’ claims to be suffering from attention deficit hyperactivity disorder, testosterone makes a perfect scapegoat. In both of these cases, and others, many researchers reckon that the underlying cause is exposure to too much of that male hormone in the womb. Positive effects are claimed, too. Top-flight female football players and successful male musicians may also have fetal testosterone exposure to thank for their lot in life.

Yet the evidence that it is exposure in utero to testosterone that causes all these things relies on a shaky chain of causation. What these people actually share is a tendency for their ring fingers to be longer than their index fingers. This peculiarity of anatomy is often ascribed to fetal testosterone exposure because it is common in men and much rarer in women, and because there seems to be a correlation between the point in gestation when it appears and surges of testosterone in the womb. But the link has never been proved decisively. It has, rather, just become accepted wisdom.

Research carried out on birds now suggests that the accepted wisdom could be wrong. In their study of the feet of zebra finches published this week in the Proceedings of the Royal Society, Wolfgang Forstmeier and his colleagues at the Max Planck Institute for Ornithology in Seewiesen, Germany, conclude that oestrogen—the hormone of femininity—rather than testosterone, may be to blame.

Although it is well over 300m years since people and finches had a common ancestor, the basic vertebrate body plan is the same in both. So, a few years ago Dr Forstmeier, an expert on finch behaviour, wondered if the link between digit ratio and behaviour might show up in his animals, too.



It did. The ratio between a zebra finch’s second and fourth digits (which are not fingers but toes in birds) is associated with more courtship songs by males and fewer flirtatious hops by females—in other words with more masculine behaviour, regardless of the sex of the individual.

Dr Forstmeier probed the matter further. He has been investigating the birds’ oestrogen and androgen receptors—molecules that respond to female and male hormones, respectively.



Zebra crossing

The receptors in question orchestrate both behavioural and physical development, including some types of bone growth, in many vertebrate species. Different versions of a receptor (encoded by genes that have slightly different DNA sequences) can be more or less sensitive to the appropriate hormone. That led Dr Forstmeier to ask whether the type of hormone receptor a bird has influences its digit ratio, its sexual behaviour or both.

To find out, he looked for correlations between genes, ratios and behaviour in more than 1,100 zebra finches. Surprisingly, in view of the working assumption about humans, the type of testosterone receptor that a bird had proved to be irrelevant. Its oestrogen-receptor variant, however, had a significant impact on both digit ratio and courtship behaviour. This suggests that the sorts of predispositions that in people are blamed on fetal testosterone are caused in birds by fetal oestrogen (or, rather, the response to it).
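In outline, this is a genotype-phenotype association test: group the birds by receptor-gene variant and ask whether digit ratio and courtship behaviour differ across the groups. The sketch below is purely illustrative. It uses synthetic data with invented genotype labels and effect sizes, not the paper's actual data or statistical model, but it shows the general shape of such an analysis in Python:

```python
# Illustrative sketch only: synthetic data standing in for the kind of
# genotype / digit-ratio / behaviour table the finch study analysed.
# Genotype labels, column names and effect sizes are invented.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
n = 1100  # roughly the number of zebra finches in the study

# Hypothetical oestrogen-receptor variant for each bird
genotype = rng.choice(["AA", "AB", "BB"], size=n)
effect = pd.Series(genotype).map({"AA": 0.00, "AB": 0.02, "BB": 0.04})

# Digit ratio (second toe / fourth toe) with a small genotype effect plus noise
digit_ratio = 0.95 + effect + rng.normal(0, 0.03, size=n)

# Courtship-song count, nudged by the same genotype
songs = rng.poisson(5 + 20 * effect)

df = pd.DataFrame({"genotype": genotype,
                   "digit_ratio": digit_ratio,
                   "songs": songs})

# One-way ANOVA: does digit ratio differ across receptor genotypes?
ratio_groups = [g["digit_ratio"].values for _, g in df.groupby("genotype")]
f_stat, p_ratio = stats.f_oneway(*ratio_groups)
print(f"digit ratio vs genotype: F={f_stat:.2f}, p={p_ratio:.3g}")

# Kruskal-Wallis test for the count data: do song counts differ by genotype?
song_groups = [g["songs"].values for _, g in df.groupby("genotype")]
h_stat, p_songs = stats.kruskal(*song_groups)
print(f"songs vs genotype: H={h_stat:.2f}, p={p_songs:.3g}")
```

A significant association for the oestrogen-receptor variant but not the androgen-receptor variant is the pattern the researchers report; the sketch simply shows how such a comparison is set up, not their findings.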

That does not, of course, mean the same thing is true in people: 300m years is quite a long time for differences to emerge. It is also true that the digit-ratio that predicts male-like behaviour in birds is the opposite of the one found in humans (ie, the second digit, rather than the fourth, is the longer of the two). But it does suggest that it would be worth double-checking. Though science likes to think of itself as rational, it is just as prone to fads and assumptions as any other human activity. That, plus the fact that most scientists are men, may have led to some lazy thinking about which hormone is more likely to control gender-related behaviour. Just possibly, the trader’s finger should be pointing at oestrogen, not testosterone.




June 7, 2010

Poem du jour


A Maxim

by Carl Dennis

To live each day as if it might be the last
Is an injunction that Marcus Aurelius
Inscribes in his journal to remind himself
That he, too, however privileged, is mortal,
That whatever bounty is destined to reach him
Has reached him already, many times.
But if you take his maxim too literally
And devote your mornings to tinkering with your will,
Your afternoons and evenings to saying farewell
To friends and family, you’ll come to regret it.
Soon your lawyer won’t fit you into his schedule.
Soon your dear ones will hide in a closet
When they hear your heavy step on the porch.
And then your house will slide into disrepair.
If this is my last day, you’ll say to yourself,
Why waste time sealing drafts in the window frames
Or cleaning gutters or patching the driveway?
If you don’t want your heirs to curse the day
You first opened Marcus’s journals,
Take him simply to mean you should find an hour
Each day to pay a debt or forgive one,
Or write a letter of thanks or apology.
No shame in leaving behind some evidence
You were hoping to live beyond the moment.
No shame in a ticket to a concert seven months off,
Or, better yet, two tickets, as if you were hoping
To meet by then someone who’d love to join you,
Two seats near the front so you catch each note.