Tag Archives: writing

The Force That Drives the Flower – Annie Dillard – The Atlantic

The Force That Drives the Flower – Annie Dillard – The Atlantic.

I wakened myself last night with my own shouting. It must have been that terrible yellow plant I saw pushing through the flood-damp soil near the log by Tinker Creek, the plant as fleshy and featureless as a slug, that erupted through the floor of my brain as I slept, and burgeoned into the dream of fecundity that woke me up.

I was watching two huge luna moths mate. Luna moths are those fragile ghost moths, fairy moths, whose five-inch wings are swallow-tailed, a pastel green bordered in silken lavender. From the hairy head of the male sprouted two enormous, furry antennae that trailed down past his ethereal wings. He was on top of the female, hunching repeatedly with a horrible animal vigor.

It was the perfect picture of utter spirituality and utter degradation. I was fascinated and could not turn away my eyes. By watching them I in effect permitted their mating to take place and so committed myself to accepting the consequences—all because I wanted to see what would happen. I wanted in on a secret.

And then the eggs hatched and the bed was full of fish. I was standing across the room in the doorway, staring at the bed. The eggs hatched before my eyes, on my bed, and a thousand chunky fish swarmed there in a viscid slime. The fish were firm and fat, black and white, with triangular bodies and bulging eyes. I watched in horror as they squirmed three feet deep, swimming and oozing about in the glistening, transparent slime. Fish in the bed!—and I awoke. My ears still rang with the foreign cry that had been my own voice.

Fool, I thought: child, you child, you ignorant, innocent fool. What did you expect to see—angels? For it was understood in the dream that the bed full of fish was my own fault, that if I had turned away from the mating moths the hatching of their eggs wouldn’t have happened, or at least would have happened in secret, elsewhere. I brought it on myself, this slither, this swarm.

I don’t know what it is about fecundity that so appalls. I suppose it is the teeming evidence that birth and growth, which we value, are ubiquitous and blind, that life itself is so astonishingly cheap, that nature is as careless as it is bountiful, and that with extravagance goes a crushing waste that will one day include our own cheap lives. Every glistening egg is a memento mori.

Now, in late June in the Blue Ridge, things are popping outside. Creatures extrude or vent eggs; larvae fatten, split their shells, and eat them; spores dissolve or explode; root hairs multiply, corn puffs on the stalk, grass yields seed, shoots erupt from the earth turgid and sheathed; wet muskrats, rabbits, and squirrels slide into the sunlight, mewling and blind; and everywhere watery cells divide and swell, swell and divide. I can like it and call it birth and regeneration, or I can play the devil’s advocate and call it rank fecundity—and say that it’s hell that’s a-poppin’.

This is what I plan to do. Partly as a result of my terrible dream, I have been thinking that the landscape of the intricate world that I have cherished is inaccurate and lopsided. It’s too optimistic. For the notion of the infinite variety of detail and the multiplicity of forms is a pleasing one; in complexity are the fringes of beauty, and in variety are generosity and exuberance. But all this leaves something vital out of the picture. It is not one monarch butterfly I see, but a thousand. I myself am not one, but legion. And we are all going to die.

In this repetition of individuals is a mindless stutter, an imbecilic fixedness that must be taken into account. The driving force behind all this fecundity is a terrible pressure I also must consider, the pressure of birth and growth, the pressure that squeezes out the egg and bursts the pupa, that hungers and lusts and drives the creature relentlessly toward its own death. Fecundity, then, is what I have been thinking about, fecundity and the pressure of growth. Fecundity is an ugly word for an ugly subject. It is ugly, at least, in the eggy animal world. I don’t think it is for plants.

* * * * *

I never met a man who was shaken by a field of identical blades of grass. An acre of poppies and a forest of spruce boggle no one’s mind. Even ten square miles of wheat gladdens the hearts of most people, although it is really as unnatural and freakish as the Frankenstein monster; if man were to die, I read, wheat wouldn’t survive him more than three years. No, in the plant world, and especially among the flowering plants, fecundity is not an assault on human values. Plants are not our competitors; they are our prey and our nesting materials. We are no more distressed at their proliferation than an owl is at a population explosion among field mice.

After the flood last year I found a big tulip tree limb that had been wind-thrown into Tinker Creek. The current dragged it up on some rocks on the bank, where receding waters stranded it. A month after the flood I discovered that it was growing new leaves. Both ends of the branch were completely exposed and dried. I was amazed. It was like the old fable about the corpse growing a beard; it was as if the woodpile in my garage were suddenly to burst greenly into leaf. The way plants persevere in the bitterest of circumstances is utterly heartening. I can barely keep from unconsciously ascribing a will to these plants, a do-or-die courage, and I have to remind myself that coded cells and mute water pressure have no idea how grandly they are flying in the teeth of it all.

In the lower Bronx, for example, enthusiasts found an ailanthus tree that was fifteen feet long growing from the corner of a garage roof. It was rooted in and living on “dust and roofing cinders.” Even more spectacular is a desert plant, Ibervillea sonorae—a member of the gourd family—that Joseph Wood Krutch describes. If you see this plant in the desert, you see only a dried chunk of loose wood. It has neither roots nor stems; it’s like an old gray knothole. But it is alive. Each year before the rainy season comes, it sends out a few roots and shoots. If the rain arrives, it grows flowers and fruits; these soon wither away, and it reverts to a state as quiet as driftwood.

Well, the New York Botanical Garden put a dried Ibervillea sonorae on display in a glass case. “For seven years,” says Joseph Wood Krutch, “without soil or water, simply lying in the case, it put forth a few anticipatory shoots and then, when no rainy season arrived, dried up again, hoping for better luck next year.” That’s what I call flying in the teeth of it all.

(It’s hard to understand why no one at the New York Botanical Garden had the grace to splash a glass of water on the thing. Then they could say on their display case label, “This is a live plant.” But by the eighth year what they had was a dead plant, which is precisely what it had looked like all along. The sight of it, reinforced by the label, “Dead Ibervillea sonorae,” would have been most melancholy to visitors. I suppose they just threw it away.)

The growth pressure of plants can do an impressive variety of tricks. Bamboo can grow three feet in twenty-four hours, an accomplishment that is capitalized upon, legendarily, in that exquisite Asian torture in which a victim is strapped to a mesh bunk a mere foot above a bed of healthy bamboo plants whose woodlike tips have been sharpened. For the first eight hours he is fine, if jittery; then he starts turning into a colander, by degrees.

Down at the root end of things, blind growth reaches astonishing proportions. So far as I know, only one real experiment has ever been performed to determine the extent and rate of root growth, and when you read the figures, you see why. I have run into various accounts of this experiment, and the only thing they don’t reveal is how many lab assistants were blinded for life.

The experimenters studied a single grass plant, winter rye. They let it grow in a greenhouse for four months; then they gingerly spirited away the soil—under microscopes, I imagine—and counted and measured all the roots and root hairs. In four months the plant had set forth 378 miles of roots—that’s about three miles a day—in 14 million distinct roots. This is mighty impressive, but when they get down to the root hairs, I boggle completely. In those same four months the rye plant created 14 billion root hairs, and those little things placed end to end just about wouldn’t quit. In a single cubic inch of soil, the length of the root hairs totaled 6000 miles.

Other plants use water power to heave the rock earth around as though they were merely shrugging off a silken cape. Rutherford Platt tells about a larch tree whose root had cleft a one-and-a-half-ton boulder and hoisted it a foot into the air. Everyone knows how a sycamore root will buckle a sidewalk, a mushroom will shatter a cement basement floor. But when the first real measurements of this awesome pressure were taken, nobody could believe the figures.

Rutherford Platt tells the story in The Great American Forest, one of the most interesting books ever written:

In 1875, a Massachusetts farmer, curious about the growing power of expanding apples, melons, and squashes, harnessed a squash to a weight-lifting device which had a dial like a grocer’s scale to indicate the pressure exerted by the expanding fruit. As the days passed, he kept piling on counterbalancing weight; he could hardly believe his eyes when he saw his vegetables quietly exerting a lifting force of 5 thousand pounds per square inch. When nobody believed him, he set up exhibits of harnessed squashes and invited the public to come and see. The Annual Report of the Massachusetts Board of Agriculture, 1875, reported: “Many thousands of men, women, and children of all classes of society visited it. Mr. Penlow watched it day and night, making hourly observations; Professor Parker was moved to write a poem about it; Professor Seelye declared that he positively stood in awe of it.”

All this is very jolly. Unless perhaps I were strapped down above a stand of growing, sharpened bamboo, I am unlikely to feel the faintest queasiness either about the growth pressure of plants or their fecundity. Even when the plants get in the way of human “culture,” I don’t mind. When I read how many thousands of dollars a city like New York has to spend to keep underground water pipes free of ailanthus, ginkgo, and sycamore roots, I cannot help but give a little cheer. After all, water pipes are almost always an excellent source of water. In a town where resourcefulness and beating the system are highly prized, these primitive trees can fight city hall and win.

But in the animal world things are different, and human feelings are different. While we’re in New York, consider the cockroaches under the bed and the rats in the early morning clustered on the porch stoop. Apartment houses are hives of swarming roaches. Or again: in one sense you could think of Manhattan’s land as high-rent, high-rise real estate; in another sense you could see it as an enormous breeding ground for rats, acres and acres of rats. I suppose that the cockroaches don’t do as much actual damage as the roots do; nevertheless, the prospect does not please. Fecundity is anathema only in the animal. “Acres and acres of rats” has a suitably chilling ring to it that is decidedly lacking if I say instead “acres and acres of tulips.”

* * * * *

The landscape of earth is dotted and smeared with masses of apparently identical individual animals, from the great Pleistocene herds that blanketed grasslands to the gluey gobs of bacteria that clog the lobes of lungs. The oceanic breeding grounds of pelagic birds are as teeming and cluttered as any human Calcutta. Lemmings blacken the earth and locusts the air. Grunion run thick in the ocean, corals pile on pile, and protozoans explode in a red tide stain. Ants take to the skies in swarms, mayflies hatch by the millions, and molting cicadas coat the trunks of trees. Have you seen the rivers run red and lumpy with salmon?

Consider the ordinary barnacle, the rock barnacle. Inside every one of those millions of hard white cones on the rocks—the kind that bruises your heel as you bruise its head—is of course a creature as alive as you or me. Its business in life is this: when a wave washes over it, it sticks out twelve feathery feeding appendages and filters the plankton for food. As it grows, it sheds its skin like a lobster, enlarges its shell, and reproduces itself without end. The larvae “hatch into the sea in milky clouds.” The barnacles encrusting a single half-mile of shore can leak into the water a million million larvae. How many is that to a human mouthful? In sea water they grow, molt, change shape wildly, and eventually, after several months, settle on the rocks, turn into adults, and build shells. Inside the shells they have to shed their skins. Rachel Carson was always finding the old skins; she reported: “Almost every container of sea water I bring up from the shore is flecked with white, semitransparent objects…. Seen under the microscope, every detail of structure is perfectly represented…. In the little cellophane-like replicas I can count the joints of the appendages; even the bristles, growing at the bases of the joints, seem to have been slipped out of their casings.” All in all, rock barnacles may live four years.

My point about rock barnacles is those million million larvae “in milky clouds” and those shed flecks of skin. Sea water seems suddenly to be but a broth of barnacle bits. Can I fancy that a million million human infants are more real?

I have seen the mantis’ abdomen dribbling out eggs in wet bubbles like tapioca pudding glued to a thorn. I have seen a film of a termite queen as big as my face, dead white and featureless, glistening with slime, throbbing and pulsing out rivers of globular eggs. Termite workers, who looked like tiny dock workers unloading the Queen Mary, licked each egg to prevent mold as fast as it was extruded. The whole world is an incubator for incalculable numbers of eggs, each one coded minutely and ready to burst.

The egg of a parasite chalcid fly, a common small fly, multiplies unassisted, making ever more identical eggs. The female lays a single fertilized egg in the flaccid tissues of its live prey, and that egg divides and divides. As many as 2000 new parasitic flies will hatch to feed on the host’s body with identical hunger. Similarly—only more so—Edwin Way Teale reports that a lone aphid, without a partner, breeding “unmolested” for one year, would produce so many living aphids that, although they are only a tenth of an inch long, together they would extend into space 2500 light-years. Even the average goldfish lays 5000 eggs, which it will eat as fast as it lays, if permitted. The sales manager of Ozark Fisheries in Missouri, which raises commercial goldfish for the likes of me, said, “We produce, measure, and sell our product by the ton.” The intricacy of goldfish and aphids multiplied mindlessly into tons and light-years is more than extravagance; it is holocaust, parody, glut.

The pressure of growth among animals is a kind of terrible hunger. These billions must eat in order to fuel their surge to sexual maturity so that they may pump out more billions of eggs. And what are the fish on the bed going to eat, or hatched mantises in a Mason jar going to eat, but each other? There is a terrible innocence in the benumbed world of the lower animals, reducing life there to a universal chomp. Edwin Way Teale, in The Strange Lives of Familiar Insects—a book I couldn’t live without—describes several occasions of meals mouthed under the pressure of a hunger that knew no bounds.

There is the dragonfly nymph, for instance, which stalks the bottom of the creek and the pond in search of live prey to snare with its hooked, unfolding lip. Dragonfly nymphs are insatiable and mighty. They clasp and devour whole minnows and fat tadpoles. “A dragonfly nymph,” says Teale, “has even been seen climbing up out of the water on a plant to attack a helpless dragonfly emerging, soft and rumpled, from its nymphal skin.” Is this where I draw the line?

It is between mothers and their offspring that these feedings have truly macabre overtones. Look at lacewings. Lacewings are those fragile green creatures with large, transparent wings. The larvae eat enormous numbers of aphids, the adults mate in a fluttering rush of instinct, lay eggs, and die by the millions in the first cold snap of fall. Sometimes, when a female lays her fertile eggs on a green leaf atop a slender stalked thread, she is hungry. She pauses in her laying, turns around, and eats her eggs one by one, then lays some more, and eats them, too.

Anything can happen, and anything does; what’s it all about? Valerie Eliot, T. S. Eliot’s widow, wrote in a letter to the London Times: “My husband, T. S. Eliot, loved to recount how late one evening he stopped a taxi. As he got in the driver said: ‘You’re T. S. Eliot.’ When asked how he knew, he replied: ‘Ah, I’ve got an eye for a celebrity. Only the other evening I picked up Bertrand Russell, and I said to him, “Well, Lord Russell, what’s it all about,” and, do you know, he couldn’t tell me.'” Well, Lord God, asks the delicate, dying lacewing whose mandibles are wet with the juice secreted by her own ovipositor, what’s it all about? (“And, do you know…”)

* * * * *

Although mothers devouring their own offspring is patently the more senseless, somehow the reverse behavior is the more appalling. In the death of the parent in the jaws of its offspring I recognize a universal drama that chance occurrence has merely telescoped, so that I can see all the players at once. Gall gnats, for instance, are common small flies. Sometimes, according to Teale, a gall gnat larva, which does not resemble the adult in the least, and has certainly not mated, nevertheless produces within its body eggs, live eggs, which then hatch within its soft tissues. Sometimes the eggs hatch alive even within the quiescent body of the pupa. The same incredible thing occasionally occurs within the genus Miastor, again to both larvae and pupae. “These eggs hatch within their bodies and the ravenous larvae which emerge immediately begin devouring their parents.” In this case, I know what it’s all about, and I wish I didn’t. The parents die, the next generation lives, ad majorem gloriam, and so it goes.

You are an ichneumon wasp. You mated and your eggs are fertile. If you can’t see a caterpillar on which to lay your eggs, your young will starve. When the eggs hatch, the young will eat any body in which they find themselves, so if you don’t kill them by emitting them broadcast over the landscape, they’ll eat you alive. But if you let them drop over the fields you will probably be dead yourself, of old age, before they even hatch to starve, and the whole show will be over and done, and a wretched one it was. You feel them coming, and coming, and you struggle to rise….

Not that the ichneumon wasp is making any conscious choice. If she were, her dilemma would be truly the stuff of tragedy; Aeschylus need have looked no further than the ichneumon. That is, it would be the stuff of real tragedy if only Aeschylus and I could convince you that the ichneumon is really and truly as alive as we are, and that what happens to it matters. Will you take it on faith?

Here is one last story. It shows that the pressures of growth “gang aft a-gley.” The clothes moth, whose caterpillar eats wool, sometimes goes into a molting frenzy that Teale describes as “curious.” “A curious paradox in molting is the action of a clothes-moth larva with insufficient food. It sometimes goes into a ‘molting frenzy,’ changing its skin repeatedly and getting smaller and smaller with each change.” Smaller and smaller … can you imagine the frenzy? Where shall we send our sweaters? The diminution process could, in imagination, extend to infinity, as the creature frantically shrinks and shrinks and shrinks to the size of a molecule, then an electron, but never can shrink to absolute nothing and end its terrible hunger. I feel like Ezra: “And when I heard this thing, I rent my garment and my mantle, and plucked off the hair of my head and of my beard, and sat down astonied.”

* * * * *

I am not kidding anyone if I pretend that these awesome pressures to eat and breed are wholly mystifying. The million million barnacle larvae in a half-mile of shore water, the rivers of termite eggs, and the light-years of aphids ensure the living presence, in a scarcely concerned world, of ever more rock barnacles, termites, and aphids.

It’s chancy out there. Dog whelks eat rock barnacles, worms invade their shells, shore ice razes them from the rocks and grinds them to a powder. Can you lay aphid eggs faster than chickadees can eat them? Can you find a caterpillar, can you beat the killing frost?

As far as lower animals go, if you lead a simple life you probably face a boring death. Some animals, however, lead such complicated lives that not only do the chances for any one animal’s death at any minute multiply greatly but so also do the varieties of the deaths it might die. The ordained paths of some animals are so rocky they are preposterous. The horsehair worm in the duck pond, for instance, wriggling so serenely near the surface, is the survivor of an impossible series of squeaky escapes. I did a bit of research into the life cycles of these worms, which are shaped exactly like hairs from a horse’s tail, and learned that although scientists are not exactly sure what happens to any one species of them, they think it might go something like this:

You start with long strands of eggs wrapped around vegetation in the duck pond. The eggs hatch, the larvae emerge, and each seeks an aquatic host, say a dragonfly nymph. The larva bores into the nymph’s body, where it feeds and grows and somehow escapes. Then if it doesn’t get eaten, it swims over to the shore where it encysts on submersed plants. This is all fairly improbable, but not impossibly so.

Now the coincidences begin. First, presumably, the water level of the duck pond has to drop. This exposes the vegetation so that the land host organism can get at it without drowning. Horsehair worms have various land hosts, such as crickets, beetles, and grasshoppers. Let’s say ours can only make it if a grasshopper comes along. Fine. But the grasshopper had best hurry, for there is only so much fat stored in the encysted worm, and it might starve. Well, here comes just the right species of grasshopper, and it is obligingly feeding on shore vegetation. Now I have not observed any extensive grazing of grasshoppers on any grassy shores, but obviously it must occur. Bingo, then, the grasshopper just happens to eat the encysted worm.

The cyst bursts. The worm emerges in all its hideous length, up to thirty-six inches, inside the body of the grasshopper, on which it feeds. I presume that the worm must eat enough of its host to stay alive, but not so much that the grasshopper will keel over dead far from water. Entomologists have found tiger beetles dead and dying on the water whose insides were almost perfectly empty except for the white, coiled bodies of horsehair worms. At any rate, now the worm is almost an adult, ready to reproduce. But first it’s got to get out of this grasshopper.

Biologists don’t know what happens next. If at the critical stage the grasshopper is hopping in a sunny meadow away from a duck pond or ditch, which is entirely likely, then the story is over. But say it happens to be feeding near the duck pond. The worm perhaps bores its way out of the grasshopper’s body, or perhaps is excreted. At any rate, there it is on the grass, drying out. Now some biologists have to go so far as to invoke a “heavy rain,” falling from heaven at this fortuitous moment, in order to get the horsehair worm back into the water where it can mate and lay more seemingly doomed eggs. You’d be thin, too.

Other creatures have it just about as easy. A blood fluke starts out as an egg in human feces. If it happens to fall into fresh water, it will live only if it happens to encounter a certain species of snail. It changes in the snail, swims out, and now needs to find a human being in the water in order to bore through his skin. It travels around in the man’s blood, settles down in the blood vessels of his intestine, and turns into a sexually mature blood fluke, either male or female. Now it has to find another fluke, of the opposite sex, which also just happens to have traveled the same circuitous route and landed in the same unfortunate man’s intestinal blood vessels. Other flukes lead similarly improbable lives, some passing through as many as four hosts.

But it is for gooseneck barnacles that I reserve the largest measure of awe. Recently I saw photographs taken by members of the Ra expedition. One showed a glob of tar as big as a softball, jetsam from a larger craft, which Heyerdahl and his crew spotted in the middle of the Atlantic Ocean. The tar had been in the sea for a long time; it was overgrown with gooseneck barnacles. The gooseneck barnacles were entirely incidental, but for me they were the most interesting thing about the whole expedition. How many gooseneck barnacle larvae must be dying out there in the middle of vast oceans for every one that finds a glob of tar to fasten to? You’ve seen gooseneck barnacles washed up on the beach; they grow on old ship’s timber, driftwood, strips of rubber—anything that’s been afloat in the sea long enough. They do not resemble rock barnacles in the least, although the two are closely related. They have pinkish shells extending in a flattened oval from a flexible bit of “gooseneck” tissue that secures them to the substratum.

I have always had a fancy for these creatures, but I’d always assumed that they lived near shores, where chance floating holdfasts are more likely to occur. What are they doing—what are the larvae doing—out there in the middle of the ocean? They drift and perish, or, by some freak accident in a world where anything can happen, they latch and flourish. If I dangled my hand from the deck of the Ra into the sea, could a gooseneck barnacle fasten there? If I gathered a cup of ocean water, would I be holding a score of dying and dead barnacle larvae? Should I throw them a chip? What kind of a world is this, anyway? Why not make fewer barnacle larvae and give them a decent chance? Are we dealing in life, or in death?

* * * * *

I have to look at the landscape of the blue-green world again. Just think: in all the clean, beautiful reaches of the solar system, our planet alone is a blot; our planet alone has death. I have to acknowledge that the sea is a cup of death and the land is a stained altar stone. We the living are survivors huddled on flotsam, living on jetsam. We are escapees. We wake in terror, eat in hunger, sleep with a mouthful of blood.

The faster death goes, the faster evolution goes. If an aphid lays a million eggs, several might survive. Now, my right hand, in all its human cunning, could not make one aphid in a thousand years. But these aphid eggs—which run less than a dime a dozen, which run absolutely free—can make aphids as effortlessly as the sea makes waves. Wonderful things, wasted. It’s a wretched system.

Any three-year-old can see how unsatisfactory and clumsy is this whole business of reproducing and dying by the billions. We have not yet encountered any god who is as merciful as a man who flicks a beetle over on its feet. There is not a people in the world that behaves as badly as praying mantises. But wait, you say, there is no right and wrong in nature; right and wrong is a human concept. Precisely: we are moral creatures, then, in an amoral world. The universe that suckled us is a monster that does not care if we live or die—does not care if it itself grinds to a halt. It is fixed and blind, a robot programmed to kill. We are free and seeing; we can only try to outwit it at every turn to save our skins.

This view requires that a monstrous world running on chance and death, careening blindly from nowhere to nowhere, somehow produced wonderful us. I came from the world, I crawled out of a sea of amino acids, and now I must whirl around and shake my fist at that sea, and cry Shame! If I value anything at all, then I must blindfold my eyes when I near the randomly shaped Swiss Alps. We must as a culture disassemble our telescopes and settle down to backslapping. We little blobs of soft tissue crawling around on this one planet’s skin are right, and the whole universe is wrong.

Or consider the alternative.

Julian of Norwich, the great English anchorite and theologian, cited, in the manner of the prophets, these words from God: “See, I am God: see, I am in all things: see, I never lift my hands off my works, nor ever shall, without end…. How should anything be amiss?” But now not even the simplest and best of us sees things the way Julian did. It seems to us that plenty is amiss. So much is amiss that I must consider the second fork in the road, that creation itself is blamelessly, benevolently askew by its very free nature, and that it is only human feeling that is freakishly amiss. The frog I saw being sucked by a giant water bug had, presumably, a rush of pure feeling for about a second, before its brain turned to broth. I, however, have been sapped by various strong feelings about the incident almost every day for several years.

Do the barnacle larvae care? Does the lacewing who eats her eggs care? If they do not care, then why am I making all this fuss? If I am a freak, then why don’t I hush?

Our excessive emotions are so patently painful and harmful to us as a species that I can hardly believe that they evolved. Other creatures manage to have effective matings and even stable societies without great emotions, and they have a bonus in that they need not ever mourn. (But some higher animals have emotions that we think are similar to ours: dogs, elephants, otters, and the sea mammals mourn their dead. Why do that to an otter? What creator could be so cruel, not to kill otters, but to let them care?) It would seem that emotions are the curse, not death—emotions that appear to have devolved upon a few freaks as a special curse from Malevolence.

All right then. It is our emotions that are amiss. We are freaks, the world is fine, and let us all go have lobotomies to restore us to a natural state. We can leave the library then, go back to the creek lobotomized, and live on its banks as untroubled as any muskrat or reed. You first.

Of the two ridiculous alternatives, I rather favor the second. Although it is true that we are moral creatures in an amoral world, the world’s amorality does not make it a monster. Rather, I am the freak. Perhaps I don’t need a lobotomy, but I could use some calming down, and Tinker Creek is just the place for it. I must go down to the creek again. It is where I belong, although as I become closer to it, my fellows appear more and more freakish, and my home in the library more and more limited. Imperceptibly at first, and now consciously, I shy away from the arts, from the human emotional stew. I read what the men with telescopes and microscopes have to say about the landscape, I read about the polar ice, and I drive myself deeper and deeper into exile from my own kind. But, since I cannot avoid the library altogether—the human culture that taught me to speak in its tongue—I bring human values to the creek, and so save myself from being brutalized.

What I have been after all along is not an explanation but a picture. This is the way the world is, altar and cup, lit by the fire from a star that has only begun to die. My rage and shock at the pain and death of individuals of my kind is the old, old mystery, as old as man, but forever fresh, and completely unanswerable. My reservations about the fecundity and waste of life among other creatures are, however, mere squeamishness. After all, I’m the one having the nightmares. It is true that many of the creatures live and die abominably, but I am not called upon to pass judgment. Nor am I called upon to live in that same way, and those creatures who are are mercifully unconscious.

The picture of fecundity and its excesses and of the pressures of growth and its accidents is of course no different from the picture I have long cherished of the world as an intricate texture of a bizarre variety of forms. Only now the shadows are deeper. Extravagance takes on a sinister, wastrel air, and exuberance blithers. When I added the dimension of time to the landscape of the world, I saw how freedom grew the beauties and horrors from the same live branch. This landscape is the same as that one, with a few more details added, and a different emphasis. Instead of one goldfish swimming in its intricate bowl, I see tons and tons of goldfish laying and eating billions and billions of eggs. The point of all the eggs is of course to make goldfish one by one—nature loves the idea of the individual, if not the individual himself—and the point of a goldfish is pizazz. This is familiar ground. I merely failed to acknowledge that it is death that is spinning the globe.

It is harder to take, but surely it’s been thought about. I cannot really get very exercised over the hideous appearance and habits of some deep-sea jellies and fishes, and I exercise easy. But about the topic of my own death I am decidedly touchy. Nevertheless, the two phenomena are two branches of the same creek, the creek that waters the world. Its source is freedom, and its network of branches is infinite. The graceful mockingbird that falls drinks there and sips in the same drop a beauty that waters its eyes and a death that fledges and flies. The petals of tulips are flaps of the same doomed water that swells and hatches in the ichneumon’s gut.

That something is everywhere and always amiss is part of the very stuff of creation. It is as though each clay form had baked into it, fired into it, a blue streak of nonbeing, a shaded emptiness like a bubble that not only shapes its very structure but that also causes it to list and ultimately explode. We could have planned things more mercifully, perhaps, but our plan would never get off the drawing board until we agreed to the very compromising terms that are the only ones that being offers.

The world has signed a pact with the devil; it had to. It is a covenant to which every thing, even every hydrogen atom, is bound. The terms are clear: if you want to live, you have to die; you cannot have mountains and creeks without space, and space is a beauty married to a blind man. The blind man is Freedom, or Time, and he does not go anywhere without his great dog Death. The world came into being with the signing of the contract. A scientist calls it the Second Law of Thermodynamics. A poet says, “The force that through the green fuse drives the flower/ Drives my green age.” This is what we know. The rest is gravy.

What’s Up With That: Why It’s So Hard to Catch Your Own Typos | Science | WIRED

What’s Up With That: Why It’s So Hard to Catch Your Own Typos | Science | WIRED.

Typos suck. They are saboteurs, undermining your intent, causing your resume to land in the “pass” pile, or providing sustenance for an army of pedantic critics. Frustratingly, they are usually words you know how to spell, but somehow skimmed over in your rounds of editing. If we are our own harshest critics, why do we miss those annoying little details?

[…]

“When you’re writing, you’re trying to convey meaning. It’s a very high-level task,” he [Stafford] said.

As with all high-level tasks, your brain generalizes simple, component parts (like turning letters into words and words into sentences) so it can focus on more complex tasks (like combining sentences into complex ideas). “We don’t catch every detail, we’re not like computers or NSA databases,” said Stafford. “Rather, we take in sensory information and combine it with what we expect, and we extract meaning.” When we’re reading other people’s work, this helps us arrive at meaning faster by using less brain power. When we’re proofreading our own work, we know the meaning we want to convey. Because we expect that meaning to be there, it’s easier for us to miss when parts (or all) of it are absent. The reason we don’t see our own typos is that what we see on the screen is competing with the version that exists in our heads.

[…]

Generalization is the hallmark of all higher-level brain functions. It’s similar to how our brains build maps of familiar places, compiling the sights, smells, and feel of a route. That mental map frees your brain up to think about other things. Sometimes this works against you, like when you accidentally drive to work on your way to a barbecue, because the route to your friend’s house includes a section of your daily commute. We can become blind to details because our brain is operating on instinct. By the time you proofread your own work, your brain already knows the destination.

This explains why your readers are more likely to pick up on your errors. Even if you are using words and concepts that they are also familiar with, their brains are on this journey for the first time, so they are paying more attention to the details along the way and not anticipating the final destination.

But even if familiarization handicaps your ability to pick out mistakes in the long run, we’re actually pretty awesome at catching ourselves in the act. (According to Microsoft, backspace is the third-most used button on the keyboard.) In fact, touch typists—people who can type without looking at their fingers—know they’ve made a mistake even before it shows up on the screen. Their brain is so used to turning thoughts into letters that it alerts them when they make even minor mistakes, like hitting the wrong key or transposing two characters. In a study published earlier this year, Stafford and a colleague covered both the screen and keyboard of typists and monitored their word rate. These “blind” typists slowed down their word rate just before they made a mistake.

Touch typists are working off a subconscious map of the keyboard. As they type, their brains are instinctually preparing for their next move. “But, there’s a lag between the signal to hit the key and the actual hitting of the key,” Stafford said. In that split second, your brain has time to run the signal it sent your finger through a simulation telling it what the correct response will feel like. When it senses an error, it sends a signal to the fingers, slowing them down so they have more time to adjust.

As any typist knows, hitting keys happens too fast to divert a finger when it’s in the process of making a mistake. But Stafford says this evolved from the same mental mechanism that helped our ancestors’ brains make micro-adjustments when they were throwing spears.

Unfortunately, that kind of instinctual feedback doesn’t exist in the editing process. When you’re proofreading, you are trying to trick your brain into pretending that it’s reading the thing for the first time. Stafford suggests that if you want to catch your own errors, you should try to make your work as unfamiliar as possible. Change the font or background color, or print it out and edit by hand. “Once you’ve learned something in a particular way, it’s hard to see the details without changing the visual form,” he said.

Against Editors

Against Editors.

Here is the traditional career track for someone employed in journalism: first, you are a writer. If you hang on, and don’t wash out, and manage not to get laid off, and don’t alienate too many people, at some point you will be promoted to an editor position. It is really a two-step career journey, in the writing world. Writing, then editing. You don’t have to accept a promotion to an editing position, of course. You don’t have to send your kids to college and pay a mortgage, necessarily. If you want to get regular promotions and raises, you will, for the most part, accept the fact that your path takes you away from writing and into editing, in some form. The number of pure writing positions that offer salaries as high as top editing positions is vanishingly small. Most well-paid writers are celebrities in the writing world. That is how few of them there are.

Here is the problem with this career path: writing and editing are two completely different skills. There are good writers who are terrible editors. (Indeed, some of the worst editors are good writers!) There are good editors who lack the creativity and antisocial personality disorders that would make them great writers. This is okay. This is natural. It is thoroughly unremarkable for an industry to have different positions that require different skill sets. The problem in the writing world is that, in order to move up, the writer must stop doing what he did well in the first place and transition into an editing job that he may or may not have any aptitude for. It is impossible to count how many great writers have made the dutiful step up the career ladder to become an editor and forsaken years of great stories that could have been written had they remained writers. Journalism’s two-step career path is a tragedy, because it robs the world of many talented writers, who spend the latter half of their careers in the conceptual muddle of various editing positions.

It is also a farce. The grand traditional print media system—still seen today in top-tier magazines and newspapers—in which each writer’s story is monkeyed with by a succession of ever more senior editors is, on the whole, a waste of time and resources. If you believe that having four editors edit a story produces a better story than having one editor edit a story, I submit that you have the small mind of a middle manager, and should be employed not in journalism but in something more appropriate for your numbers-based outlook on life, like carpet sales. Writing is not a field in which quantity produces quality. Writing is more often an endeavor in which the passion and vision of one person produces a piece of work that must then be defended against an onslaught of competing visions of a series of editors who did not actually write or report the story—but who have some great ideas on how it should be changed.

Go find a story published a few years ago in The New Yorker, perhaps America’s most tightly edited magazine. Give that story to an editor, and tell him it’s a draft. I guarantee you that that editor will take that story—well-polished diamond that it presumably is—and suggest a host of changes. Rewrite the story to the specifications of the new editor. Then take it to another editor, and repeat the process. You will find, once again, that the new editor has changes in mind. If you were a masochist, you could continue this process indefinitely. You would never find an editor who read the story, set down his pencil, and said, “Looks fine. This story is perfect.” This is because editing is an art, not a science. To imagine that more editors will produce a better story is akin to imagining that a song by your favorite band would be better if, after the band finished it, it was remixed by a succession of ten producers, one after the other. Would it be different? Yes. Would it be better? I doubt it. The only thing you can be sure of is that it would not be the song that the actual musicians wanted it to be.

When any industry fills itself with middle managers, those middle managers will quite naturally work to justify their own existence. The less their own existence is inherently necessary, the harder they will work to appear to be necessary. An editor who looks over a story and declares it to be fine is not demonstrating his own necessity. He is therefore placing himself in danger of being seen as unnecessary. Editors, therefore, tend to edit. Whether it is necessary or not.

This is not to say that editing is not a legitimate job. It is. It is also a necessary step in the writing process. But it is not the most important role in the writing process. That would be writing, which any honest editor will tell you is much harder than editing. (An editor who will not admit this is not worth listening to.) Reporting is a difficult chore. Writing is a psychologically agonizing struggle. Editing is not easy, but not as onerous as either of the two tasks that precede it. You would never know that, though, by looking at the relative salaries of the people who do the work.

Good editors are valuable. They are also rare. If we simply kept the good ones and dismissed the bad ones, the ranks of editors would immediately shrink to saner levels. Editors are an important part of writing—a subordinate part. Their role in the industry should be equally subordinate. It is absurd that most writers must choose between a career spent writing and a career that offers raises and promotions. The “new” online media, happily, tends to be less editor-heavy than the big legacy media outlets that have sprouted entire ecosystems of editors and sub-editors over the course of decades. This is partly because the stark economics of online journalism make clear just how wasteful all those extra editors are. To hire a new editor instead of a new writer is to give up actual stories in favor of… some marginal improvements, somewhere, or perhaps nothing at all.

When all of the people in the writing world are dead and gone, the only things that we will leave behind are our stories. Stories are, ultimately, what matter. Stories are what websites and magazines and media “brands” live and die on. Stories come from writers. Writers come first. They shouldn’t be second-class citizens in their own industry.

Welcome to the Virtual City – J.G. Ballard

A city built for speed is a city built for success | science fictional.

Ed Ruscha, Gasoline Stations, 1989.

Shepperton, for what it’s worth, is not suburbia. If it is a suburb of anywhere, it is of London Airport, not London. And that is the clue to my dislike of cities and my admiration for what most people think of as a faceless dead-land of inter-urban sprawl. Hurrying back from Heathrow or a West Country weekend to their ludicrously priced homes in Fulham or Muswell Hill, they carefully avert their gaze from this nightmare terrain of dual carriageways, police cameras, science parks and executive housing, an uncentred realm bereft of civic identity, tradition or human values, a zone fit only for the alienated and footloose, those without past or future.

And that, of course, is exactly what we like about it. We like the fast dual carriageways, the easy access motorways, the limitless parking lots. We like the control-tower architecture, the absence of civic authority, the rapid turnover of friendships and the prosperity filtered through car and appliance purchases. We like roads that lead past airports, we like air-freight offices and rent-a-car forecourts, we like impulse-buy holidays to anywhere that takes our fancy. The triangle formed by the M3 and the M4, enclosing Heathrow and the River Thames, is our zone of possibility, far from the suffocating city politics and self-obsessions of the metropolis (transport, ugh, fares, rents, kerb-side vomit). We are the unenfranchised citizens of the shopping mall and the marina, the internet and cable TV. And we’re in no hurry for you to join us.

J.G. Ballard, “Welcome to the Virtual City”, Tate, Spring 2001, p. 33.

What Does J.G. Ballard Look Like? Part 2: Design Observer

What Does J.G. Ballard Look Like? Part 2: Design Observer.


Peter Klasen, Les Bruits de la Ville (The Noises of the City), acrylic on canvas, 92 x 73 cm, 1966


Peter Klasen, Anatomie du plaisir (Anatomy of Pleasure), acrylic and oil on canvas, 81 x 65 cm, 1964-65


Peter Klasen, Une rencontre a bien eu lieu (A Meeting Took Place), acrylic on canvas, 130 x 162 cm, 1965

Here, I want to consider a German-born artist based in France whose paintings are the most Ballardian I have ever seen. So far as I am aware Peter Klasen has never been discussed previously in relation to Ballard or his writing. There are good reasons for supposing that Ballard was unaware of Klasen’s work and I have found no evidence to suggest that the artist was aware of Ballard, though it remains a possibility. The remarkable overlap in their thinking and practice at a critical moment in the 1960s is a matter of synchronicity, not influence.

[…]

Ballard’s impact on the art world has been a subject of growing interest, which was given an additional spur by his death in 2009. His readily acknowledged debt to Surrealism is already well covered and critical attention has recently moved to his friendship with the artist Eduardo Paolozzi.

[…]

the Gagosian Gallery in London mounted the exhibition “Crash: Homage to J.G. Ballard.” This included artists Ballard is known to have admired — Dalí, De Chirico, Paul Delvaux, Edward Hopper, Ed Ruscha, Francis Bacon, Eduardo Paolozzi, Tacita Dean — as well as artists felt by the curators to share concerns with the writer, including Richard Prince, Jeff Koons, Cindy Sherman, Jake and Dinos Chapman, Douglas Gordon and Damien Hirst. (See the lavish catalogue designed by Graphic Thought Facility.)

[…]

The three paintings shown here are typical of Klasen’s work in the mid-1960s. All of these images utilize a combinatorial system derived from modernist montage of the 1920s. Occasionally Klasen glues images and small objects to the canvas, but just as often he paints the entire “montage” as a seamless unit. The component images are shattered into fragments and here Klasen differs from an American Pop artist such as James Rosenquist whose image quotations are more complete, continuous and celebratory. The resemblance to Richard Hamilton, whose painterly probes of popular culture also fused image-sections into new aesthetic configurations, comes in the way Klasen deploys these fragments across the picture plane, allowing zones of unoccupied space to open up between them. Although traditional commercial Pop iconography sometimes appears (a hotdog, a bowl of food, a lipstick), Klasen’s overriding concern is the equivalence between female body parts drawn from advertising and glamour pictures — lips, eyes, breasts, elbow — and the manufactured or mechanical elements, which include taps, valves, plugs, handles, switches, syringes, steering wheels and car windows. He presents both types of image on equal terms within the painting’s symbiotically organized structure. Several of the same image fragments recur from picture to picture and Klasen’s color-drained image-world becomes a semiotic pressure chamber in which new forms of control (and desire?) subordinate the erotic presence of the female subjects.

In an interview in 2008, Klasen recalled the influence during these years of Jean-Luc Godard’s approach to film-collage, his essayistic abstractions, disruptive inter-titles and anti-cinematic moments of rupture. A graphic montage using sources also favored by Klasen can be seen in a poster from 1966 for Godard’s Two or Three Things I Know About Her about the life of a prostitute in Paris. If Klasen’s pictures are still “sexy” to us, despite their coldness and extreme, disassociating fragmentation, then it’s a violently ultra-modern kind of sexiness.

Now consider this passage from a chapter titled “Notes Towards a Mental Breakdown” in The Atrocity Exhibition (first published with the title “The Death Module” in New Worlds no. 173, July 1967):

Operating Formulae. Gesturing Catherine Austin into the chair beside his desk, Dr Nathan studied the elegant and mysterious advertisements which had appeared that afternoon in copies of Vogue and Paris-Match. In sequence they advertised: (1) The left orbit and zygomatic arch of Marina Oswald. (2) The angle between two walls. (3) A neural interval— a balcony unit on the twenty-seventh floor of the Hilton Hotel, London. (4) A pause in an unreported conversation outside an exhibition of photographs of automobile accidents. (5) The time, 11:47 a.m., June 23rd, 1975. (6) A gesture — a supine forearm extended across a candlewick bedspread. (7) A moment of recognition — a young woman’s buccal pout and dilated eyes.

This is one of Ballard’s celebrated image lists found throughout The Atrocity Exhibition. The items that comprise the “operating formulae” can be seen as a miniature exhibition list, as an extreme form of conceptual montage, and as a forced marriage of apparently unrelated images (a classic Surrealist stratagem), which replicates the scrambled structure of the narratives within each chapter, and the way these non-linear chapters ultimately cohere as a work. At the same time, it would be possible to use Ballard’s image kit as a set of instructions to assemble a montage on paper that might then resemble a painting by Klasen (zygomatic arch, angle between walls, balcony unit, accident photos, forearm, dilated eyes, etc.). What both Ballard and Klasen share at this point in the mid-1960s is a cold, appraising, analytical eye. It’s impossible to tell how they feel about what they show, or to know what they want us to feel, if anything at all. Their findings are disturbing and perhaps even repellent from a humanist perspective, yet the new aesthetic forms they use to embody them are, even today, exciting, provocative and tantalizingly difficult to resolve.


Page from “The Summer Cannibals” by J.G. Ballard, New Worlds no. 186, January 1969
Later published in The Atrocity Exhibition. Design by Nigel Francis

Ballard’s experiments with condensed collage-novels in the late 1950s have received increasing attention and they were shown at the Gagosian Gallery; the “Advertiser’s Announcements” he presented in Ambit from the summer of 1967 appear in the catalogue. A few months earlier, in New Worlds no. 167 (October 1966), Ballard published a series of comments on his new experimental texts, under the title “Notes from Nowhere.” He considers the intersection of three kinds of plane: the world of public events, the immediate personal environment, and the inner world of the psyche. “Where these planes intersect,” he writes, “images are born.” In Ballard’s attempt to locate himself, by calling on “the geometry of my own postures, the time-values contained in this room, the motion-space of highways, staircases, the angles between these walls,” the intersection of planes again suggests Klasen’s surgically precise combinatorial technique. Ballard goes on to propose that it might one day be possible “to represent a novel or short story, with all its images and relationships, simply as a three-dimensional geometric model.” Then, just a few lines later, in a curious unedited moment that seems to express his ambivalence, he says that he is worried that a work of fiction could become “nothing more than a three-dimensional geometric model.”

By the early 1970s, Klasen had severely reduced the number of image fragments and the agitated visual complexity seen in his earlier montages. In a development that actualizes Ballard’s conception of a new kind of three-dimensional fiction, Klasen’s constructions, while still wall-mounted, become fully three-dimensional with projecting pipes and bathroom fittings. The unrelenting hygienic cruelty of this work, its absolute concentration on a few fetishistic motifs to the exclusion of everything else — breasts and basin, waist and switches, lips and bidet — bears comparison with the strange mental journey Ballard would undertake as he worked on Crash, the ultimate statement of his ideas about the sexualization of our relationship with technology. “Nothing is spontaneous, everything is stylized, including human behaviour,” he said in 1970, in an interview with Lynn Barber in Penthouse. “And once you move into this area where everything is stylized, including sexuality, you’re leaving behind any kind of moral or functional relevance.” Also in 1970, in a brief manifesto, reprinted in his latest monograph, Klasen set out his aims:

Play on the dialectic of a photographic reproduction and its pictorial transposition.

Play on the magical and poetic power of an object out of place.

Respond to the aggression of society with another aggression.

Show that beauty is everywhere, in a bathroom, for example.

Demonstrate that a bidet, a washbasin, a switch can exercise the same fascination on the spectator as the mouth, the body of a woman or a racing car.

Return these images and objects to the spectator-consumer, allowing him to react to these object-tableaux and to project his own fantasies onto them.

Stimulate his awareness by providing him with aesthetic and ideological information about himself and the world that surrounds him.

[…]

“Respond to the aggression of society with another aggression”: this is exactly what Ballard had done in The Atrocity Exhibition, responding to what he called the “death of affect” — of ordinary emotional responses to events — by playing it out within the glinting, recursive, multi-planar architecture of his book, returning society’s images to the “spectator-consumer,” with their inherent characteristics pulled to the surface and intensified, as a morally ambiguous invitation to know oneself better. Ballard, too, had found a perverse kind of beauty in this material, which is one reason why his writing of this period continues to exert its extraordinary hold on readers.

[…]

The overlapping concerns of Ballard and Klasen in the mid- to late 1960s represent one of the great might-have-beens of contemporary art and literature, but a belated union is still possible. It’s hard to imagine better images than Klasen’s, ready-made or otherwise, for the covers of future editions of The Atrocity Exhibition and Crash. It’s strange that the French, great admirers of both these books, haven’t cracked this one already.

What I believe, J.G. Ballard

What I believe, J.G. Ballard.

Photos from the Freeway series by Catherine Opie

I believe in the power of the imagination to remake the world, to release the truth within us, to hold back the night, to transcend death, to charm motorways, to ingratiate ourselves with birds, to enlist the confidences of madmen.

I believe in my own obsessions, in the beauty of the car crash, in the peace of the submerged forest, in the excitements of the deserted holiday beach, in the elegance of automobile graveyards, in the mystery of multi-storey car parks, in the poetry of abandoned hotels.

I believe in the forgotten runways of Wake Island, pointing towards the Pacifics of our imaginations.

I believe in the mysterious beauty of Margaret Thatcher, in the arch of her nostrils and the sheen on her lower lip; in the melancholy of wounded Argentine conscripts; in the haunted smiles of filling station personnel; in my dream of Margaret Thatcher caressed by that young Argentine soldier in a forgotten motel watched by a tubercular filling station attendant.

I believe in the beauty of all women, in the treachery of their imaginations, so close to my heart; in the junction of their disenchanted bodies with the enchanted chromium rails of supermarket counters; in their warm tolerance of my perversions.

I believe in the death of tomorrow, in the exhaustion of time, in our search for a new time within the smiles of auto-route waitresses and the tired eyes of air-traffic controllers at out-of-season airports.

I believe in the genital organs of great men and women, in the body postures of Ronald Reagan, Margaret Thatcher and Princess Di, in the sweet odours emanating from their lips as they regard the cameras of the entire world.

I believe in madness, in the truth of the inexplicable, in the common sense of stones, in the lunacy of flowers, in the disease stored up for the human race by the Apollo astronauts.

I believe in nothing.

I believe in Max Ernst, Delvaux, Dali, Titian, Goya, Leonardo, Vermeer, Chirico, Magritte, Redon, Dürer, Tanguy, the Facteur Cheval, the Watts Towers, Böcklin, Francis Bacon, and all the invisible artists within the psychiatric institutions of the planet.

I believe in the impossibility of existence, in the humour of mountains, in the absurdity of electromagnetism, in the farce of geometry, in the cruelty of arithmetic, in the murderous intent of logic.

I believe in adolescent women, in their corruption by their own leg stances, in the purity of their dishevelled bodies, in the traces of their pudenda left in the bathrooms of shabby motels.

I believe in flight, in the beauty of the wing, and in the beauty of everything that has ever flown, in the stone thrown by a small child that carries with it the wisdom of statesmen and midwives.

I believe in the gentleness of the surgeon’s knife, in the limitless geometry of the cinema screen, in the hidden universe within supermarkets, in the loneliness of the sun, in the garrulousness of planets, in the repetitiveness of ourselves, in the inexistence of the universe and the boredom of the atom.

I believe in the light cast by video-recorders in department store windows, in the messianic insights of the radiator grilles of showroom automobiles, in the elegance of the oil stains on the engine nacelles of 747s parked on airport tarmacs.

I believe in the non-existence of the past, in the death of the future, and the infinite possibilities of the present.

I believe in the derangement of the senses: in Rimbaud, William Burroughs, Huysmans, Genet, Celine, Swift, Defoe, Carroll, Coleridge, Kafka.

I believe in the designers of the Pyramids, the Empire State Building, the Berlin Fuehrerbunker, the Wake Island runways.

I believe in the body odours of Princess Di.

I believe in the next five minutes.

I believe in the history of my feet.

I believe in migraines, the boredom of afternoons, the fear of calendars, the treachery of clocks.

I believe in anxiety, psychosis and despair.

I believe in the perversions, in the infatuations with trees, princesses, prime ministers, derelict filling stations (more beautiful than the Taj Mahal), clouds and birds.

I believe in the death of the emotions and the triumph of the imagination.

I believe in Tokyo, Benidorm, La Grande Motte, Wake Island, Eniwetok, Dealey Plaza.

I believe in alcoholism, venereal disease, fever and exhaustion.

I believe in pain.

I believe in despair.

I believe in all children.

I believe in maps, diagrams, codes, chess-games, puzzles, airline timetables, airport indicator signs.

I believe all excuses.

I believe all reasons.

I believe all hallucinations.

I believe all anger.

I believe all mythologies, memories, lies, fantasies, evasions.

I believe in the mystery and melancholy of a hand, in the kindness of trees, in the wisdom of light.

“Tip-of-the-Tongue Syndrome,” Transactive Memory, and How the Internet Is Making Us Smarter | Brain Pickings

“Tip-of-the-Tongue Syndrome,” Transactive Memory, and How the Internet Is Making Us Smarter | Brain Pickings.

Vannevar Bush’s ‘memex’ — short for ‘memory index’ — a primitive vision for a personal hard drive for information storage and management.

“At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive. This book is about the transformation.”

[…]

One of Clive Thompson’s most fascinating and important points has to do with our outsourcing of memory — or, more specifically, our increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with the on-demand access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you’re trying to recall the name of a movie but only remember certain fragmentary features — the name of the lead actor, the gist of the plot, a song from the soundtrack. Thompson calls this “tip-of-the-tongue syndrome” and points out that, today, you’ll likely be able to reverse-engineer the name of the movie you don’t remember by plugging into Google what you do remember about it.

[…]

“Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.”

[…]

“Writing — the original technology for externalizing information — emerged around five thousand years ago, when Mesopotamian merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what “wealthy people” do; then teenagers take over and the technology becomes common to the point of banality.”

Thompson reminds us of the anecdote, by now itself familiar “to the point of banality,” about Socrates and his admonition that the “technology” of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because “knowledge stored was not really knowledge at all.” He cites Socrates’s parable of the Egyptian god Theuth, who invented writing and offered it as a gift to the king of Egypt, only to be told:

“This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg’s revolution, but it wasn’t without counter-resistance: Those who recorded their knowledge in writing and, eventually, collected it in the form of books argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote memorization made no guarantees of deeper understanding.

Ultimately, however, Thompson points out that Socrates was both right and wrong: It’s true that, with some deliberately cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond what our own heads can hold — because, as Amanda Palmer poignantly put it, “we can only connect the dots that we collect,” and the outsourcing of memory has exponentially enlarged our dot-collections.

With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is critical:

“If you are going to read widely but often read books only once; if you are going to tackle the ever-expanding universe of ideas by skimming and glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you’ll absorb the gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a quote, an article, a concept.”

This, he argues, is also how and why libraries were born — the death of the purely oral world and the proliferation of print after Gutenberg placed new demands on organizing and storing human knowledge. And yet storage and organization soon proved to be radically different things:

“The Gutenberg book explosion certainly increased the number of books that libraries acquired, but librarians had no agreed-upon ways to organize them. It was left to the idiosyncrasies of each. A core job of the librarian was thus simply to find the book each patron requested, since nobody else knew where the heck the books were. This created a bottleneck in access to books, one that grew insufferable in the nineteenth century as citizens began swarming into public venues like the British Library. “Complaints about the delays in the delivery of books to readers increased,” as Matthew Battles writes in Library: An Unquiet History, “as did comments about the brusqueness of the staff.” Some patrons were so annoyed by the glacial pace of access that they simply stole books; one was even sentenced to twelve months in prison for the crime. You can understand their frustration. The slow speed was not just a physical nuisance, but a cognitive one.”

The solution came in the late 19th century by way of Melville Dewey, whose decimal system imposed order by creating a taxonomy of book placement, eventually rendering librarians unnecessary — at least in their role as literal book-retrievers. They became, instead, curiosity sherpas who helped patrons decide what to read and carry out comprehensive research. In many ways, they came to resemble the editors and curators who help us navigate the internet today, framing for us what is worth attending to and why.

[…]

“The history of factual memory has been fairly predictable up until now. With each innovation, we’ve outsourced more information, then worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric question] on Google seems different from looking up a bit of trivia in an encyclopedia. It’s less like consulting a book than like asking someone a question, consulting a supersmart friend who lurks within our phones.”

And therein lies the magic of the internet — that unprecedented access to humanity’s collective brain. Thompson cites the work of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:

“Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed.

[…]

Wegner called this phenomenon “transactive” memory: two heads are better than one. We share the work of remembering, Wegner argued, because it makes us collectively smarter — expanding our ability to understand the world around us.”

[…]

This very outsourcing of memory requires that we learn what the machine knows — a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on the findings of the psychologist Betsy Sparrow, Thompson points out that this is neither new nor negative:

“We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone.”

[…]

Outsourcing our memory to machines rather than to other humans, in fact, offers certain advantages by pulling us into a seemingly infinite rabbit hole of indiscriminate discovery:

“In some ways, machines make for better transactive memory buddies than humans. They know more, but they’re not awkward about pushing it in our faces. When you search the Web, you get your answer — but you also get much more. Consider this: If I’m trying to remember what part of Pakistan has experienced many U.S. drone strikes and I ask a colleague who follows foreign affairs, he’ll tell me “Waziristan.” But when I queried this once on the Internet, I got the Wikipedia page on “Drone attacks in Pakistan.” A chart caught my eye showing the astonishing increase of drone attacks (from 1 a year to 122 a year); then I glanced down to read a précis of studies on how Waziristan residents feel about being bombed. (One report suggested they weren’t as opposed as I’d expected, because many hated the Taliban, too.) Obviously, I was procrastinating. But I was also learning more, reinforcing my schematic understanding of Pakistan.”

[…]

“The real challenge of using machines for transactive memory lies in the inscrutability of their mechanics. Transactive memory works best when you have a sense of how your partners’ minds work — where they’re strong, where they’re weak, where their biases lie. I can judge that for people close to me. But it’s harder with digital tools, particularly search engines. You can certainly learn how they work and develop a mental model of Google’s biases. … But search companies are for-profit firms. They guard their algorithms like crown jewels. This makes them different from previous forms of outboard memory. A public library keeps no intentional secrets about its mechanisms; a search engine keeps many. On top of this inscrutability, it’s hard to know what to trust in a world of self-publishing. To rely on networked digital knowledge, you need to look with skeptical eyes. It’s a skill that should be taught with the same urgency we devote to teaching math and writing.”

Thompson’s most important point, however, has to do with how outsourcing our knowledge to digital tools can hamper the very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources into new combinations, or what the French polymath Henri Poincaré famously termed “sudden illuminations.” Without a mental catalog of materials to mull over and let incubate in our fringe consciousness, our capacity for such illuminations is greatly diminished. Thompson writes:

“These eureka moments are familiar to all of us; they’re why we take a shower or go for a walk when we’re stuck on a problem. But this technique works only if we’ve actually got a lot of knowledge about the problem stored in our brains through long study and focus. … You can’t come to a moment of creative insight if you haven’t got any mental fuel. You can’t be googling the info; it’s got to be inside you.”

[…]

“Evidence suggests that when it comes to knowledge we’re interested in — anything that truly excites us and has meaning — we don’t turn off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless strings of information, which offer little purchase on the mind. … It makes sense that our transactive brains would hand this stuff off to machines. But when information engages us — when we really care about a subject — the evidence suggests we don’t turn off our memory at all.”

[…]

“In an ideal world, we’d all fit the Renaissance model — we’d be curious about everything, filled with diverse knowledge and thus absorbing all current events and culture like sponges. But this battle is age-old, because it’s ultimately not just technological. It’s cultural and moral and spiritual; “getting young people to care about the hard stuff” is a struggle that goes back centuries and requires constant societal arguments and work. It’s not that our media and technological environment don’t matter, of course. But the vintage of this problem indicates that the solution isn’t merely in the media environment either.”

[…]

“A tool’s most transformative uses generally take us by surprise.”

[…]

“How should you respond when you get powerful new tools for finding answers?

Think of harder questions.”

Secrets of the Stacks — Medium

Secrets of the Stacks — Medium.

Choosing books for a library like mine in New York is a full-time job. The head of acquisitions at the Society Library, Steven McGuirl, reads Publishers Weekly, Library Journal, The Times Literary Supplement, The New Yorker, The New York Review of Books, the London Review of Books, The London Times, and The New York Times to decide which fiction should be ordered. Fiction accounts for fully a quarter of the forty-eight hundred books the library acquires each year. There are standing orders for certain novelists—Martin Amis, Zadie Smith, Toni Morrison, for example. Some popular writers merit standing orders for more than one copy.

But first novels and collections of stories present a problem. McGuirl and his two assistants try to guess what the members of the library will want to read. Of course, they respond to members’ requests. If a book is requested by three people, the staff orders it. There’s also a committee of members that meets monthly to recommend books for purchase. The committee checks on the librarians’ lists and suggests titles they’ve missed. The whole enterprise balances enthusiasm and skepticism.

Boosted by reviews, prizes, large sales, word of mouth, or personal recommendations, a novel may make its way onto the library shelf, but even then it is not guaranteed a chance of being read by future generations. Libraries are constantly getting rid of books they have acquired. They have to, or they would run out of space. The polite word for this is “deaccession,” the usual word, “weeding.” I asked a friend who works for a small public library how they choose books to get rid of. Is there a formula? Who makes the decision, a person or a committee? She told me that there was a formula based on the recommendations of the industry-standard CREW manual.

CREW stands for Continuous Review Evaluation and Weeding, and the manual uses “crew” as a transitive verb, so one can talk about a library’s “crewing” its collection. It means weeding but doesn’t sound so harsh. At the heart of the CREW method is a formula consisting of three factors—the number of years since the last copyright, the number of years since the book was last checked out, and a collection of six negative factors given the acronym MUSTIE, to help decide if a book has outlived its usefulness. M. Is it Misleading or inaccurate? Is its information, as so quickly happens with medical and legal texts or travel books, for example, outdated? U. Is it Ugly? Worn beyond repair? S. Has it been Superseded by a new edition or a better account of the subject? T. Is it Trivial, of no discernible literary or scientific merit? I. Is it Irrelevant to the needs and interests of the community the library serves? E. Can it be found Elsewhere, through interlibrary loan or on the Web?

Obviously, not all the MUSTIE factors are relevant in evaluating fiction, notably Misleading and Superseded. Nor is the copyright date important. For nonfiction, the CREW formula might be 8/3/MUSTIE, which would mean “Consider a book for elimination if it is eight years since the copyright date and three years since it has been checked out and if one or more of the MUSTIE factors obtains.” But for fiction the formula is often X/2/MUSTIE, meaning the copyright date doesn’t matter, but consider a book for elimination if it hasn’t been checked out in two years and if it is TUIE—Trivial, Ugly, Irrelevant, or Elsewhere.
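To make the shorthand concrete, here is a minimal sketch in Python of how a rule like 8/3/MUSTIE or X/2/MUSTIE might be applied. The function name, field choices, and sample calls are illustrative assumptions for this post, not the CREW manual’s own procedure or data model.

```python
from datetime import date

# Illustrative only: a rough encoding of an age/idle/MUSTIE weeding rule
# of the kind described above. Field names and examples are invented.
MUSTIE = {"misleading", "ugly", "superseded", "trivial", "irrelevant", "elsewhere"}

def weeding_candidate(copyright_year, last_checkout_year, factors,
                      max_age=8, max_idle=3, ignore_copyright=False):
    """True if a book meets an 8/3/MUSTIE-style rule: old enough (unless the
    copyright test is waived, as for fiction), idle long enough, and flagged
    with at least one MUSTIE factor."""
    this_year = date.today().year
    old_enough = ignore_copyright or (this_year - copyright_year >= max_age)
    idle_enough = this_year - last_checkout_year >= max_idle
    return old_enough and idle_enough and bool(MUSTIE & set(factors))

# Nonfiction under 8/3/MUSTIE: a dated legal guide, unread since 2018.
print(weeding_candidate(2009, 2018, {"misleading"}))
# Fiction under X/2/MUSTIE: copyright age waived; only idle time and MUSTIE flags count.
print(weeding_candidate(1955, 2024, {"trivial"}, max_idle=2, ignore_copyright=True))
```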

[…]

People who feel strongly about retaining books in libraries have a simple way to combat the removal of treasured volumes. Since every system of elimination is based, no matter what they say, on circulation counts, the number of years that have elapsed since a book was last checked out, or the number of times it has been checked out overall, if you feel strongly about a book, you should go to every library you have access to and check out the volume you care about. Take it home awhile. Read it or don’t. Keep it beside you as you read the same book on a Kindle, Nook, or iPad. Let it breathe the air of your home, and then take it back to the library, knowing you have fought the guerrilla war for physical books.

[…]

So many factors affect a novel’s chances of surviving, to say nothing of its becoming one of the immortal works we call a classic: how a book is initially reviewed, whether it sells, whether people continue to read it, whether it is taught in schools, whether it is included in college curricula, what literary critics say about it later, how it responds to various political currents as time moves on.

[…]

De Rerum Natura, lost for fifteen hundred years, was found and its merit recognized. But how many other works of antiquity were not found? How many works from past centuries never got published or, published, were never read?

If you want to see how slippery a judgment is “literary merit” and how unlikely quality is to be recognized at first glance, nothing is more fun—or more comforting to writers—than to read rejection letters or terrible reviews of books that have gone on to prove indispensable to the culture. This, for example, is how the New York Times reviewer greeted Lolita: “Lolita . . . is undeniably news in the world of books. Unfortunately, it is bad news. There are two equally serious reasons why it isn’t worth any adult reader’s attention. The first is that it is dull, dull, dull in a pretentious, florid and archly fatuous fashion. The second is that it is repulsive.”

Negative reviews are fun to write and fun to read, but the world doesn’t need them, since the average work of literary fiction is, in Laura Miller’s words, “invisible to the average reader.” It appears and vanishes from the scene largely unnoticed and unremarked.

[…]

Whether reviews are positive or negative, the attention they bring to a book is rarely sufficient, and it is becoming harder and harder for a novel to lift itself from obscurity. In the succinct and elegant words of James Gleick, “The merchandise of the information economy is not information; it is attention. These commodities have an inverse relationship. When information is cheap, attention becomes expensive.” These days, besides writing, novelists must help draw attention to what they write, tweeting, friending, blogging, and generating meta tags—unacknowledged legislators to Shelley, but now more like unpaid publicists.

On the Web, everyone can be a reviewer, and a consensus about a book can be established covering a range of readers potentially as different as Laura Miller’s cousins and the members of the French Academy. In this changed environment, professional reviewers may become obsolete, replaced by crowd wisdom. More than two centuries ago, Samuel Johnson invented the idea of crowd wisdom as applied to literature, calling it “the common reader.” “I rejoice to concur with the common reader; for by the common sense of readers, uncorrupted by literary prejudices, after all the refinements of subtilty and the dogmatism of learning, must be finally decided all claim to poetical honours.” Virginia Woolf agreed and titled her wonderful collection of essays on literature The Common Reader.

[…]

The Common Reader, however, is not one person. It is a statistical average, the mean between this reader’s one star for One God Clapping and twenty other readers’ enthusiasm for this book, the autobiography of a “Zen rabbi,” producing a four-star rating. What the rating says to me is that if I were the kind of person who wanted to read the autobiography of a Zen rabbi, I’d be very likely to enjoy it. That Amazon reviewers are a self-selected group needs underlining. If you are like Laura Miller’s cousins who have never heard of Jonathan Franzen, you will be unlikely to read Freedom, and even less likely to review it. If you read everything that John Grisham has ever written, you will probably read his latest novel and might even report on it. If you read Lolita, it’s either because you’ve heard it’s one of the great novels of the twentieth century or because you’ve heard it’s a dirty book. Whatever brings you to it, you are likely to enjoy it. Four and a half stars.

The idea of the wisdom of crowds, popularized by James Surowiecki, dates to 1906, when the English statistician Francis Galton (Darwin’s cousin) focused on a contest at a county fair for guessing the weight of an ox. For sixpence, a person could buy a ticket, fill in his name, and guess the weight of the animal after butchering. The person whose guess was closest to the actual weight of the ox won a prize. Galton, having the kind of mind he did, played around with the numbers he gathered from this contest and discovered that the average of all the guesses was only one pound off from the actual weight of the ox, 1,198 pounds. If you’re looking for the Common Reader’s response to a novel, you can’t take any one review as truth but merely as a passionate assertion of one point of view, one person’s guess at the weight of the ox.
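Galton’s arithmetic is easy to replay in miniature. The sketch below is a toy simulation in Python, not his data: it invents a spread of noisy, independent guesses around the cited weight and shows that their average lands much closer to the truth than a typical individual guess does.

```python
import random

random.seed(1906)
TRUE_WEIGHT = 1198  # pounds, the figure cited above
# Invented guesses: each fair-goer is off by a random amount, some wildly so.
guesses = [TRUE_WEIGHT + random.gauss(0, 75) for _ in range(800)]

crowd_average = sum(guesses) / len(guesses)
typical_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"crowd average:            {crowd_average:.0f} lb")
print(f"typical individual error: {typical_individual_error:.0f} lb")
```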

“I really enjoy reading this novel it makes you think about a sex offender’s mind. I’m also happy that I purchased this novel on Amazon because I was able to find it easily with a suitable price for me.”

“Vladimir has a way with words. The prose in this book is simply remarkable.”

“Overrated and pretentious. Overly flowery language encapsulating an uninteresting and overdone plot. Older man and pre-adolescent hypersexual woman—please let’s not exaggerate the originality of that concept, it has existed for millennia now. In fact, you’ll find similar stories in every chapter of the Bible.”

“Like many other folk I read Lolita when it first came out. I was a normally-sexed man and I found it excitingly erotic. Now, nearing 80, I still felt the erotic thrill but was more open to the beauty of Nabokov’s prose.”

“Presenting the story from Humbert’s self-serving viewpoint was Nabokov’s peculiarly brilliant means by which a straight, non-perverted reader is taken to secret places she/he might otherwise dare not go.”

“A man who was ‘hip’ while maintaining a bemused detachment from trendiness, what would he have made of shopping malls? Political correctness? Cable television? Alternative music? The Internet? . . . Or some of this decade’s greatest scandals, near-Nabokovian events in themselves, like Joey Buttafuoco, Lorena Bobbitt, O. J. Simpson, Bill and Monica? Wherever he is (Heaven, Hell, Nirvana, Anti-Terra), I would like to thank Nabokov for providing us with a compelling and unique model of how to read, write, and perceive life.”

What would the hip, bemused author of Lolita have made of Amazon ratings? I like to think that he would have reveled in them as evidence of the cheerful self-assurance, the lunatic democracy of his adopted culture.

“Once a populist gimmick, the reviews are vital to make sure a new product is not lost in the digital wilderness,” the Times reports.

Amazon’s own gatekeepers have removed thousands of reviews from its site in an attempt to curb what has become widespread manipulation of its ratings. They eliminated some reviews by family members and people considered too biased to be entitled to an opinion, competing writers, for example. They did not, however, eliminate reviews by people who admit they have not read the book. “We do not require people to have experienced the product in order to review,” said an Amazon spokesman.