Faster horses – Analog Senses

There’s a great quote that is often attributed to Henry Ford, the man who revolutionized the automobile industry with the introduction of the Model T in 1908. You’ve probably heard it before:

If I had asked my customers what they wanted they would have said a faster horse.

Whether or not Ford ever actually said those words, they are wise ones, and they apply to a great many things beyond cars. The gist is that consumers largely judge new products by comparing them to their existing competitors. That’s how we instinctively know if something is better. However, what happens when an entirely new product comes along? What happens when there are no real competitors?

When there’s no reference, there’s no objective way to quantify how good —or bad— a product is. As a last resort, people will still try to compare it to the closest thing they can think of, even if the comparison doesn’t really work. That can be a dangerous thing, but it can also be an opportunity.

The main lesson behind Ford’s words is that, if you aim to create a revolution, you must be willing to part with the existing preconceptions that are holding your competitors back. Only then will you be able to take a meaningful leap forward. That will surely attract some criticism in the beginning, but once the product manages to stand on its own, people will see it for what it really is.

The tech world is largely governed by that rule. It’s what we now call disruption. Apple, in particular, is famous for anticipating what people need before they even know it, disrupting entire markets. That’s arguably the main reason behind their massive success during the past decade.

In retrospect, Apple products are often seen as revolutionary, but only after they’ve gained a foothold in the market and, more importantly, in our collective consciousness. Only then do people start seeing them for the revolutionary devices they always were. At the time of their announcement, though, they tend to face strong criticism from people who don’t really understand them. Apple products are usually not terribly concerned with conforming to the status quo; in fact, more often than not they’re actively trying to disrupt it. And that drives some people nuts.

It happened with the iPod:

No wireless. Less space than a Nomad. Lame.

It happened with the iPhone:

That is the most expensive phone in the world and it doesn’t appeal to business customers because it doesn’t have a keyboard, which makes it not a very good email machine…

It also happened with the iPad:

It’s just a big iPod touch.

There’s another example that’s particularly telling. During the last episode of The Talk Show, John Gruber and Ben Thompson reminded me of the public criticism that the original iPhone faced when Apple announced it. Much of that criticism was focused on its non-removable battery, a first in the mobile phone industry at the time. Back then, many people were used to carrying a spare battery in case their phone happened to die midday. Once the iPhone arrived and people couldn’t swap batteries anymore, they became angry. The iPhone didn’t conform to what they already knew, and they didn’t like it.

But the iPhone was never a horse.

Seven years later, swappable batteries are no longer a thing, and hardly anyone remembers them. Some people may think of them as a nice-to-have, and others prefer to carry an extra battery pack, but for the most part, battery swappability is not a factor driving smartphone sales.

Was it ever really a big deal?

Of course not. Swappable batteries were never a feature, they were merely a way to deal with the technological shortcomings of the time. Apple knew that if they managed to get a full day’s worth of use out of the iPhone’s battery, there wouldn’t be a need for it to be removable anymore, and they trusted people to eventually understand and accept that. It was a gamble, but history has shown that they were right.

The same thing happened with MacBooks a few years ago, but by then, Apple’s solution had already proven to be the right one. Indeed, it seems a bit silly to complain about a non-removable battery when your laptop gets 12 hours of battery life.

And yet, no matter how many times Apple has been right in the past, people keep finding reasons to complain about their new products. The Apple Watch, of course, is no different:

Apple Watch is ugly and boring (and Steve Jobs would have agreed).

It’s not even a finished product, and some people are already slamming it. And it’s only going to get worse.

People don’t like what they don’t understand, and so far, nobody understands the Apple Watch. I’m not even sure anybody can; we just don’t know enough about it at this point. In the absence of a valid reference, many are sure to dismiss it as either irrelevant or flawed, simply because it doesn’t conform to their own existing preconceptions. Because, like the iPhone, the Apple Watch is not a horse either.

That’s a very human response, deeply rooted in our nature. It’s actually uncontrollable, to a degree. We’ve been evolutionarily conditioned to be wary of the unknown, because there was a time, not so long ago, when our very survival depended on it. However, given that we’re not fighting smilodons for food anymore, perhaps we should at least try to keep an open mind about things. Especially shiny things that cost hundreds —or thousands— of dollars and have the potential to disrupt our entire lives and redefine the way we communicate with each other.

I’m not saying that you should like the Apple Watch. I’m certainly not saying you should buy one. I’m just saying, it can’t hurt to give it the benefit of the doubt. There’s so much to gain and so little to lose.

The Apple Watch is not a faster horse, but who knows? It just may end up being your favorite thing.

“the moment seizes us” – Boyhood

Boyhood is a 2014 American coming-of-age drama film written and directed by Richard Linklater and starring Patricia Arquette, Ellar Coltrane, Lorelei Linklater, and Ethan Hawke. The film was shot intermittently over an eleven-year period, from May 2002 to October 2013, showing the growth of a young boy and his sister to adulthood.

http://en.wikipedia.org/wiki/Boyhood_(film)

What’s Up With That: Why It’s So Hard to Catch Your Own Typos | Science | WIRED

Typos suck. They are saboteurs, undermining your intent, causing your resume to land in the “pass” pile, or providing sustenance for an army of pedantic critics. Frustratingly, they are usually words you know how to spell, but somehow skimmed over in your rounds of editing. If we are our own harshest critics, why do we miss those annoying little details?

[…]

“When you’re writing, you’re trying to convey meaning. It’s a very high level task,” he said.

As with all high-level tasks, your brain generalizes simple, component parts (like turning letters into words and words into sentences) so it can focus on more complex tasks (like combining sentences into complex ideas). “We don’t catch every detail, we’re not like computers or NSA databases,” said Stafford. “Rather, we take in sensory information and combine it with what we expect, and we extract meaning.” When we’re reading other people’s work, this helps us arrive at meaning faster by using less brain power. When we’re proofreading our own work, we know the meaning we want to convey. Because we expect that meaning to be there, it’s easier for us to miss when parts (or all) of it are absent. The reason we don’t see our own typos is because what we see on the screen is competing with the version that exists in our heads.

[…]

Generalization is the hallmark of all higher-level brain functions. It’s similar to how our brains build maps of familiar places, compiling the sights, smells, and feel of a route. That mental map frees your brain up to think about other things. Sometimes this works against you, like when you accidentally drive to work on your way to a barbecue, because the route to your friend’s house includes a section of your daily commute. We can become blind to details because our brain is operating on instinct. By the time you proofread your own work, your brain already knows the destination.

This explains why your readers are more likely to pick up on your errors. Even if you are using words and concepts that they are also familiar with, their brains are on this journey for the first time, so they are paying more attention to the details along the way and not anticipating the final destination.

But even if familiarization handicaps your ability to pick out mistakes in the long run, we’re actually pretty awesome at catching ourselves in the act. (According to Microsoft, backspace is the third-most used button on the keyboard.) In fact, touch typists—people who can type without looking at their fingers—know they’ve made a mistake even before it shows up on the screen. Their brain is so used to turning thoughts into letters that it alerts them when they make even minor mistakes, like hitting the wrong key or transposing two characters. In a study published earlier this year, Stafford and a colleague covered both the screen and keyboard of typists and monitored their word rate. These “blind” typists slowed down their word rate just before they made a mistake.

Touch typists are working off a subconscious map of the keyboard. As they type, their brains are instinctually preparing for their next move. “But, there’s a lag between the signal to hit the key and the actual hitting of the key,” Stafford said. In that split second, your brain has time to run the signal it sent your finger through a simulation telling it what the correct response will feel like. When it senses an error, it sends a signal to the fingers, slowing them down so they have more time to adjust.

As any typist knows, hitting keys happens too fast to divert a finger when it’s in the process of making a mistake. But, Stafford says this evolved from the same mental mechanism that helped our ancestors’ brains make micro adjustments when they were throwing spears.

Unfortunately, that kind of instinctual feedback doesn’t exist in the editing process. When you’re proofreading, you are trying to trick your brain into pretending that it’s reading the thing for the first time. Stafford suggests that if you want to catch your own errors, you should try to make your work as unfamiliar as possible. Change the font or background color, or print it out and edit by hand. “Once you’ve learned something in a particular way, it’s hard to see the details without changing the visual form,” he said.

What’s Up With That: Your Best Thinking Seems to Happen in the Shower | Science | WIRED

Long drives, short walks, even something like pulling weeds, all seem to have the right mix of monotony and engagement to trigger a revelation. They also happen to be activities where it’s difficult to take notes. It turns out that aimless engagement in an activity is a great catalyst for free association, but introducing a pen and paper can sterilize the effort.

[…]

The common thread in these activities is they are physically or mentally active, but only mildly so. They also need to be familiar or comfortable enough that you stay engaged but not bored, and last long enough to have an uninterrupted stream of thought.

Kounios explains that our brains typically catalog things by their context: Windows are parts of buildings, and the stars belong in the night sky. Ideas will always mingle to some degree, but when we’re focused on a specific task our thinking tends to be linear.

Kounios likes to use the example of a stack of bricks in your backyard. You walk by them every day with hardly a second thought, and if asked you’d describe them as a building material (maybe for that pizza oven you keep meaning to put together). But one day in the shower, you start thinking about your neighbor’s walnut tree. Those nuts sure look tasty, and they’ve been falling in your yard. You suddenly realize that you can smash those nuts open using the bricks in your backyard!

[…]

As ideas become untethered, they are free to bump up against other ideas they’ve never had the chance to encounter, increasing the likelihood of a useful connection.

[…]

Like Archimedes, when you are working on a problem your brain tends to fixate on one or a few different strategies. Kounios says these are like ruts that your mental wheels get stuck in. “If you take a break however, those thought patterns no longer dominate your thinking,” he said. The problem gets removed from the mental ruts and mingles with other ideas you’re carrying in your head. Eventually, it finds one—or several—that click together and rise up like Voltron into a solution. This is called fixation forgetting.

[…]

It’s not clear how your brain decides which are the right connections, but it’s obvious that the farther your brain can roam, the better. Research has shown that your brain builds bigger creative webs when you’re in a positive mood. This makes sense, because when you’re anxious you’re less likely to take a chance on creativity. Even when resting or taking a break, anxious brains tend to obsess on linear solutions. This may be part of the reason that when you bring a way to record your thoughts into the equation—such as a notebook, voice recorder or word processor—the thoughts worth recording become scarce.

[…]

“Not having an explicit task is the main ingredient for random insights,” Kounios said. “Once you have a pen and paper there, it’s not really your mind wandering.”

Me, Myself, and I by Stephen Greenblatt | The New York Review of Books

Shunga woman reading

Laqueur’s most recent book, Solitary Sex: A Cultural History of Masturbation, shares with Making Sex the same startling initial premise: that something we take for granted, something that goes without saying, something that simply seems part of being human has in fact a history, and a fascinating, conflicted, momentous history at that.

[…]

Masturbation is virtually unique, in the array of more or less universal human behaviors, in arousing a peculiar and peculiarly intense current of anxiety.

This anxiety, Laqueur observes, is not found in all cultures and is not part of our own culture’s distant origins. In ancient Greece and Rome, masturbation could be the object of transitory embarrassment or mockery, but it had little or no medical or, as far as we can tell, cultural significance. More surprisingly, Laqueur argues, it is almost impossible to find in ancient Jewish thought. This claim at first seems dubious because in Genesis 38 we read that Onan “spilled his seed upon the ground,” an act that so displeased the Lord that He struck him dead. Onanism indeed became a synonym for masturbation, but not for the rabbis who produced the Talmuds and midrashim. For them the sin of Onan was not masturbation but a willful refusal to procreate. Their conceptual categories—procreation, idolatry, pollution—evidently did not include a significant place for the sinful indulgence in gratuitous, self-generated sexual pleasure. Some commentators on a pronouncement by Rabbi Eliezer—“Anyone who holds his penis when he urinates is as though he brought the flood into the world”—seem close to condemning such pleasure, but on closer inspection these commentators too are concerned with the wasting of semen.

Medieval Christian theologians, by contrast, did have a clear concept of masturbation as a sin, but it was not, Laqueur claims, a sin in which they had particularly intense interest. With the exception of the fifth-century abbot John Cassian, they were far more concerned with what Laqueur calls the ethics of social sexuality than they were with the ethics of solitary sex. What mattered most were “perversions of sexuality as perversions of social life, not as a withdrawal into asocial autarky.” Within the monastery anxiety focused far more on sodomy than on masturbation, while in the world at large it focused more on incest, bestiality, fornication, and adultery.

[…]

Church fathers could not share in particularly intense form the Jewish anxiety about Onan, precisely because the Church most honored those whose piety led them to escape from the whole cycle of sexual intercourse and generation. Theologians did not permit masturbation, but they did not focus sharply upon it, for sexuality itself, and not only nonreproductive sexuality, was to be overcome. A very severe moralist, Raymond of Peñafort, did warn married men against touching themselves, but only because arousal might make them want to copulate more often with their wives.

[…]

Reformation theologians did not fundamentally alter the traditional conception of masturbation or significantly intensify the level of interest in it. To be sure, Protestants vehemently castigated Catholics for creating institutions—monasteries and convents—that in their view denigrated marriage and inevitably fostered masturbation. Marriage, the Reformers preached, was not a disappointing second choice made by those who could not embrace the higher goal of chastity; it was the fulfillment of human and divine love. Sexual pleasure in marriage, provided that it was not excessive or pursued for its own sake, was not inherently sinful, or rather any taint of sinfulness was expunged by the divinely sanctioned goal of procreation. In the wake of Luther and Calvin masturbation remained what it had been for the rabbis: an act whose sinfulness lay in the refusal of procreation, the prodigal wasting of seed.

In one of his early sonnets, Shakespeare wittily turns such “unthrifty” wasting into economic malpractice:

Unthrifty loveliness, why dost thou spend
Upon thyself thy beauty’s legacy?

In bequeathing the young man such loveliness, nature expected him to pass it along to the next generation; instead the “beauteous niggard” is holding on to it for himself and refusing to create the child who should rightly bear his image into the future. Masturbation, in the sonnet, is the perverse misuse of an inheritance. The young man merely spends upon himself, and thereby throws away, wealth that should rightly generate more wealth:

For having traffic with thyself alone,
Thou of thyself thy sweet self dost deceive.
Then how when nature calls thee to be gone:
What acceptable audit canst thou leave?

  Thy unused beauty must be tombed with thee,

  Which usèd, lives th’executor to be.

The young man, as the sonnet characterizes him, is a “profitless usurer,” and when his final reckoning is made, he will be found in arrears. The economic metaphors here have the odd effect of praising usury, still at the time regarded both as a sin and as a crime. There may be an autobiographical element here—the author of The Merchant of Venice was himself on occasion a usurer, as was his father—but Shakespeare was also anticipating a recurrent theme in the history of “modern masturbation” that concerns Laqueur: from the eighteenth century onward, masturbation is assailed as an abuse of biological and social economy. Still, a poem like Shakespeare’s only shows that masturbation in the full modern sense did not yet exist: by “having traffic” with himself alone, the young man is wasting his seed, but the act itself is not destroying his health or infecting the whole social order.

The Renaissance provides a few glimpses of masturbation that focus on pleasure rather than the avoidance of procreation. In the 1590s Shakespeare’s contemporary Thomas Nashe wrote a poem about a young man who went to visit his girlfriend who was lodging—just for the sake of convenience, she assured him—in a whorehouse. The man was so aroused by the very sight of her that he had the misfortune of prematurely ejaculating, but the obliging lady managed to awaken him again. Not, however, long enough for her own satisfaction: to his chagrin, the lady only managed to achieve her “solace” by means of a dildo which, she declared, was far more reliable than any man. This piece of social comedy is closer to what Laqueur would consider authentic “modern” masturbation, for Nashe’s focus is the pursuit of pleasure rather than the wasting of seed, but it is still not quite there.

Laqueur’s point is not that men and women did not masturbate throughout antiquity, the Middle Ages, and the Renaissance—the brief confessional manual attributed to Gerson assumes that the practice is ubiquitous, and the historian finds no reason to doubt it—but rather that it was not regarded as a deeply significant event. It is simply too infrequently mentioned to have counted for a great deal, and the few mentions that surface tend to confirm its relative unimportance. Thus in his diary, alongside the many occasions on which he had a partner in pleasure, Samuel Pepys jotted down moments in which he enjoyed solitary sex, but these latter did not provoke in him any particular shame or self-reproach. On the contrary, he felt a sense of personal triumph when he managed, while being ferried in a boat up the Thames, to bring himself to an orgasm—to have “had it complete,” as he put it—by the strength of his imagination alone. Without using his hands, he noted proudly, he had managed just by thinking about a girl he had seen that day to pass a “trial of my strength of fancy…. So to my office and wrote letters.” Only on such solemn occasions as High Mass on Christmas Eve in 1666, when the sight of the queen and her ladies led him to masturbate in church, did Pepys’s conscience speak out, and only in a very still, small voice.

The seismic shift came about some half-century later, and then not because masturbation was finally understood as a horrible sin or an economic crime but rather because it was classified for the first time as a serious disease. “Modern masturbation,” Solitary Sex begins, “can be dated with a precision rare in cultural history.” It came into being “in or around 1712” with the publication in London of a short tract with a very long title: Onania; or, The Heinous Sin of Self Pollution, and all its Frightful Consequences, in both SEXES Considered, with Spiritual and Physical Advice to those who have already injured themselves by this abominable practice. And seasonable Admonition to the Youth of the nation of Both SEXES…. The anonymous author—Laqueur identifies him as John Marten, a quack surgeon who had published other works of soft-core medical pornography—announced that he had providentially met a pious physician who had found remedies for this hitherto incurable disease. The remedies are expensive, but given the seriousness of the condition, they are worth every penny. Readers are advised to ask for them by name: the “Strengthening Tincture” and the “Prolific Powder.”

[…]

But marketing alone cannot explain why “onanism” and related terms began to show up in the great eighteenth-century encyclopedias or why one of the most influential physicians in France, the celebrated Samuel Auguste David Tissot, took up the idea of masturbation as a dangerous illness or why Tissot’s 1760 work, L’Onanisme, became an instant European literary sensation.

[…]

Tissot “definitively launched masturbation,” as Laqueur puts it, “into the mainstream of Western culture.” It was not long before almost the entire medical profession attributed an inexhaustible list of woes to solitary sex, a list that included spinal tuberculosis, epilepsy, pimples, madness, general wasting, and an early death.

[…]

Modern masturbation—and this is Laqueur’s brilliant point—was the creature of the Enlightenment. It was the age of reason, triumph over superstition, and the tolerant, even enthusiastic acceptance of human sexuality that conjured up the monster of self-abuse. Prior to Tissot and his learned medical colleagues, it was possible for most ordinary people to masturbate, as Pepys had done, without more than a twinge of guilt. After Tissot, anyone who indulged in this secret pleasure did so in the full, abject knowledge of the horrible consequences. Masturbation was an assault on health, on reason, on marriage, and even on pleasure itself. For Enlightenment doctors and their allies did not concede that masturbation was a species of pleasure, however minor or embarrassing; it was at best a false pleasure, a perversion of the real. As such it was dangerous and had at all costs to be prevented.

[…]

There were, Laqueur suggests, three reasons why the Enlightenment concluded that masturbation was perverse and unnatural. First, while all other forms of sexuality were reassuringly social, masturbation—even when it was done in a group or taught by wicked servants to children—seemed in its climactic moments deeply, irremediably private. Second, the masturbatory sexual encounter was not with a real, flesh-and-blood person but with a phantasm. And third, unlike other appetites, the addictive urge to masturbate could not be sated or moderated. “Every man, woman, and child suddenly seemed to have access to the boundless excesses of gratification that had once been the privilege of Roman emperors.”

Privacy, fantasy, insatiability: each of these constitutive features of the act that the Enlightenment taught itself to fear and loathe is, Laqueur argues, a constitutive feature of the Enlightenment itself. Tissot and his colleagues had identified the shadow side of their own world: its interest in the private life of the individual, its cherishing of the imagination, its embrace of a seemingly limitless economy of production and consumption. Hammering away at the social, political, and religious structures that had traditionally defined human existence, the eighteenth century proudly brought forth a shining model of moral autonomy and market economy—only to discover that this model was subject to a destructive aberration. The aberration—the physical act of masturbating—was not in itself so obviously dreadful. When Diderot and his circle of sophisticated encyclopédistes offered their considered view of the subject, they acknowledged that moderate masturbation as a relief for urgent sexual desires that lacked a more satisfying outlet seemed natural enough. But the problem was that “moderate masturbation” was a contradiction in terms: the voluptuous, fiery imagination could never be so easily restrained.

Masturbation then became a sexual bugbear, Laqueur argues, because it epitomized all of the fears that lay just on the other side of the new sense of social, psychological, and moral independence. A dramatic increase in individual autonomy was bound up, as he convincingly documents, with an intensified anxiety about unsocialized, unreproductive pleasure, pleasure fueled by seductive chimeras ceaselessly generated by the vagrant mind:

The Enlightenment project of liberation—the coming into adulthood of humanity—made the most secret, private, seemingly harmless, and most difficult to detect of sexual acts the centerpiece of a program for policing the imagination, desire, and the self that modernity itself had unleashed.

The dangers of solitary sex were linked to one of the most telling modern innovations. “It was not an accident,” Laqueur writes, in the careful phrase of a historian eager at once to establish a link and to sidestep the issue of causality, that Onania was published in the age of the first stock market crashes, the foundation of the Bank of England, and the eruption of tulip-mania. Masturbation is the vice of civil society, the culture of the marketplace, the world in which traditional barriers against luxury give way to philosophical justifications of excess. Adam Smith, David Hume, and Bernard Mandeville all found ways to celebrate the marvelous self-regulating quality of the market, by which individual acts of self-indulgence and greed were transformed into the general good. Masturbation might at first glance seem to be the logical emblem of the market: after all, the potentially limitless impulse to gratify desire is the motor that fuels the whole enormous enterprise. But in fact it was the only form of pleasure-seeking that escaped the self-regulating mechanism: it was, Mandeville saw with a shudder, unstoppable, unconstrained, unproductive, and absolutely free of charge. Far better, Mandeville wrote in his Defense of Public Stews (1724), that boys visit brothels than that they commit “rapes upon their own bodies.”

The revealing contrast here is with an earlier cultural innovation, the public theaters, which were vigorously attacked in Shakespeare’s time for their alleged erotic power. The theaters, moralists claimed, were “temples to Venus.” Aroused audiences would allegedly rush off at the play’s end to make love in nearby inns or in secret rooms hidden within the playhouses themselves.

[…]

In the late seventeenth century John Dunton—the author of The Night-walker, or Evening Rambles in Search After Lewd Women (1696)—picked up a whore in the theater, went to her room, and then tried to give her a sermon on chastity. She vehemently objected, saying that the men with whom she usually went home were far more agreeable: they would pretend, she said, that they were Antony and she would pretend that she was Cleopatra. The desires that theaters awakened were evidently understood to be fundamentally social: irate Puritans never charged that audiences were lured into an addiction to solitary sex. But that is precisely the accusation leveled at the experience of reading imaginative fiction.

It was not only the solitude in which novels could be read that contributed to the difference between the two attacks; the absence of the bodies of the actors and hence the entire reliance on imagination seemed to make novels more suitable for solitary than social sex. Eighteenth-century doctors, tapping into ancient fears of the imagination, were convinced that when sexual excitement was caused by something unreal, something not actually present in the flesh, that excitement was at once unnatural and dangerous. The danger was greatly intensified by its addictive potential: the masturbator, like the novel reader—or rather, precisely as novel reader—could willfully mobilize the imagination, engaging in an endless creation and renewing of fictive desire. And shockingly, with the spread of literacy, this was a democratic, equal opportunity vice. The destructive pleasure was just as available to servants as to masters and, still worse, just as available to women as to men. Women, with their hyperactive imaginations and ready sympathies, their proneness to tears, blushes, and fainting fits, their irrationality and emotional vagrancy, were thought particularly subject to the dangerous excitements of the novel.

[…]

at the beginning of the twentieth century, the whole preoccupation—the anxiety, the culture of surveillance, the threat of death and insanity—began to wane. The shift was by no means sudden or decisive, and traces of the older attitudes obviously persist not only in schoolboy legends and many zany, often painful family dramas but also in the nervous laughter that attends the whole topic. Still, the full nightmare world of medicalized fear and punishment came to an end. Laqueur tells this second part of the story far more briskly: he attributes the change largely to the work of Freud and liberal sexology, though he also acknowledges how complex and ambivalent many of the key figures actually were. Freud came to abandon his conventional early views about the ill effects of masturbation and posited instead the radical idea of the universality of infant masturbation. What had been an aberration became a constitutive part of the human condition. Nevertheless the founder of psychoanalysis constructed his whole theory of civilization around the suppression of what he called the “perverse elements of sexual excitement,” beginning with autoeroticism. In this highly influential account, masturbation, as Laqueur puts it, “became a part of ontogenesis: we pass through masturbation, we build on it, as we become sexual adults.”

[…]

Solitary Sex ends with a brief account of modern challenges to this theory of repression, from the championing of women’s masturbation in the 1971 feminist best seller Our Bodies, Ourselves to the formation of groups with names like the SF Jacks—“a fellowship of men who like to jack-off in the company of like-minded men,” as its Web site announces—and the Melbourne Wankers. A series of grotesque photographs illustrates the transgressive fascination that masturbation has for such contemporary artists as Lynda Benglis, Annie Sprinkle, and Vito Acconci. The latter made a name for himself by masturbating for three weeks while reclining in a box under a white ramp on the floor of the Sonnabend Gallery in New York City: “so, art making,” Laqueur observes, “is literally masturbating.”

[…]

Conjuring up his childhood in Combray, Proust’s narrator recalls that at the top of his house, “in the little room that smelt of orris-root,” he looked out through the half-opened window and

with the heroic misgivings of a traveller setting out on a voyage of exploration or of a desperate wretch hesitating on the verge of self-destruction, faint with emotion, I explored, across the bounds of my own experience, an untrodden path which for all I knew was deadly—until the moment when a natural trail like that left by a snail smeared the leaves of the flowering currant that drooped around me.

For this brief moment in Swann’s Way (1913), it is as if we had reentered the cultural world that Laqueur chronicles so richly, the world in which solitary sex was a rash voyage away beyond the frontiers of the natural order, a headlong plunge into a realm of danger and self-destruction. Then, with the glimpse of the snail’s trail, the landscape resumes its ordinary, everyday form, and the seemingly untrodden path is disclosed—as so often in Proust—to be exceedingly familiar.

[…]

Proust does not encourage us to exaggerate the significance of masturbation—it is only one small, adolescent step in the slow fashioning of the writer’s vocation. Still, Laqueur’s courageous cultural history (and it took courage, even now, to write this book) makes it abundantly clear why for Proust—and for ourselves—the celebration of the imagination has to include a place for solitary sex.

The Hi-Tech Mess of Higher Education by David Bromwich | The New York Review of Books

The Hi-Tech Mess of Higher Education by David Bromwich | The New York Review of Books.

Students at Deep Springs College in the California desert, near the Nevada border, where education involves ranching, farming, and self-governance in addition to academics – Jodi Cobb/National Geographic/Getty Images

The financial crush has come just when colleges are starting to think of Internet learning as a substitute for the classroom. And the coincidence has engendered a new variant of the reflection theory. We are living (the digital entrepreneurs and their handlers like to say) in a technological society, or a society in which new technology is rapidly altering people’s ways of thinking, believing, behaving, and learning. It follows that education itself ought to reflect the change. Mastery of computer technology is the major competence schools should be asked to impart. But what if you can get the skills more cheaply without the help of a school?

A troubled awareness of this possibility has prompted universities, in their brochures, bulletins, and advertisements, to heighten the one clear advantage that they maintain over the Internet. Universities are physical places; and physical existence is still felt to be preferable in some ways to virtual existence. Schools have been driven to present as assets, in a way they never did before, nonacademic programs and facilities that provide students with the “quality of life” that makes a college worth the outlay. Auburn University in Alabama recently spent $72 million on a Recreation and Wellness Center. Stanford built Escondido Village Highrise Apartments. Must a college that wants to compete now have a student union with a food court and plasma screens in every room?

[…]

The model seems to be the elite club—in this instance, a club whose leading function is to house in comfort thousands of young people while they complete some serious educational tasks and form connections that may help them in later life.

[…]

A hidden danger both of intramural systems and of public forums like “Rate My Professors” is that they discourage eccentricity. Samuel Johnson defined a classic of literature as a work that has pleased many and pleased long. Evaluations may foster courses that please many and please fast.

At the utopian edge of the technocratic faith, a rising digital remedy for higher education goes by the acronym MOOCs (massive open online courses). The MOOC movement is represented in Ivory Tower by the Silicon Valley outfit Udacity. “Does it really make sense,” asks a Udacity adept, “to have five hundred professors in five hundred different universities each teach students in a similar way?” What you really want, he thinks, is the academic equivalent of a “rock star” to project knowledge onto the screens and into the brains of students without the impediment of fellow students or a teacher’s intrusive presence in the room. “Maybe,” he adds, “that rock star could do a little bit better job” than the nameless small-time academics whose fame and luster the video lecturer will rightly displace.

That the academic star will do a better job of teaching than the local pedagogue who exactly resembles 499 others of his kind—this, in itself, is an interesting assumption at Udacity and a revealing one. Why suppose that five hundred teachers of, say, the English novel from Defoe to Joyce will all tend to teach the materials in the same way, while the MOOC lecturer will stand out because he teaches the most advanced version of the same way? Here, as in other aspects of the movement, under all the talk of variety there lurks a passion for uniformity.

[…]

The pillars of education at Deep Springs are self-governance, academics, and physical labor. The students number scarcely more than the scholar-hackers on Thiel Fellowships—a total of twenty-six—but they are responsible for all the duties of ranching and farming on the campus in Big Pine, California, along with helping to set the curriculum and keep their quarters. Two minutes of a Deep Springs seminar on citizen and state in the philosophy of Hegel give a more vivid impression of what college education can be than all the comments by college administrators in the rest of Ivory Tower.

[…]

Teaching at a university, he says, involves a commitment to the preservation of “cultural memory”; it is therefore in some sense “an effort to cheat death.”

Jenny Diski reviews ‘Cubed’ by Nikil Saval · LRB 31 July 2014

Jenny Diski reviews ‘Cubed’ by Nikil Saval · LRB 31 July 2014.

The story of the office begins in counting houses, where scribes kept their heads down accounting for the transformation of goods into wealth and vice versa. You might go as far back as ancient Egypt or stay sensible and look to mercantile Europe for the beginnings of bureaucracy, and the need to keep written accounts of business in one place. Saval gives a nod to the medieval guilds but settles on the 19th century as the start of the office proper, still in Europe, although this is an overwhelmingly American account of the American office. The closer you get to modernity in Cubed, the more the emphasis is on buildings and the more diminished the figure of the worker inside the buildings (until you get to the end and the buildings begin to disappear, although so too do the workers). It’s not a mystery. The design and construction of entire purpose-built structures for office work is a modern phenomenon. Scribes, to stretch the notion of office work, wrote in scriptoria, rooms in monasteries which were built for the more general purpose of worshipping God and housing those devoted to the various tasks (among which the reproduction of scripture) involved in doing so. Clerks are more likely to be what we think of when we want to look at the early days of office work. They emerged from their religious duties to assist commerce in keeping track of business, where we recognise them as dark-suited, substantially present characters in Trollope, Thackeray and Dickens. The ready-made spaces these clerks worked in became ‘offices’, rather than special buildings defining the work they pursued. They kept their books and scratched out their invoices in regular private houses given over to business, and sat or stood at desks in rooms they shared with their bosses for both convenience and oversight – this too disappears and then returns in postmodernity when hierarchy is spatially, if not actually, flattened.

Proximity has always been an important issue for office workers, so much so that it eventually precluded any form of unionisation. Rather than organise to improve their pay and conditions, office workers chose to keep close to their superiors in the hope, not always forlorn, that they would rise in prominence thanks to patronage. Physical closeness applied in the Dickensian office, but there are other ways to achieve it. In The Apartment (perfectly depicting the apex of the American way of office life in 1960 as North by Northwest perfectly depicts the fantasised alternative), Jack Lemmon gets close to his boss, which gets him ever closer to a key to the executive washroom, by lending his apartment to executives for their extra-marital assignations.

[…]

The pre-20th-century office worker saw himself as a cut above the unsalaried labouring masses, and was as ambivalent about his superiors, who were his only means of rising, as the rest of the working world was about him. Dandyish clerks prided themselves on not being workers, on the cleanness of their job (thus the whiteness of the collars), and on being a step above hoi polloi. They became a massed workforce in the United States, where the attitude towards the scribe and record-keeper changed, so that they came to be seen both as effete and untrustworthy, like Dickens’s Heep, and as ominous and unknowable, like Bartleby, but without receiving the amazed respect of Melville’s narrator. By 1855 in New York they were the third largest occupational group. Their self-esteem as their numbers grew was not shared: ‘Nothing about clerical labour was congenial to the way most Americans thought of work … At best, it seemed to reproduce things … the bodies of real workers were sinewy, tanned by the relentless sun, or blackened by smokestack soot; the bodies of clerks were slim, almost feminine in their untested delicacy.’ In Vanity Fair, the clerks are ‘“vain, mean, selfish, greedy, sensual and sly, talkative and cowardly”, and spent all their minimal strength attempting to dress better than “real men who did real work”.’


By the mid-20th century sex had created a new division within clerical labour. The secretary was almost invariably a woman and so was the typist, who worked in massed serried ranks, although (again to be seen in The Apartment) there was also a pool of anonymous desks for mute men with accounting machines, like Lemmon as C.C. Baxter. The secretaries lived inside a bubble of closeness to power, looking to burst through it into management or marriage, most likely the latter, geishas at work whose most realistic hope was to become domestic geishas, while the typists (originally called typewriters) and number-crunchers clattering on their machines on their own floor merely received dictated or longhand work to type or add up, distributed by runners, and so were not likely to catch the eye of an executive to give them a hand up unless they were prepared to wait outside their own apartment in the rain.

The pools of workers as well as the interior design of offices were under the spell of Taylorism, the 1950s fetish for a time and motion efficiency that tried to replicate the rhythm enforced in the factories to which office workers felt so superior. The idea that things that need doing and the people doing them could be so organised that they operated together as smoothly as cogs in a machine is everlastingly seductive. Anyone who spends half a day reorganising their home office, rejigging their filing system, arranging their work space ‘ergonomically’ knows this. It isn’t just a drive for cost efficiency, but some human tic that has us convinced that the way we organise ourselves in relation to our work holds a magic key to an almost effortless success. Entire online magazines like Lifehacker and Zen Habits are devoted to time-and-money-saving tweaks for work and home (‘An Easy Way to Find the Perfect Height for Your Chair or Standing Desk’; ‘Five Ways to Spend a Saved Hour at Work’; ‘Ten Tips to Work Smarter, Not Harder’; ‘What to Think about While You Exercise’). At a corporate level, this meant erecting buildings and designing their interiors and work systems to achieve office nirvana. No time, no motion wasted. The utopian dream of architects, designers and managers comes together in the form-follows-function mantra, beginning with Adler and Sullivan’s Wainwright Building in St Louis in 1891, although, as Saval points out, from the start it was really all about form follows finance:

The point was not to make an office building per specification of a given company … but rather to build for an economy in which an organisation could move in and out of a space without any difficulty. The space had to be eminently rentable … The skylines of American cities, more than human ingenuity and entrepreneurial prowess, came simply to represent dollars per square foot.

The skyscraper, the apotheosis of form following finance and function, appears once the manufacture of elevators allowed buildings of more than the five floors that people are prepared to walk up. It was a perfect structure philosophically and speculatively to house the now millions of workers whose job it was to keep track of manufacturing, buying and selling – ‘the synthesis of naked commerce and organic architecture’ as foreseen by Louis Sullivan, mentor to Frank Lloyd Wright. The basic unit of the skyscraper is the ‘cell’: ‘We take our cue from the individual cell, which requires a window with its separating pier, its sill and lintel, and we, without more ado, make them look all alike because they are all alike.’ The International Style reached its glory period with the vertical cities designed by Sullivan, Mies van der Rohe, Philip Johnson, Henry-Russell Hitchcock. The Philadelphia Savings Fund Society Building, the Rockefeller Center, the UN Secretariat Building, Lever House and the Seagram Building were visually stunning statements of corporate power and prevailed by making the perceived virtues of repetition and monotony in design synonymous with economy and order. Even the need for a window in each cell was obviated with the invention of an efficient air-conditioning system and electric lighting, allowing more rational ways to provide light and air. However beautiful or banal the exterior, curtained in glass or blank with concrete, the buildings served as hives for the masses who performed their varied tasks to produce the evidence of profit. They were Taylorist cathedrals, and new techniques of ergonomics and personality-testing for employees compounded the organisational religious zeal, so that individuals more than ever before became bodies operating within physical space, whose ‘personalities’ were tested for the lack of them in the search for compliance and conformity. Business jargon added mind-conditioning on a par with air-conditioning, keeping everyone functioning optimally within the purposes of the mini-city.

The popular sociology books that began to appear in the 1960s criticising this uniformity were read avidly by the office workers who started to see themselves as victims. The Lonely Crowd, The Organisation Man, The Man in the Grey Flannel Suit, the movie The Apartment itself, described a dystopian conformity that mid-century business America had produced in entire lives, not just in the working day. An alternative was proposed by office designers such as Robert Propst at Herman Miller, who were still working on behalf of the corporations, but who saw Taylorism as deadening the creative forces that were beginning to be seen as useful to business, perhaps as a result of the rise of advertising. Open plan became the solution. The cell opened out to the entire floor space of the building and it became a matter of how to subdivide that space to suit the varied tasks each individual needed to do, while retaining openness; to create office interiors in which workers needed to move around to achieve their goals, ideally bumping into one another on the way to permit the fortuitous cross-pollination of ideas. Cubes arrived, boxes without lids for people, but humane, alterable and adaptable to their needs (or the needs of the business for which they worked). Lots of little adjustable cells inside the main cell. Walls became flexible and low enough to be chatted over. Herman Miller’s Action Office and the concept of Bürolandschaft, the landscaped office, replaced the fundamental lonely cell and created its own kind of hell: ‘unpleasant temperature variations, draughts, low humidity, unacceptable noise levels, poor natural lighting, lack of visual contact with the outside and lack of natural ventilation’. And in addition there was a felt loss of privacy that had people bringing in all manner of knick-knacks to their cubes as self-identifiers and status symbols.

Another kind of office work came along with the arrival of the dotcom revolution. Not paper work but screen work. Like advertising but growing crazily, not humdrum invoice-stamping and letter-writing, but innovative programming that required intense brainwork from young, ill-disciplined talent who needed to be kept at their screens as much as possible while being nurtured and refuelled on the job. Being young and not having any connection with the office work of the past, the new workforce was offered on-site playgrounds that kept obsessive minds refreshed but still focused. Hierarchies were loosened, or more accurately given the appearance of being loosened. Jeans and T-shirts replaced suits, all youthful needs (except sleep-inducing sex) were catered for: pizzas and carbonated drinks, basketball and brightly coloured nursery furniture for the young geniuses to lounge or nap on when they were exhausted with programming. The open-plan office moved towards ‘main streets’ with side offices for particular purposes, often themed like Disneyland with lots of communal meeting and playing places, scooters to get around, and built-in time for workers to develop their own pet projects. The Herman Miller Aeron chair, still so desirable, was a design response to the need to sit for long periods working at a screen. It’s advertised as being ergonomically created for people to sit comfortably on stretchy mesh for up to 12 hours at a time.

In advertising, Jay Chiat decided that office politics were a bar to inspirational thinking. He hired Frank Gehry to design his ‘deterritorialised’ agency offices in Venice, California in 1986. ‘Everyone would be given a cellular phone and a laptop computer when they came in. And they would work wherever they wanted.’ Personal items, pictures or plants had to be put in lockers. There were no other private spaces. There were ‘Tilt-A-Whirl domed cars … taken from a defunct amusement park ride, for two people to have private conferences. They became the only place where people could take private phone calls.’ One employee pulled a toy wagon around to keep her stuff together. It rapidly turned into a disaster. People got to work and had no idea where they were to go. There were too many people and not enough chairs. People just stopped going to work. In more formal work situations too, the idea of the individual workstation, an office or a personal desk, began to disappear and designers created fluid spaces where people wandered to settle here and there in specialised spaces. For some reason homelessness was deemed to be the answer to a smooth operation.

The great days of office buildings dictating where and how individuals work within them may have gone. There are new architects and designers who collaborate with the workers themselves to produce interiors that suit their needs and desires. ‘Co-design’ – allowing the users of a space to have an equal say in how it is organised – is a first sign that buildings, sponsored by and monuments to corporate power, might have lost their primacy over the individuals engaged to work in them. But if the time of grand structures is over, it’s probably an indication that corporate power has seen a better way to sustain itself. The shift away from monolithic vertical cities of work and order might be seen as the stage immediately preceding the disappearance of the office altogether and the start of the home-working revolution we’ve been told has been on its way ever since futurology programmes in the 1950s assured us we’d never get out of our pyjamas within the year.

Fantasies of home-working, as people began to see round the corner into a computerised future, were forever being promised but never really came to anything. The idea made management nervous. How to keep tabs on people? How were managers to manage? And it alarmed office workers. It wasn’t perhaps such a luxury after all not having to face the nightmare of commuting or those noisy open-plan dystopias, when confronted instead by the discipline needed to get down to and keep at work at home, operating around the domestic needs of the family, and having no one to chat to around the water cooler that wasn’t there. Even now, when the beneficial economics of freelancing and outsourcing has finally got a grip on corporate accountants, there is something baffling and forlorn about the sight, as you walk past café after café window, of rows of people tapping on their MacBook Airs. There for company in the communal space, but wearing isolating headphones to keep out the chatter, rather than sitting in their own time in quiet, ideally organised, or lonely, noisy, cramped home offices. Cafés with free wifi charge by the coffee to replicate a working atmosphere in what was once a place for daydreaming and chat. The freedom of home-working is also the freedom from employment benefits such as paid holidays, sick pay, pensions; and the freedom of permatemp contracts or none at all and the radical uncertainty about maintaining a steady income. These workers are a serious new class, known as the precariat: insecure, unorganised, taking on too much work for fear of famine, or frighteningly underemployed. The old rules of employment have been turned upside down. These new non-employees, apparently, need to develop a new ‘self-employed mindset’, in which they treat their employers as ‘customers’ of their services, and do their best to satisfy them, in order to retain their ‘business’. The ‘co-working’ rental is the most recent arrival. Space in a building with office equipment and technical facilities is hired out to freelancers, who work together but separately in flexible spaces on their own projects, in a bid ‘to get out of their apartments and be sociable in an office setting’. Office space has returned to what it really was, dollars per square foot, which those who were once employees now pay to use, without the need for rentiers to provide more than a minimum of infrastructure. The US Bureau of Labor Statistics projects that ‘by 2020 freelancers, temps, day labourers and independent contractors will constitute 40 per cent of the workforce.’ Some think up to 50 per cent. Any freelancer will tell you about the time and effort required to drum up business and keep it coming (networking, if you like), which cuts down on how much work you can actually do if you get it. When they do get the work, they no longer get the annual salaries that old-time clerks were so proud to receive. Getting paid is itself time-consuming and difficult. It’s estimated that more than 77 per cent of freelancers have had trouble collecting payment, because contractors try to retain fees for as long as possible. Flexibility sounds seductive, as if it allows individuals to live their lives sanely, fitting work and leisure together in whatever way suits them and their families best. But returning the focus to the individual worker rather than the great corporate edifice simply adds the burdens of management to the working person’s day while creating permanent anxiety and ensuring employee compliance. As to what freelancers actually do in their home offices, in steamy cafés, in co-working spaces, I still have no idea, but I suspect that the sumptuous stationery cupboard is getting to be as rare as a monthly salary cheque.