Tag Archives: error

The Technical Constraints That Made Abbey Road So Good – The Atlantic


The sanctum sanctorum of Abbey Road is Studio Two, the room where the majority of The Beatles’ recordings were made.

Standing at the threshold of Studio Two, it doesn’t look all that different from a small school gymnasium: a big rectangular box with white walls, 24-foot-high ceilings, and a parquet floor. But as soon as we entered, any thoughts of dribbling basketballs fell away, as I began to remember images of John Lennon and Paul McCartney standing around a microphone at the far end of the room, working out their harmonies.


When each of the tools in that display was first introduced, many music experts were totally wrong about the impact they would have on creative culture. “Records will kill live music,” they said as the phonograph gained popularity. Tape recording was initially viewed with suspicion by recordists accustomed to using disc-cutting lathes.

As digital technology arrived, many people thought it would surely relegate analog recording equipment to the scrap heap. In what seems like a stunning example of shortsightedness, some of Abbey Road’s most noteworthy gear was sold off in a 1980 sale as “memorabilia” at bargain-basement prices. One example: a 4-track recorder used on “Sgt. Pepper’s” went for just $800 (about $2,300 in today’s money).

For melodic pop music, Studio Two has physical, tonal qualities which transcend its humble appearance. “It emphasizes the midrange,” Kehew says, “and has a warm, short reverb unusual for a room its size.” These reverberant qualities are so well known that Abbey Road’s rental contract actually prohibits any sampling of its distinctive acoustic signature. As I stood in the room, I could hear the echoes of the vocals and kick drums on some of my favorite recordings of all time.


Kehew agrees that every tool can have a place as part of an artistic palette. “Old is not good or bad,” he said. “Question it. Try it. Listen. Buy weird bad gear and great quality gear—see what it does for you. I love Jon Brion’s quote—‘I don’t want to be Lo-Fi or Hi-Fi, I want to be ALL-Fi!’”

Scott touched on this in the lecture too, recounting that this was the approach that caused Beatles producer George Martin to turn down Abbey Road’s first 8-track recorder for use on the White Album. The 4-track recorders used for years by The Beatles had been specially modified to help create some of their signature sounds. Because the new 8-track recorder lacked those modifications, Martin declined to bring it into the session. His thinking, Scott said, was that it would be better for the process to maintain continuity.

In an ironic twist, Scott mentioned that The Beatles themselves had a different idea. They decided to use the 8-track without Martin’s permission, which got Scott and another engineer into a fair amount of trouble. The fact that the device was used to track parts of “While My Guitar Gently Weeps” probably helped accelerate the forgiveness. Even though new technologies can kill off old ways of working, it’s ultimately up to humans to decide when they should.

“It was the 60s,” Scott said of the incident. “Rules were meant to be broken.”

At the beginning of the Beatles era, technicians had to complete what amounted to an extended apprenticeship program—and were even required to wear white lab coats (Winston Churchill once quipped that Abbey Road made him feel like he was visiting a hospital). Prospective engineers were brought up through the ranks slowly and instructed on the “rules of the process” at each stage.

But as the 60s went on, culture—specifically counter-culture—began seeping into the studio and changing that dynamic relationship between the engineers and their tools. Over time, the room became filled with incredibly skilled people who were willing to break any rule if it helped their artists create new and interesting sounds.

It was this combination of playfulness, openness to risk-taking, and deep professionalism which enabled Abbey Road’s technicians to respond to seemingly off-the-wall requests from The Beatles. Engineers began to record amps inside cupboards to get unique sounds. The studio’s tape recorders were rewired to automatically double-track performances. The tapes themselves were sped-up, slowed-down, sliced, and looped—to great effect. Even a joke, Scott says, was turned into an engineering puzzle that he had to solve when John Lennon took him up on his “suggestion” to fit the entire band in a small utility closet for the recording of “Yer Blues.”

A sort of positive feedback loop was happening: Culture was driving the development of technologies which, in turn, emboldened that creative culture to go even farther to create new tools and techniques. This embrace of the unorthodox didn’t mean that the Abbey Road staff abandoned everything they had been taught in the “white coat days,” though. In fact, Scott says it was that training which gave engineers the necessary skills to successfully and intelligently break the rules and develop all those new sounds and techniques.


When you listen to recordings from a generation or two ago, though, you often hear all sorts of rough edges: large dynamic transitions between loud and quiet, the sounds of oversaturated tape and tubes, instruments bleeding together. Chunked notes. Vocals that are out of pitch. Drums that drift in and out of time. Mistakes. Lots of mistakes.

Today’s creative paradox is that this human element, which often makes a song distinct or artistically interesting, is the thing which is almost always erased from modern productions.

“Do mistakes make music better?” I asked Kehew. Not really, he responded. It’s just that, when it comes to what people like about music, there was actually only one thing worse than these imperfections: perfection.

“I’ve done it and seen it many times,” he said. “Take something flawed, work on it ’til every part is ‘improved’ then listen. It’s worse. How could that be? Every piece is now better. But it’s a worse final product.”

This tendency towards incessant improvement has been encouraged by the power of modern tools. These days, sounds are almost always passed through a computer at some point in the recording process. These computers have their own working paradigms—things like cutting-and-pasting, the automated repetition of tasks, and “infinite undo”—which give them incredible power to alter performances. That power also brings more potential for overpolishing, and for something recording engineers call “option paralysis”: a state in which the sheer number of available choices prevents decisions from being made. Almost any element of a recording can be changed, right up until the moment a song is released to the public.

The limitations of Beatles-era technology were substantial by comparison, and they forced a commitment to creative choices at earlier stages of the recording process. If, for example, an engineer wanted to exceed the number of recorded tracks that their tape machine allowed, two or more tracks had to be mixed together and “bounced” to an open track elsewhere. Cuts were physical, done with razor blades and tape. Mixes were performed by engineers in real time. Big mistakes at any point in the process could force an entire recording to be scrapped.

It was because artists were often stuck with the mistakes they made that they sometimes decided to embrace them. Once, while recording a Beatles song called “Glass Onion,” Scott accidentally erased a large number of drum parts that had been painstakingly overdubbed. Certain that he’d be fired, he played the tape to John Lennon. To Scott’s surprise, Lennon said that he liked the unexpected effect created by the glitch—and both the track and Scott stayed.

Scott was clear in his opinion: It isn’t so much the use of these new tools as it is their overuse that serves to undermine musicality.

“The trick,” Kehew says, “is a savvy or talented producer or engineer knows when to be bold and stop. To let character and roughness and lack of polish exist. I can bet most people spend more time polishing something than writing or creating the substance of it. The only cure is to work faster, more often, so you don’t treat every damn thing as being so precious that ‘It Must Be Perfect For All Time.’”

I asked Kevin Ryan if he was able to heed Scott’s warning in his own work. He laughed and acknowledged that knowing the risks of overusing digital tools didn’t make it any easier for him to resist that temptation. Kehew’s final word on the subject was, I thought, an especially Beatle-like principle for not overworking something: “Let it be what it was,” he says. “If it’s not that good, you shouldn’t be recording it.”


Today, Abbey Road straddles a line between modern culture and English Heritage. It has become Pop Music’s Westminster Abbey: partly a tourist attraction, partly a working cathedral where all the traditional rites and rituals are still observed.

Abbey Road is still producing hits though—even as tighter budgets and rising costs have caused many other recording facilities to close. An almost unbelievable number of influential artists and projects have worked (and continue to work) at the studio. Even if you eliminated the entire Beatles oeuvre, the list is impressive. Pink Floyd’s “Dark Side of the Moon” was tracked there. Acts like Kate Bush, Elton John, Oasis, Nick Cave and the Bad Seeds, Green Day, U2, Radiohead, and Kanye West have all recorded there. Countless film scores, too—Star Wars, Raiders of the Lost Ark, Lord of the Rings.


What’s Up With That: Why It’s So Hard to Catch Your Own Typos | Science | WIRED



Typos suck. They are saboteurs, undermining your intent, causing your resume to land in the “pass” pile, or providing sustenance for an army of pedantic critics. Frustratingly, they are usually words you know how to spell, but somehow skimmed over in your rounds of editing. If we are our own harshest critics, why do we miss those annoying little details?


“When you’re writing, you’re trying to convey meaning. It’s a very high level task,” said Tom Stafford, a psychologist at the University of Sheffield.

As with all high level tasks, your brain generalizes simple, component parts (like turning letters into words and words into sentences) so it can focus on more complex tasks (like combining sentences into complex ideas). “We don’t catch every detail, we’re not like computers or NSA databases,” said Stafford. “Rather, we take in sensory information and combine it with what we expect, and we extract meaning.” When we’re reading other people’s work, this helps us arrive at meaning faster by using less brain power. When we’re proofreading our own work, we know the meaning we want to convey. Because we expect that meaning to be there, it’s easier for us to miss when parts (or all) of it are absent. The reason we don’t see our own typos is because what we see on the screen is competing with the version that exists in our heads.


Generalization is the hallmark of all higher-level brain functions. It’s similar to how our brains build maps of familiar places, compiling the sights, smells, and feel of a route. That mental map frees your brain up to think about other things. Sometimes this works against you, like when you accidentally drive to work on your way to a barbecue, because the route to your friend’s house includes a section of your daily commute. We can become blind to details because our brain is operating on instinct. By the time you proofread your own work, your brain already knows the destination.

This explains why your readers are more likely to pick up on your errors. Even if you are using words and concepts that they are also familiar with, their brains are on this journey for the first time, so they are paying more attention to the details along the way and not anticipating the final destination.

But even if familiarization handicaps your ability to pick out mistakes in the long run, we’re actually pretty awesome at catching ourselves in the act. (According to Microsoft, backspace is the third-most used button on the keyboard.) In fact, touch typists—people who can type without looking at their fingers—know they’ve made a mistake even before it shows up on the screen. Their brain is so used to turning thoughts into letters that it alerts them when they make even minor mistakes, like hitting the wrong key or transposing two characters. In a study published earlier this year, Stafford and a colleague covered both the screen and keyboard of typists and monitored their word rate. These “blind” typists slowed down their word rate just before they made a mistake.

Touch typists are working off a subconscious map of the keyboard. As they type, their brains are instinctually preparing for their next move. “But, there’s a lag between the signal to hit the key and the actual hitting of the key,” Stafford said. In that split second, your brain has time to run the signal it sent your finger through a simulation telling it what the correct response will feel like. When it senses an error, it sends a signal to the fingers, slowing them down so they have more time to adjust.

As any typist knows, hitting keys happens too fast to divert a finger when it’s in the process of making a mistake. But, Stafford says this evolved from the same mental mechanism that helped our ancestors’ brains make micro adjustments when they were throwing spears.

Unfortunately, that kind of instinctual feedback doesn’t exist in the editing process. When you’re proofreading, you are trying to trick your brain into pretending that it’s reading the thing for the first time. Stafford suggests that if you want to catch your own errors, you should try to make your work as unfamiliar as possible. Change the font or background color, or print it out and edit by hand. “Once you’ve learned something in a particular way, it’s hard to see the details without changing the visual form,” he said.

The Fasinatng … Frustrating … Fascinating History of Autocorrect | Gadget Lab | WIRED


It’s not too much of an exaggeration to call autocorrect the overlooked underwriter of our era of mobile prolixity. Without it, we wouldn’t be able to compose windy love letters from stadium bleachers, write novels on subway commutes, or dash off breakup texts while in line at the post office. Without it, we probably couldn’t even have phones that look anything like the ingots we tickle—the whole notion of touchscreen typing, where our podgy physical fingers are expected to land with precision on tiny virtual keys, is viable only when we have some serious software to tidy up after us. Because we know autocorrect is there as brace and cushion, we’re free to write with increased abandon, at times and in places where writing would otherwise be impossible. Thanks to autocorrect, the gap between whim and word is narrower than it’s ever been, and our world is awash in easily rendered thought.


I find him in a drably pastel conference room at Microsoft headquarters in Redmond, Washington. Dean Hachamovitch—inventor on the patent for autocorrect and the closest thing it has to an individual creator—reaches across the table to introduce himself.


Hachamovitch, now a vice president at Microsoft and head of data science for the entire corporation, is a likable and modest man. He freely concedes that he types teh as much as anyone. (Almost certainly he does not often type hte. As researchers have discovered, initial-letter transposition is a much rarer error.)


The notion of autocorrect was born when Hachamovitch began thinking about a functionality that already existed in Word. Thanks to Charles Simonyi, the longtime Microsoft executive widely recognized as the father of graphical word processing, Word had a “glossary” that could be used as a sort of auto-expander. You could set up a string of words—like insert logo—which, when typed and followed by a press of the F3 button, would get replaced by a JPEG of your company’s logo. Hachamovitch realized that this glossary could be used far more aggressively to correct common mistakes. He drew up a little code that would allow you to press the left arrow and F3 at any time and immediately replace teh with the. His aha moment came when he realized that, because English words are space-delimited, the space bar itself could trigger the replacement, to make correction … automatic! Hachamovitch drew up a list of common errors, and over the next years he and his team went on to solve many of the thorniest. Seperate would automatically change to separate. Accidental cap locks would adjust immediately (making dEAR grEG into Dear Greg). One Microsoft manager dubbed them the Department of Stupid PC Tricks.
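The space-bar trigger described above is easy to sketch. This is a toy reconstruction, not Microsoft’s actual code: a dictionary of common errors (seeded with the article’s examples, teh and seperate) plus a heuristic caps-lock fixer, applied to each word the moment a space delimits it.

```python
# Common-error replacements, keyed by the lowercase form of the typed word.
CORRECTIONS = {
    "teh": "the",
    "seperate": "separate",
}

def fix_caps_lock(word):
    """Heuristic: a lowercase first letter followed by mostly-uppercase
    letters (dEAR, grEG) suggests the caps lock slipped on mid-word."""
    rest = word[1:]
    if word[:1].islower() and rest and sum(c.isupper() for c in rest) > len(rest) / 2:
        return word[0].upper() + rest.lower()
    return word

def autocorrect_stream(text):
    """Replay keystrokes one by one; every space bar press triggers a
    correction pass on the word that was just finished."""
    out, word = [], []
    for ch in text:
        if ch == " ":
            w = "".join(word)
            w = CORRECTIONS.get(w.lower(), fix_caps_lock(w))
            out.append(w)
            out.append(" ")
            word = []
        else:
            word.append(ch)
    out.append("".join(word))  # trailing word: no space yet, so no correction
    return "".join(out)

print(autocorrect_stream("teh dEAR grEG "))  # -> "the Dear Greg "
```

Note that the final word is left untouched until a space arrives, which is exactly the behavior the space-delimited trigger implies.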


One day Hachamovitch went into his boss’s machine and changed the autocorrect dictionary so that any time he typed Dean it was automatically changed to the name of his coworker Mike, and vice versa. (His boss kept both his computer and office locked after that.) Children were even quicker to grasp the comedic ramifications of the new tool. After Hachamovitch went to speak to his daughter’s third-grade class, he got emails from parents that read along the lines of “Thank you for coming to talk to my daughter’s class, but whenever I try to type her name I find it automatically transforms itself into ‘The pretty princess.’”


On idiom, some of its calls seemed fairly clear-cut: gorilla warfare became guerrilla warfare, for example, even though a wildlife biologist might find that an inconvenient assumption. But some of the calls were quite tricky, and one of the trickiest involved the issue of obscenity. On one hand, Word didn’t want to seem priggish; on the other, it couldn’t very well go around recommending the correct spelling of mothrefukcer. Microsoft was sensitive to these issues. The solution lay in expanding one of spell-check’s most special lists, bearing the understated title: “Words which should neither be flagged nor suggested.”


One day Vignola sent Bill Gates an email. (Thorpe couldn’t recall who Bill Vignola was or what he did.) Whenever Bill Vignola typed his own name in MS Word, the email to Gates explained, it was automatically changed to Bill Vaginal. Presumably Vignola caught this sometimes, but not always, and no doubt this serious man was sad to come across like a character in a Thomas Pynchon novel. His email made it down the chain of command to Thorpe. And Bill Vaginal wasn’t the only complainant: As Thorpe recalls, Goldman Sachs was mad that Word was always turning it into Goddamn Sachs.

Thorpe went through the dictionary and took out all the words marked as “vulgar.” Then he threw in a few anatomical terms for good measure. The resulting list ran to hundreds of entries:

anally, asshole, battle-axe, battleaxe, bimbo, booger, boogers, butthead, Butthead …

With these sorts of master lists in place—the corrections, the exceptions, and the to-be-primly-ignored—the joists of autocorrect, then still a subdomain of spell-check, were in place for the early releases of Word. Microsoft’s dominance at the time ensured that autocorrect became globally ubiquitous, along with some of its idiosyncrasies. By the early 2000s, European bureaucrats would begin to notice what came to be called the Cupertino effect, whereby the word cooperation (bizarrely included only in hyphenated form in the standard Word dictionary) would be marked wrong, with a suggested change to Cupertino. There are thus many instances where one parliamentary back-bencher or another longs for increased Cupertino between nations. Since then, linguists have adopted the word cupertino as a term of art for such trapdoors that have been assimilated into the language.


Autocorrection is no longer an overqualified intern drawing up lists of directives; it’s now a vast statistical affair in which petabytes of public words are examined to decide when a usage is popular enough to become a probabilistically savvy replacement. The work of the autocorrect team has been made algorithmic and outsourced to the cloud.

A handful of factors are taken into account to weight the variables: keyboard proximity, phonetic similarity, linguistic context. But it’s essentially a big popularity contest. A Microsoft engineer showed me a slide where somebody was trying to search for the long-named Austrian action star who became governor of California. Schwarzenegger, he explained, “is about 10,000 times more popular in the world than its variants”—Shwaranegar or Scuzzynectar or what have you. Autocorrect has become an index of the most popular way to spell and order certain words.
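The “popularity contest” can be illustrated with a toy scorer. Nothing here reflects any vendor’s real algorithm: the frequency numbers are invented, and string similarity (via Python’s standard-library SequenceMatcher) stands in for the keyboard-proximity and phonetic signals the engineer mentions.

```python
from difflib import SequenceMatcher

# Hypothetical relative frequencies, standing in for corpus statistics.
POPULARITY = {
    "schwarzenegger": 10_000,
    "shwaranegar": 1,
    "scuzzynectar": 1,
}

def score(typed, candidate):
    # Similarity in [0, 1] approximates closeness to what was typed;
    # multiplying by popularity lets the common spelling dominate.
    similarity = SequenceMatcher(None, typed, candidate).ratio()
    return similarity * POPULARITY[candidate]

def best_correction(typed):
    return max(POPULARITY, key=lambda c: score(typed.lower(), c))

print(best_correction("Shwaranegar"))  # -> schwarzenegger
```

Because popularity multiplies similarity, even a perfectly typed rare variant loses to a roughly similar spelling that is 10,000 times more common — the winner-take-all dynamic the passage describes.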

When English spelling was first standardized, it was by the effective fiat of those who controlled the communicative means of production. Dictionaries and usage guides have always represented compromises between top-down prescriptivists—those who believe language ought to be used a certain way—and bottom-up descriptivists—those who believe, instead, that there’s no ought about it.

The emerging consensus on usage will be a matter of statistical arbitration, between the way “most” people spell something and the way “some” people do. If it proceeds as it has, it’s likely to be a winner-take-all affair, as alternatives drop out. (Though Apple’s recent introduction of personalized, “contextual” autocorrect—which can distinguish between the language you use with your friends and the language you use with your boss—might complicate that process of standardization and allow us the favor of our characteristic errors.)


The possibility of linguistic communication is grounded in the fact of what some philosophers of language have called the principle of charity: The first step in a successful interpretation of an utterance is the belief that it somehow accords with the universe as we understand it. This means that we have a propensity to take a sort of ownership over even our errors, hoping for the possibility of meaning in even the most perverse string of letters. We feel honored to have a companion like autocorrect who trusts that, despite surface clumsiness or nonsense, inside us always smiles an articulate truth.


Today the influence of autocorrect is everywhere: A commenter on the Language Log blog recently mentioned hearing of an entire dialect in Asia based on phone cupertinos, where teens used the first suggestion from autocomplete instead of their chosen word, thus creating a slang that others couldn’t decode. (It’s similar to the Anglophone teenagers who, in a previous texting era, claimed to have replaced the term of approval cool with that of book because of happenstance T9 input priority.) Surrealists once encouraged the practice of écriture automatique, or automatic writing, in order to reveal the peculiar longings of the unconscious. The crackpot suggestions of autocorrect have become our own form of automatic writing—but what they reveal are the peculiar statistics of a world id.

An Architect’s Dream – Kate Bush

That bit there, it was an accident

But he’s so pleased

It’s the best mistake, he could make

And it’s my favourite piece

It’s just great

The flick of a wrist

Twisting down to the hips

So the lovers begin, with a kiss

In a tryst

It’s just a smudge

But what it becomes

In his hands…

Curving and sweeping

Rising and reaching

I could feel what he was feeling

Lines like these have got to be

An architect’s dream