Tag Archives: memory

The ‘Memoire’ Typeface Changes Like a Memory as You Use It | WIRED

Memoire is a typeface that degrades with every use.

Source: The ‘Memoire’ Typeface Changes Like a Memory as You Use It | WIRED

A memory is a mutable thing. One moment it’s in our minds, hard as concrete. The next, it’s still there, yet different. With each passing day, the outline of what we remember softens; the veracity of what we’ve experienced gradually takes on a surreal filter.

The poignancy of this fact—that memories transform with time—is a frequent source of inspiration for artists and designers, who have long grappled with how to best convey the tenuous relationship between reality and perception. A new typeface called Memoire is designed to reflect the ever-shifting shapes memories take as we replay them in our minds. Designers Ryan Bugden and Michelle Wainer created the custom typeface for La Petite Mort, a biannual magazine produced by New York creative agency Sub Rosa. The font, used as the headline typeface for each of the magazine’s 16 stories, evolves from page to page. “The core idea was that it would change over time—similar to how every time you revisit a memory, it in fact changes based on the current context you’re in,” Wainer says.

It’s hard to see the changes at first. The sharpness of the serifs softens almost imperceptibly with every use. On the first page, edges are knife-like; by the last, they are almost friendly in their roundness. “The experience we had in mind was very subtle, something you feel before you notice,” Bugden says.

Memoire is loosely based on De Vinne, a peculiar metal typeface Gustav Schroeder designed in the 19th century. In the days of metal type, degradation was an inevitable part of the process. Every time a letter was pressed, its edges softened “like eroding mountains,” Wainer says. Of course, this process played out more gradually than it does with Memoire, where the degeneration is guided and intentional, facilitated by technology.

Bugden started by drawing two master fonts: the first a crisp-edged serif, the second a softer variation. From there he used software to generate fonts two through 15, which change by subtle degrees. Any alteration to the intermediary stages required a change to the master fonts.
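The mechanics Bugden describes (two hand-drawn masters, with every font in between generated by software) lend themselves to a simple illustration. Below is a minimal sketch of that kind of interpolation, assuming glyph outlines are stored as lists of (x, y) control points with matching structure in both masters; the function names and toy data are mine, not the designers’ actual tooling.

```python
# Minimal sketch of master-based font interpolation (illustrative only;
# not the designers' actual tooling). Assumes each glyph outline is a
# list of (x, y) control points, structured identically in both masters.

def interpolate_outline(sharp, soft, t):
    """Blend two compatible outlines; t=0 gives the sharp master, t=1 the soft one."""
    return [
        (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        for (x0, y0), (x1, y1) in zip(sharp, soft)
    ]

def generate_fonts(sharp_master, soft_master, steps=16):
    """Produce the full sequence: the two masters plus the fonts in between."""
    return [
        {glyph: interpolate_outline(sharp_master[glyph], soft_master[glyph],
                                    i / (steps - 1))
         for glyph in sharp_master}
        for i in range(steps)
    ]

# A toy triangular "glyph" whose apex softens across the sequence.
sharp = {"A": [(0, 0), (50, 100), (100, 0)]}
soft = {"A": [(0, 0), (50, 88), (100, 0)]}
fonts = generate_fonts(sharp, soft)
print(fonts[0]["A"][1], fonts[-1]["A"][1])  # (50.0, 100.0) (50.0, 88.0)
```

A setup like this also explains the workflow constraint above: every intermediate font is derived, so a tweak to any middle stage has to be made in a master and then regenerated.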

At first, it appears that the typefaces increase in weight, but that’s not so. “What’s interesting about this typeface is it’s not the weight of the typeface we’re looking at, it’s the quality of the curves,” he says.

Memoire was designed for print, though it lives beautifully as a digital file, where you can see the transformation in hyper speed. Reading it in print is more of an exercise in perception, and in many ways, the typeface is a metaphor for memory layered upon yet another metaphor: By its very nature, the print page blurs and fades each time it’s touched.


“Tip-of-the-Tongue Syndrome,” Transactive Memory, and How the Internet Is Making Us Smarter | Brain Pickings


Vannevar Bush’s ‘memex’ — short for ‘memory index’ — a primitive vision for a personal hard drive for information storage and management.

“At their best, today’s digital tools help us see more, retain more, communicate more. At their worst, they leave us prey to the manipulation of the toolmakers. But on balance, I’d argue, what is happening is deeply positive. This book is about the transformation.”

[…]

One of Clive Thompson’s most fascinating and important points has to do with our outsourcing of memory — or, more specifically, our increasingly deft, search-engine-powered skills of replacing the retention of knowledge in our own brains with the on-demand access to knowledge in the collective brain of the internet. Think, for instance, of those moments when you’re trying to recall the name of a movie but only remember certain fragmentary features — the name of the lead actor, the gist of the plot, a song from the soundtrack. Thompson calls this “tip-of-the-tongue syndrome” and points out that, today, you’ll likely be able to reverse-engineer the name of the movie you don’t remember by plugging into Google what you do remember about it.

[…]

“Tip-of-the-tongue syndrome is an experience so common that cultures worldwide have a phrase for it. Cheyenne Indians call it navonotootse’a, which means “I have lost it on my tongue”; in Korean it’s hyeu kkedu-te mam-dol-da, which has an even more gorgeous translation: “sparkling at the end of my tongue.” The phenomenon generally lasts only a minute or so; your brain eventually makes the connection. But … when faced with a tip-of-the-tongue moment, many of us have begun to rely instead on the Internet to locate information on the fly. If lifelogging … stores “episodic,” or personal, memories, Internet search engines do the same for a different sort of memory: “semantic” memory, or factual knowledge about the world. When you visit Paris and have a wonderful time drinking champagne at a café, your personal experience is an episodic memory. Your ability to remember that Paris is a city and that champagne is an alcoholic beverage — that’s semantic memory.”

[…]

“Writing — the original technology for externalizing information — emerged around five thousand years ago, when Mesopotamian merchants began tallying their wares using etchings on clay tablets. It emerged first as an economic tool. As with photography and the telephone and the computer, newfangled technologies for communication nearly always emerge in the world of commerce. The notion of using them for everyday, personal expression seems wasteful, risible, or debased. Then slowly it becomes merely lavish, what “wealthy people” do; then teenagers take over and the technology becomes common to the point of banality.”

Thompson reminds us of the anecdote, by now itself familiar “to the point of banality,” about Socrates and his admonition that the “technology” of writing would devastate the Greek tradition of debate and dialectic, and would render people incapable of committing anything to memory because “knowledge stored was not really knowledge at all.” He cites Socrates’s parable of the Egyptian god Theuth and how he invented writing, offering it as a gift to the king of Egypt,

“This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

That resistance endured as technology changed shape, across the Middle Ages and past Gutenberg’s revolution, but it wasn’t without counter-resistance: Those who recorded their knowledge in writing and, eventually, collected it in the form of books argued that it expanded the scope of their curiosity and the ideas they were able to ponder, whereas the mere act of rote memorization made no guarantees of deeper understanding.

Ultimately, however, Thompson points out that Socrates was both right and wrong: It’s true that, with some deliberately cultivated exceptions and neurological outliers, few thinkers today rely on pure memorization and can recite extensive passages of text from memory. But what Socrates failed to see was the extraordinary dot-connecting enabled by access to knowledge beyond what our own heads can hold — because, as Amanda Palmer poignantly put it, “we can only connect the dots that we collect,” and the outsourcing of memory has exponentially enlarged our dot-collections.

With this in mind, Thompson offers a blueprint to this newly developed system of knowledge management in which access is critical:

“If you are going to read widely but often read books only once; if you’re going to tackle the ever-expanding universe of ideas by skimming and glancing as well as reading deeply; then you are going to rely on the semantic-memory version of gisting. By which I mean, you’ll absorb the gist of what you read but rarely retain the specifics. Later, if you want to mull over a detail, you have to be able to refind a book, a passage, a quote, an article, a concept.”

This, he argues, is also how and why libraries were born — the death of the purely oral world and the proliferation of print after Gutenberg placed new demands on organizing and storing human knowledge. And yet storage and organization soon proved to be radically different things:

“The Gutenberg book explosion certainly increased the number of books that libraries acquired, but librarians had no agreed-upon ways to organize them. It was left to the idiosyncrasies of each. A core job of the librarian was thus simply to find the book each patron requested, since nobody else knew where the heck the books were. This created a bottleneck in access to books, one that grew insufferable in the nineteenth century as citizens began swarming into public venues like the British Library. “Complaints about the delays in the delivery of books to readers increased,” as Matthew Battles writes in Library: An Unquiet History, “as did comments about the brusqueness of the staff.” Some patrons were so annoyed by the glacial pace of access that they simply stole books; one was even sentenced to twelve months in prison for the crime. You can understand their frustration. The slow speed was not just a physical nuisance, but a cognitive one.”

The solution came in the late 19th century by way of Melvil Dewey, whose decimal system imposed order by creating a taxonomy of book placement, eventually rendering librarians unnecessary — at least in their role as literal book-retrievers. They became, instead, curiosity sherpas who helped patrons decide what to read and carry out comprehensive research. In many ways, they came to resemble the editors and curators who help us navigate the internet today, framing for us what is worth attending to and why.

[…]

“The history of factual memory has been fairly predictable up until now. With each innovation, we’ve outsourced more information, then worked to make searching more efficient. Yet somehow, the Internet age feels different. Quickly pulling up [the answer to a specific esoteric question] on Google seems different from looking up a bit of trivia in an encyclopedia. It’s less like consulting a book than like asking someone a question, consulting a supersmart friend who lurks within our phones.”

And therein lies the magic of the internet — that unprecedented access to humanity’s collective brain. Thompson cites the work of Harvard psychologist Daniel Wegner, who first began exploring this notion of collective rather than individual knowledge in the 1980s by observing how partners in long-term relationships often divide and conquer memory tasks in sharing the household’s administrative duties:

“Wegner suspected this division of labor takes place because we have pretty good “metamemory.” We’re aware of our mental strengths and limits, and we’re good at intuiting the abilities of others. Hang around a workmate or a romantic partner long enough and you begin to realize that while you’re terrible at remembering your corporate meeting schedule, or current affairs in Europe, or how big a kilometer is relative to a mile, they’re great at it. So you begin to subconsciously delegate the task of remembering that stuff to them, treating them like a notepad or encyclopedia. In many respects, Wegner noted, people are superior to these devices, because what we lose in accuracy we make up in speed.

[…]

Wegner called this phenomenon “transactive” memory: two heads are better than one. We share the work of remembering, Wegner argued, because it makes us collectively smarter — expanding our ability to understand the world around us.”

[…]

This very outsourcing of memory requires that we learn what the machine knows — a kind of meta-knowledge that enables us to retrieve the information when we need it. And, reflecting on psychologist Betsy Sparrow’s findings, Thompson points out that this is neither new nor negative:

“We’ve been using transactive memory for millennia with other humans. In everyday life, we are only rarely isolated, and for good reason. For many thinking tasks, we’re dumber and less cognitively nimble if we’re not around other people. Not only has transactive memory not hurt us, it’s allowed us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone.”

[…]

Outsourcing our memory to machines rather than to other humans, in fact, offers certain advantages by pulling us into a seemingly infinite rabbit hole of indiscriminate discovery:

“In some ways, machines make for better transactive memory buddies than humans. They know more, but they’re not awkward about pushing it in our faces. When you search the Web, you get your answer — but you also get much more. Consider this: If I’m trying to remember what part of Pakistan has experienced many U.S. drone strikes and I ask a colleague who follows foreign affairs, he’ll tell me “Waziristan.” But when I queried this once on the Internet, I got the Wikipedia page on “Drone attacks in Pakistan.” A chart caught my eye showing the astonishing increase of drone attacks (from 1 a year to 122 a year); then I glanced down to read a précis of studies on how Waziristan residents feel about being bombed. (One report suggested they weren’t as opposed as I’d expected, because many hated the Taliban, too.) Obviously, I was procrastinating. But I was also learning more, reinforcing my schematic understanding of Pakistan.”

[…]

“The real challenge of using machines for transactive memory lies in the inscrutability of their mechanics. Transactive memory works best when you have a sense of how your partners’ minds work — where they’re strong, where they’re weak, where their biases lie. I can judge that for people close to me. But it’s harder with digital tools, particularly search engines. You can certainly learn how they work and develop a mental model of Google’s biases. … But search companies are for-profit firms. They guard their algorithms like crown jewels. This makes them different from previous forms of outboard memory. A public library keeps no intentional secrets about its mechanisms; a search engine keeps many. On top of this inscrutability, it’s hard to know what to trust in a world of self-publishing. To rely on networked digital knowledge, you need to look with skeptical eyes. It’s a skill that should be taught with the same urgency we devote to teaching math and writing.”

Thompson’s most important point, however, has to do with how outsourcing our knowledge to digital tools actually hampers the very process of creative thought, which relies on our ability to connect existing ideas from our mental pool of resources into new combinations, or what the French polymath Henri Poincaré famously termed “sudden illuminations.” Without a mental catalog of materials to mull over and let incubate in our fringe consciousness, our capacity for such illuminations is greatly deflated. Thompson writes:

“These eureka moments are familiar to all of us; they’re why we take a shower or go for a walk when we’re stuck on a problem. But this technique works only if we’ve actually got a lot of knowledge about the problem stored in our brains through long study and focus. … You can’t come to a moment of creative insight if you haven’t got any mental fuel. You can’t be googling the info; it’s got to be inside you.”

[…]

“Evidence suggests that when it comes to knowledge we’re interested in — anything that truly excites us and has meaning — we don’t turn off our memory. Certainly, we outsource when the details are dull, as we now do with phone numbers. These are inherently meaningless strings of information, which offer little purchase on the mind. … It makes sense that our transactive brains would hand this stuff off to machines. But when information engages us — when we really care about a subject — the evidence suggests we don’t turn off our memory at all.”

[…]

“In an ideal world, we’d all fit the Renaissance model — we’d be curious about everything, filled with diverse knowledge and thus absorbing all current events and culture like sponges. But this battle is age-old, because it’s ultimately not just technological. It’s cultural and moral and spiritual; “getting young people to care about the hard stuff” is a struggle that goes back centuries and requires constant societal arguments and work. It’s not that our media and technological environment don’t matter, of course. But the vintage of this problem indicates that the solution isn’t merely in the media environment either.”

[…]

“A tool’s most transformative uses generally take us by surprise.”

[…]

“How should you respond when you get powerful new tools for finding answers?

Think of harder questions.”

All You Have Eaten: On Keeping a Perfect Record | Longreads



Over the course of his or her lifetime, the average person will eat 60,000 pounds of food, the weight of six elephants.

The average American will drink over 3,000 gallons of soda. He will eat about 28 pigs, 2,000 chickens, 5,070 apples, and 2,340 pounds of lettuce. How much of that will he remember, and for how long, and how well?

[…]

The human memory is famously faulty; the brain remains mostly a mystery. We know that comfort foods make the pleasure centers in our brains light up the way drugs do. We know, because of a study conducted by Northwestern University and published in the Journal of Neuroscience, that by recalling a moment, you’re altering it slightly, like a mental game of Telephone—the more you conjure a memory, the less accurate it will be down the line. Scientists have implanted false memories in mice and grown memories in pieces of brain in test tubes. But we haven’t made many noteworthy strides in the thing that seems most relevant: how not to forget.

Unless committed to memory or written down, what we eat vanishes as soon as it’s consumed. That’s the point, after all. But because the famous diarist Samuel Pepys wrote, in his first entry, “Dined at home in the garret, where my wife dressed the remains of a turkey, and in the doing of it she burned her hand,” we know that Samuel Pepys, in the 1600s, ate turkey. We know that, hundreds of years ago, Samuel Pepys’s wife burned her hand. We know, because she wrote it in her diary, that Anne Frank at one point ate fried potatoes for breakfast. She once ate porridge and “a hash made from kale that came out of the barrel.”

For breakfast on January 2, 2008, I ate oatmeal with pumpkin seeds and brown sugar and drank a cup of green tea.

I know because it’s the first entry in a food log I still keep today. I began it as an experiment in food as a mnemonic device. The idea was this: I’d write something objective every day that would cue my memories into the future—they’d serve as compasses by which to remember moments.

Andy Warhol kept what he called a “smell collection,” switching perfumes every three months so he could reminisce more lucidly on those months whenever he smelled that period’s particular scent. Food, I figured, took this even further. It involves multiple senses, and that’s why memories that surround food can come on so strong.

What I’d like to have is a perfect record of every day. I’ve long been obsessed with this impossibility, that every day be perfectly productive and perfectly remembered. What I remember from January 2, 2008 is that after eating the oatmeal I went to the post office, where an old woman was arguing with a postal worker about postage—she thought what she’d affixed to her envelope was enough and he didn’t.

I’m terrified of forgetting. My grandmother has battled Alzheimer’s for years now, and to watch someone battle Alzheimer’s—we say “battle,” as though there’s some way of winning—is terrifying. If I’m always thinking about dementia, my unscientific logic goes, it can’t happen to me (the way an earthquake comes when you don’t expect it, and so the best course of action is always to expect it). “Really, one might almost live one’s life over, if only one could make a sufficient effort of recollection” is a sentence I once underlined in John Banville’s The Sea (a book that I can’t remember much else about). But effort alone is not enough and isn’t particularly reasonable, anyway. A man named Robert Shields kept the world’s longest diary: he chronicled every five minutes of his life until a stroke in 2006 rendered him unable to. He wrote about microwaving foods, washing dishes, bathroom visits, writing itself. When he died in 2007, he left 37.5 million words behind—ninety-one boxes of paper. Reading his obituary, I wondered if Robert Shields ever managed to watch a movie straight through.

Last spring, as part of a NASA-funded study, a crew of three men and three women with “astronaut-like” characteristics spent four months in a geodesic dome in an abandoned quarry on the northern slope of Hawaii’s Mauna Loa volcano.

For those four months, they lived and ate as though they were on Mars, only venturing outside to the surrounding Mars-like, volcanic terrain, in simulated space suits. Hawaii Space Exploration Analog and Simulation (HI-SEAS) is a four-year project: a series of missions meant to simulate and study the challenges of long-term space travel, in anticipation of mankind’s eventual trip to Mars. This first mission’s focus was food.

Getting to Mars will take roughly six to nine months each way, depending on trajectory; the mission itself will likely span years. So the question becomes: How do you feed astronauts for so long? On “Mars,” the HI-SEAS crew alternated between two days of pre-prepared meals and two days of dome-cooked meals of shelf-stable ingredients. Researchers were interested in a number of behavioral questions: among them, the well-documented phenomenon of menu fatigue (when International Space Station astronauts grow weary of their packeted meals, they tend to lose weight). They wanted to see what patterns would evolve over time if a crew’s members were allowed dietary autonomy, and given the opportunity to cook for themselves (“an alternative approach to feeding crews of long term planetary outposts,” read the open call).

Everything was hyper-documented. Everything eaten was logged in painstaking detail: weighed, filmed, and evaluated. The crew filled in surveys before and after meals: queries into how hungry they were, their first impressions, their moods, how the food smelled, what its texture was, how it tasted. They documented their time spent cooking; their water usage; the quantity of leftovers, if any. The goal was to measure the effect of what they ate on their health and morale, along with other basic questions concerning resource use. How much water will it take to cook on Mars? How much water will it take to wash dishes? How much time is required; how much energy? How will everybody feel about it all?
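To make the scope of that record-keeping concrete, here is a hypothetical sketch of a single meal entry as a data record, using only the kinds of fields the passage describes; every name, scale, and value below is invented for illustration and is not the study’s actual schema.

```python
# Hypothetical meal-log record for a HI-SEAS-style food study; all field
# names, scales, and values are invented for illustration.
from dataclasses import dataclass

@dataclass
class MealLog:
    date: str                     # e.g. "2013-05-02" (example date only)
    meal_type: str                # "pre-prepared" or "dome-cooked"
    ingredients_grams: dict       # every ingredient weighed before cooking
    pre_meal_hunger: int          # pre-meal survey (assumed 1-9 scale)
    post_meal_mood: int           # post-meal survey (assumed 1-9 scale)
    smell_rating: int             # first impressions of the meal
    texture_rating: int
    taste_rating: int
    cooking_minutes: float        # time spent cooking
    water_liters_cooking: float   # water used to cook
    water_liters_dishes: float    # water used to wash dishes
    leftovers_grams: float        # quantity of leftovers, if any

entry = MealLog(
    date="2013-05-02",
    meal_type="dome-cooked",
    ingredients_grams={"rice": 300, "freeze-dried chicken": 120},
    pre_meal_hunger=7,
    post_meal_mood=8,
    smell_rating=6,
    texture_rating=5,
    taste_rating=7,
    cooking_minutes=45.0,
    water_liters_cooking=1.5,
    water_liters_dishes=3.0,
    leftovers_grams=0.0,
)
print(f"{entry.meal_type}: {sum(entry.ingredients_grams.values())} g cooked")
```

Even a toy record like this makes the researchers’ questions legible: summing water_liters_cooking and water_liters_dishes across entries is exactly how you would answer how much water cooking on Mars takes.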

[…]

The main food study had a big odor identification component to it: the crew took scratch-n-sniff tests, which Kate said she felt confident about at the mission’s start, and less certain about near the end. “The second-to-last test,” she said, “I would smell grass and feel really wistful.” Their noses were mapped with sonograms because, in space, the shape of your nose changes. And there were, on top of this, studies unrelated to food. They exercised in anti-microbial shirts (laundry doesn’t happen in space), evaluated their experiences hanging out with robot pets, and documented their sleep habits.

[…]

“We all had relationships outside that we were trying to maintain in some way,” Kate said. “Some were kind of new, some were tenuous, some were old and established, but they were all very difficult to maintain. A few things that could come off wrong in an e-mail could really bum you out for a long time.”

She told me about another crew member whose boyfriend didn’t email her at his usual time. This was roughly halfway through the mission. She started to get obsessed with the idea that maybe he got into a car accident. “Like seriously obsessed,” Kate said. “I was like, ‘I think your brain is telling you things that aren’t actually happening. Let’s just be calm about this,’ and she was like, ‘Okay, okay.’ But she couldn’t sleep that night. In the end he was just like, ‘Hey, what’s up?’ I knew he would be fine, but I could see how she could think something serious had happened.”

“My wife sent me poems every day but for a couple days she didn’t,” Kate said. “Something was missing from those days, and I don’t think she could have realized how important they were. It was weird. Everything was bigger inside your head because you were living inside your head.”

[…]

When I look back on my meals from the past year, the food log does the job I intended more or less effectively.

I can remember, with some clarity, the particulars of given days: who I was with, how I was feeling, the subjects discussed. There was the night in October I stress-scarfed a head of romaine and peanut butter packed onto old, hard bread; the somehow not-sobering bratwurst and fries I ate on day two of a two-day hangover, while trying to keep things light with somebody to whom, the two nights before, I had aired more than I meant to. There was the night in January I cooked “rice, chicken stirfry with bell pepper and mushrooms, tomato-y Chinese broccoli, 1 bottle IPA” with my oldest, best friend, and we ate the stirfry and drank our beers slowly while commiserating about the most recent conversations we’d had with our mothers.

But reading the entries from 2008, that first year, does something else to me: it suffuses me with the same mortification as if I’d written down my most private thoughts (that reaction is what keeps me from maintaining a more conventional journal). There’s nothing especially incriminating about my diet, except maybe that I ate tortilla chips with unusual frequency, but the fact that it’s just food doesn’t spare me from the horror and head-shaking that comes with reading old diaries. Mentions of certain meals conjure specific memories, but mostly what I’m left with are the general feelings from that year. They weren’t happy ones. I was living in San Francisco at the time. A relationship was dissolving.

It seems to me that the success of a relationship depends on a shared trove of memories. Or not shared, necessarily, but not incompatible. That’s the trouble, I think, with parents and children: parents retain memories of their children that the children themselves don’t share. My father’s favorite meal is breakfast and his favorite breakfast restaurant is McDonald’s, and I remember—having just read Michael Pollan or watched Super Size Me—self-righteously not ordering my regular Egg McMuffin one morning, and how that actually hurt him.

When a relationship goes south, it’s hard to pinpoint just where or how—especially after a prolonged period of it heading that direction. I was at a loss with this one. Going forward, I didn’t want not to be able to account for myself. If I could remember everything, I thought, I’d be better equipped; I’d be better able to make proper, comprehensive assessments—informed decisions. But my memory had proved itself unreliable, and I needed something better. Writing down food was a way to turn my life into facts: if I had all the facts, I could keep them straight. So the next time this happened I’d know exactly why—I’d have all the data at hand.

In the wake of that breakup there were stretches of days and weeks of identical breakfasts and identical dinners. Those days and weeks blend into one another, become indistinguishable, and who knows whether I was too sad to be imaginative or all the unimaginative food made me sadder.

[…]

“I’m always really curious about who you are in a different context. Who am I completely removed from Earth—or pretending to be removed from Earth? When you’re going further and further from this planet, with all its rules and everything you’ve ever known, what happens? Do you invent new rules? What matters to you when you don’t have constructs? Do you take the constructs with you? On an individual level it was an exploration of who I am in a different context, and on a larger scale, going to another planet is an exploration about what humanity is in a different context.”

[…]

What I remember is early that evening, drinking sparkling wine and spreading cream cheese on slices of a soft baguette from the fancy Key Biscayne Publix, then spooning grocery-store caviar onto it (“Lumpfish caviar and Prosecco, definitely, on the balcony”). I remember cooking dinner unhurriedly (“You were comparing prices for the seafood and I was impatient”)—the thinnest pasta I could find, shrimp and squid cooked in wine and lots of garlic—and eating it late (“You cooked something good, but I can’t remember what”) and then drinking a café Cubano even later (“It was so sweet it made our teeth hurt and then, for me at least, immediately precipitated a metabolic crisis”) and how, afterward, we all went to the empty beach and got in the water which was, on that warm summer day, not even cold (“It was just so beautiful after the rain”).

“And this wasn’t the same trip,” wrote that wrong-for-me then-boyfriend, “but remember when you and I walked all the way to that restaurant in Bill Baggs park, at the southern tip of the island, and we had that painfully sweet white sangria, and ceviche, and walked back and got tons of mosquito bites, but we didn’t care, and then we were on the beach somehow and we looked at the red lights on top of all the buildings, and across the channel at Miami Beach, and went in the hot Miami ocean, and most importantly it was National Fish Day?”

And it’s heartening to me that I do remember all that—had remembered without his prompting, or consulting the record (I have written down: “D: ceviche; awful sangria; fried plantains; shrimp paella.” “It is National fish day,” I wrote. “There was lightning all night!”). It’s heartening that my memory isn’t as unreliable as I worry it is. I remember it exactly as he describes: the too-sweet sangria at that restaurant on the water, how the two of us had giggled so hard over nothing and declared that day “National Fish Day,” finding him in the kitchen at four in the morning, dipping a sausage into mustard—me taking that other half of the sausage, dipping it into mustard—the two of us deciding to drive the six hours back to Gainesville, right then.

“That is a really happy memory,” he wrote to me. “That is my nicest memory from that year and from that whole period. I wish we could live it again, in some extra-dimensional parallel life.”

Three years ago I moved back to San Francisco, which was, for me, a new-old city.

I’d lived there twice before. The first time I lived there was a cold summer in 2006, during which I met that man I’d be broken up about a couple years later. And though that summer was before I started writing down the food, and before I truly learned how to cook for myself, I can still remember flashes: a dimly lit party and drinks with limes in them and how, ill-versed in flirting, I took the limes from his drink and put them into mine. I remember a night he cooked circular ravioli he’d bought from an expensive Italian grocery store, and zucchini he’d sliced into thin coins. I remember him splashing Colt 45—leftover from a party—into the zucchini as it was cooking, and all of that charming me: the Colt 45, the expensive ravioli, this dinner of circles.

The second time I lived in San Francisco was the time our thing fell apart. This was where my terror had originated: where I remembered the limes and the ravioli, he remembered or felt the immediacy of something else, and neither of us was right or wrong to remember what we did—all memories, of course, are valid—but still, it sucked. And now I have a record reminding me of the nights I came home drunk and sad and, with nothing else in the house, sautéed kale; blanks on the days I ran hungry to Kezar Stadium from the Lower Haight, running lap after lap after lap to turn my brain off, stopping to read short stories at the bookstore on the way home, all to turn off the inevitable thinking, and at home, of course, the inevitable thinking.

[…]

I’m not sure what to make of this data—what conclusions, if any, to draw. What I know is that it accumulates and disappears and accumulates again. No matter how vigilantly we keep track—even if we spend four months in a geodesic dome on a remote volcano with nothing to do but keep track—we experience more than we have the capacity to remember; we eat more than we can retain; we feel more than we can possibly carry with us. And maybe forgetting isn’t so bad. I know there is the “small green apple” from the time we went to a moving sale and he bought bricks, and it was raining lightly, and as we were gathering the bricks we noticed an apple tree at the edge of the property with its branches overhanging into the yard, and we picked two small green apples that’d been washed by the rain, and wiped them off on our shirts. They surprised us by being sweet and tart and good. We put the cores in his car’s cup holders. There was the time he brought chocolate chips and two eggs and a Tupperware of milk to my apartment, and we baked cookies. There are the times he puts candy in my jacket’s small pockets—usually peppermints so ancient they’ve melted and re-hardened inside their wrappers—which I eat anyway, and then are gone, but not gone.