Monday, May 23, 2016

Which Rock Star Will Historians of the Future Remember?

Classifying anyone as the “most successful” at anything tends to reflect more on the source than the subject. So keep that in mind when I make the following statement: John Philip Sousa is the most successful American musician of all time.

Marching music is a maddeningly durable genre, recognizable to pretty much everyone who has lived in the United States for any period. It works as a sonic shorthand for any filmmaker hoping to evoke the late 19th century and serves as the auditory backdrop for national holidays, the circus and college football. It’s not “popular” music, but it’s entrenched within the popular experience. It will be no less fashionable tomorrow than it is today.

And this entire musical idiom is now encapsulated in one person: John Philip Sousa. Even the most cursory two-sentence description of marching music inevitably cites him by name. I have no data on this, but I would assert that if we were to ask the entire population of the United States to name every composer of marching music they could think of, 98 percent of the populace would name either one person (Sousa) or no one at all. There’s just no separation between the awareness of this person and the awareness of this music, and it’s hard to believe that will ever change.

Now, the reason this happened — or at least the explanation we’ve decided to accept — is that Sousa was simply the best at this art. He composed 136 marches over a span of six decades and is regularly described as the most famous musician of his era. The story of his life and career has been shoehorned into the U.S. education curriculum at a fundamental level. (I first learned of Sousa in fourth grade, a year before we memorized the state capitals.) And this, it seems, is how mainstream musical memory works. As the timeline moves forward, tangential artists in any field fade from the collective radar, until only one person remains; the significance of that individual is then exaggerated, until the genre and the person become interchangeable. Sometimes this is easy to predict: I have zero doubt that the worldwide memory of Bob Marley will eventually have the same tenacity and familiarity as the worldwide memory of reggae itself.

But envisioning this process with rock music is harder. Almost anything can be labeled “rock”: Metallica, ABBA, Mannheim Steamroller, a haircut, a muffler. If you’re a successful tax lawyer who owns a hot tub, clients will refer to you as a “rock-star C.P.A.” when describing your business to less-hip neighbors. The defining music of the first half of the 20th century was jazz; the defining music of the second half of the 20th century was rock, but with an ideology and saturation far more pervasive. Only television surpasses its influence.

And pretty much from the moment it came into being, people who liked rock insisted it was dying. The critic Richard Meltzer supposedly claimed that rock was already dead in 1968. And he was wrong to the same degree that he was right. Meltzer’s wrongness is obvious and does not require explanation, unless you honestly think “Purple Rain” is awful. But his rightness is more complicated: Rock is dead, in the sense that its “aliveness” is a subjective assertion based on whatever criteria the listener happens to care about.

This is why the essential significance of rock remains a plausible thing to debate, as does the relative value of major figures within that system (the Doors, R.E.M., Radiohead). It still projects the illusion of a universe containing multitudes. But it won’t seem that way in 300 years.

The symbolic value of rock is conflict-based: It emerged as a byproduct of the post-World War II invention of the teenager, soundtracking a 25-year period when the gap between generations was utterly real and uncommonly vast. That dissonance gave rock music a distinctive, nonmusical importance for a long time. But that period is over. Rock — or at least the anthemic, metaphoric, Hard Rock Cafe version of big rock — has become more socially accessible but less socially essential, synchronously shackled by its own formal limitations. Its cultural recession is intertwined with its cultural absorption. As a result, what we’re left with is a youth-oriented music genre that a) isn’t symbolically important; b) lacks creative potential; and c) has no specific tie to young people. It has completed its historical trajectory. Which means, eventually, it will exist primarily as an academic pursuit. It will exist as something people have to be taught to feel and understand.

I imagine a college classroom in 300 years, in which a hip instructor is leading a tutorial filled with students. These students relate to rock music with no more fluency than they do the music of Mesopotamia: It’s a style they’ve learned to recognize, but just barely (and only because they’ve taken this specific class). Nobody in the room can name more than two rock songs, except the professor. He explains the sonic structure of rock, its origins, the way it served as cultural currency and how it shaped and defined three generations of a global superpower. He shows the class a photo, or perhaps a hologram, of an artist who has been intentionally selected to epitomize the entire concept. For these future students, that singular image defines what rock was.

So what’s the image?

Certainly, there’s one response to this hypothetical that feels immediate and sensible: the Beatles. All logic points to their dominance. They were the most popular band in the world during the period they were active and are only slightly less popular now, five decades later. The Beatles defined the concept of what a “rock group” was supposed to be, and all subsequent rock groups are (consciously or unconsciously) modeled upon the template they naturally embodied. Their 1964 appearance on “The Ed Sullivan Show” is so regularly cited as the genesis for other bands that they arguably invented the culture of the 1970s, a decade when they were no longer together. The Beatles arguably invented everything, including the very notion of a band’s breaking up. There are still things about the Beatles that can’t be explained, almost to the point of the supernatural: the way their music resonates with toddlers, for example, or the way it resonated with Charles Manson. It’s impossible to imagine another rock group where half its members faced unrelated assassination attempts. In any reasonable world, the Beatles are the answer to the question “Who will be the Sousa of rock?”

But our world is not reasonable. And the way this question will be asked tomorrow is (probably) not the same way we would ask it today.

In Western culture, virtually everything is understood through the process of storytelling, often to the detriment of reality. When we recount history, we tend to use the life experience of one person — the “journey” of a particular “hero,” in the lingo of the mythologist Joseph Campbell — as a prism for understanding everything else. That inclination works to the Beatles’ communal detriment. But it buoys two other figures: Elvis Presley and Bob Dylan. The Beatles are the most meaningful group, but Elvis and Dylan are the towering individuals, so eminent that I wouldn’t necessarily need to use Elvis’s last name or Dylan’s first.

Still, neither is an ideal manifestation of rock as a concept.

It has been said that Presley invented rock and roll, but he actually staged a form of primordial “prerock” that barely resembles the post-“Rubber Soul” aesthetics that came to define what this music is. He also exited rock culture relatively early; he was pretty much out of the game by 1973. Conversely, Dylan’s career spans the entirety of rock. Yet he never made an album that “rocked” in any conventional way (the live album “Hard Rain” probably comes closest). Still, these people are rock people. Both are integral to the core of the enterprise and influenced everything we have come to understand about the form (including the Beatles themselves, a group that would not have existed without Elvis and would not have pursued introspection without Dylan).

In 300 years, the idea of “rock music” being represented by a two‑pronged combination of Elvis and Dylan would be equitable and oddly accurate. But the passage of time makes this progressively more difficult. It’s always easier for a culture to retain one story instead of two, and the stories of Presley and Dylan barely intersect (they supposedly met only once, in a Las Vegas hotel room). As I write this sentence, the social stature of Elvis and Dylan feels similar, perhaps even identical. But it’s entirely possible one of them will be dropped as time plods forward. And if that happens, the consequence will be huge. If we concede that the “hero’s journey” is the de facto story through which we understand history, the differences between these two heroes would profoundly alter the description of what rock music supposedly was.

If Elvis (minus Dylan) is the definition of rock, then rock is remembered as showbiz. Like Frank Sinatra, Elvis did not write songs; he interpreted songs that were written by other people (and like Sinatra, he did this brilliantly). But removing the centrality of songwriting from the rock equation radically alters it. Rock becomes a performative art form, where the meaning of a song matters less than the person singing it. It becomes personality music, and the dominant qualities of Presley’s persona — his sexuality, his masculinity, his larger‑than‑life charisma — become the dominant signifiers of what rock was. His physical decline and reclusive death become an allegory for the entire culture. The reminiscence of the rock genre adopts a tragic hue, punctuated by gluttony, drugs and the conscious theft of black culture by white opportunists.

But if Dylan (minus Elvis) becomes the definition of rock, everything reverses. In this contingency, lyrical authenticity becomes everything; rock is somehow calcified as an intellectual craft, interlocked with the folk tradition. It would be remembered as far more political than it actually was, and significantly more political than Dylan himself. The fact that Dylan does not have a conventionally “good” singing voice becomes retrospective proof that rock audiences prioritized substance over style, and the portrait of his seven‑decade voyage would align with the most romantic version of how an eclectic collection of autonomous states eventually became a place called “America.”

These are the two best versions of this potential process. And both are flawed.

There is, of course, another way to consider how these things might unspool, and it might be closer to the way histories are actually built. I’m creating a binary reality where Elvis and Dylan start the race to posterity as equals, only to have one runner fall and disappear. The one who remains “wins” by default (and maybe that happens). But it might work in reverse. A more plausible situation is that future people will haphazardly decide how they want to remember rock, and whatever they decide will dictate who is declared its architect. If the constructed memory is a caricature of big‑hair arena rock, the answer is probably Elvis; if it’s a buoyant, unrealistic apparition of punk hagiography, the answer is probably Dylan. But both conclusions direct us back to the same recalcitrant question: What makes us remember the things we remember?

In 2014, the jazz historian Ted Gioia published a short essay about music criticism that outraged a class of perpetually outraged music critics. Gioia’s assertion was that 21st‑century music writing has devolved into a form of lifestyle journalism that willfully ignores the technical details of the music itself. Many critics took this attack personally and accused Gioia of devaluing their vocation. Which is odd, considering the colossal degree of power Gioia ascribes to record reviewers: He believes specialists are the people who galvanize history. Critics have almost no impact on what music is popular at any given time, but they’re extraordinarily well positioned to dictate what music is reintroduced after its popularity has waned.

“Over time, critics and historians will play a larger role in deciding whose fame endures,” Gioia wrote me in an email. “Commercial factors will have less impact. I don’t see why rock and pop will follow any different trajectory from jazz and blues.” He rattled off several illustrative examples: Ben Selvin outsold Louis Armstrong in the 1920s. In 1956, Nelson Riddle and Les Baxter outsold “almost every rock ’n’ roll star not named Elvis,” but they’ve been virtually erased from the public record. A year after that, the closeted gay crooner Tab Hunter was bigger than Jerry Lee Lewis and Fats Domino, “but critics and music historians hate sentimental love songs. They’ve constructed a perspective that emphasizes the rise of rock and pushes everything else into the background. Transgressive rockers, in contrast, enjoy lasting fame.” He points to a contemporary version of that phenomenon: “Right now, electronic dance music probably outsells hip‑hop. This is identical to the punk‑versus‑disco trade‑off of the 1970s. My prediction: edgy hip‑hop music will win the fame game in the long run, while E.D.M. will be seen as another mindless dance craze.”

Gioia is touching on a variety of volatile ideas here, particularly the outsize memory of transgressive art. His example is the adversarial divide between punk and disco: In 1977, the disco soundtrack to “Saturday Night Fever” and the Sex Pistols’ “Never Mind the Bollocks, Here’s the Sex Pistols” were both released. The soundtrack to “Saturday Night Fever” has sold more than 15 million copies; it took “Never Mind the Bollocks” 15 years to go platinum. Yet virtually all pop historiographers elevate the importance of the Pistols above that of the Bee Gees. The same year the Sex Pistols finally sold the millionth copy of their debut, SPIN magazine placed them on a list of the seven greatest bands of all time. “Never Mind the Bollocks” is part of the White House record library, supposedly inserted by Amy Carter just before her dad lost to Ronald Reagan. The album’s reputation improves by simply existing: In 1985, the British publication NME classified it as the 13th‑greatest album of all time; in 1993, NME made a new list and decided it now deserved to be ranked third. This has as much to do with its transgressive identity as its musical integrity. The album is overtly transgressive (and therefore memorable), while “Saturday Night Fever” has been framed as a prefab totem of a facile culture (and thus forgettable). For more than three decades, that has been the overwhelming consensus.

But I’ve noticed — just in the last four or five years — that this consensus is shifting. Why? Because the definition of “transgressive” is shifting. It’s no longer appropriate to dismiss disco as superficial. More and more, we recognize how disco latently pushed gay, urban culture into white suburbia, which is a more meaningful transgression than going on a British TV talk show and swearing at the host. So is it possible that the punk‑disco polarity will eventually flip? Yes. It’s possible everyone could decide to reverse how we remember 1977. But there’s still another stage here, beyond that hypothetical inversion: the stage in which everybody who was around for punk and disco is dead and buried, and no one is left to contradict how that moment felt. When that happens, the debate over transgressions freezes and all that is left is the music. Which means the Sex Pistols could win again or maybe they lose bigger, depending on the judge.

by Chuck Klosterman, NY Times |  Read more:
Image: Sagmeister & Walsh

Sunday, May 22, 2016

The Train That Saved Denver

A decade ago, travelers arriving at Denver’s sprawling new airport would look out over a vast expanse of flat, prairie dog-infested grassland and wonder if their plane had somehow fallen short of its destination. The $4.9 billion airport—at 53 square miles, larger than Manhattan—was derided as being “halfway to Kansas,” and given the emptiness of the 23-mile drive to the city, it felt that way.

Last month, arriving visitors boarded the first trains headed for downtown, a journey that zips past a new Japanese-style “smart city” emerging from the prairie before depositing passengers 37 minutes later in a bustling urban hive of restaurants, shops and residential towers that only six years ago was a gravelly no man’s land—an entire $2 billion downtown neighborhood that’s mushroomed up around the hub of Denver’s rapidly expanding light rail system.

The 22.8-mile spur from the airport to downtown is the latest addition to a regional rail system that has transformed Denver and its suburbs. Using an unprecedented public-private partnership that combines private funding, local tax dollars and federal grants, Denver has done something no other major metro area has accomplished in the past decade, though a number of cities have tried. At a moment when aging mass transit systems in several major cities are capturing headlines for mismanagement, chronic delays and even deaths, Denver is unveiling a shiny new and widely praised network: 68 stations along 10 different spurs, covering 98 miles, with another 15 miles still to come. Even before the new lines opened, 77,000 people were riding light rail each day, making it the eighth-largest system in the country even though Denver is not in the top 20 cities for population. The effects on the region’s quality of life have been measurable and also surprising, even to the project’s most committed advocates. Originally intended to unclog congested highways and defeat a stubborn brown smog that was as unhealthy as it was ugly, the new rail system has proven that its greatest value is the remarkable changes in land use its stations have prompted, from revitalizing moribund neighborhoods, like the area around Union Station, to creating new communities where once there was only sprawl or buffalo grass.

“We are talking about a culture-transforming moment,” says Denver mayor Michael Hancock. “Light rail has really moved Denver into the 21st century.”

“Our adolescence is over, and we’ve matured to adulthood,” he adds.

How the $7.6 billion FasTracks project saved Denver from a dreaded fate locals call “Houstonization” is the story of regional cooperation that required the buy-in of businesspeople, elected officials, civil servants and environmentalists across a region the size of Delaware. Their ability to work collectively—and the public’s willingness to approve major taxpayer investments—has created a transit system that is already altering Denver’s perception of itself, turning an auto-centric city into a higher-density, tightly-integrated urban center that aims to outcompete the bigger, older coastal cities on the global stage.

by Colin Woodard, Politico | Read more:
Image: Mark Peterson/Redux Pictures

Honda Grom: Big Thrills, Tiny Bike



[ed. Very cool. Expect to see more of these (and others) on a street near you soon.]

Fractured: A First Date

I was going out. I deserved it. I’d had lunch — one Diet Coke, two Marlboro Lights and a Chef’s Signature Lean Cuisine. I’d even done two luxurious miles in 24 minutes on the treadmill at the gym down the block. My stomach growled, angry for being empty, but I felt thin and attractive. There’s nothing more dangerous than a girl who feels thin and attractive.

I hailed a cab to Union Hall in south Park Slope. It was warm for late October. I was meeting friends. The top floor of Union Hall has fireplaces, leather couches, an indoor bocce court and a library with actual books, where pseudo intellectuals discussed the same three writers (Hemingway, Kerouac, Salinger) between Jaeger bombs. Wrinkle-free gingham button-downs, Wayfarers even though it was dark, boat shoes because we were close to the Gowanus. These guys all went to honorary Ivies and had entry levels at their dads’ companies. The suit factory can produce a fun night. Just don’t expect them to go Dutch on your Plan B.

The line to the bar was long, so when I arrived I ordered two Jack and Diets for myself. The best investment you’ll ever make is a large tip on your first drink.

I made my way downstairs. My favorite kind of dance floor is so dark and crowded that no one notices that I can’t dance, and this was that. I swayed to the beat without spilling either drink (talent) and scanned the crowd.

Once I had a few (eight) drinks in me I had no use for the friends I came with. I talked to them all the time. I knew their deal. Whiskey made the world a warm hug. Everything and everyone was nice. Addiction, pollution, violence — these were things to worry about tomorrow.

I pushed away some young thugs wearing flat brims who were coming on stronger than my drink. It was clear they were working me together. Two guys? Maybe, but these were not my type. The night was just starting. It was too early to lower my expectations.

I walked back upstairs and outside to the patio and lit up. I saw him. He was standing alone with a beer, smoking. He was looking at his phone, the way you look at your phone when you have no one to talk to. It was easier to pick a guy up when they were separate from their bro herd.

“Hey, I’m Jessica. What’s that monkey doing on your glass?”

He had a little toy monkey hanging off the side of his pint. “It was trivia night,” he smiled.

The next two hours were a blur. So let’s call him Rick. He worked at an app. Everyone worked at an app. Liquor before beer or beer before liquor? I couldn’t remember which one was supposed to come first so I just went back and forth.

We were smoking again, out on the patio. My self-confidence was directly proportionate to my blood-alcohol level. I was Kate Moss. I pulled a bar stool between us. “Let’s arm wrestle,” I said. A command, not a question.

“Uhh, I don’t know. I bench 220. I was a high school quarterback …” he backpedaled.

“Stop making excuses,” I said.

We knelt down on the cement and put up our arms on the small uneven surface of the bar stool. He won the first round. We went left the second round.

“You’re letting me win,” I said. He just laughed. “Come on, arm wrestle me for real. Don’t be a girl!”

We went right again. My arm snapped in half. Rick had broken my humerus in two, a clean fracture. I bent my arm 90 degrees and watched my arm leave my arm. I took my left hand and held the two pieces of my right arm together. “Oh my God,” he said. He was freaking out. “Go close my tab and get us a cab. We’re going to the E.R.,” I said. Cool as a cucumber.

by Jessica Caldwell, NY Times | Read more:
Image: Maelle Doliveux

Saturday, May 21, 2016

Music of the Unquiet Mind


Through Cage and his take on Zen philosophy, I have made a truce with my O.C.D. I recognize that it is integral to who I am and have come to accept myself, warts and all. Obsessive-compulsives are, not surprisingly, perfectionists. Yet, I have learned to relinquish the grand illusion of the goal and relish, instead, the unfolding of the process. Cage’s highly forgiving definition of error, as “simply a failure to adjust immediately from a preconception to an actuality,” has helped temper my self-judgmental parameters of right and wrong, all or nothing. (...)

O.C.D.’s most salient feature is its viselike hold on the mind, imbuing unwanted thoughts with a ferocious, pitiless tenacity. Cage’s Zen-inspired text “Lecture on Nothing” is balm to an obsessive-compulsive: “Regard it as something seen momentarily, as though from a window while traveling … at any instant, one may leave it, and whenever one wishes one may return to it. Or you may leave it forever and never return to it, for we possess nothing. …Anything therefore is a delight (since we do not possess it) and thus need not fear its loss.”

Fear of loss rules the life of an obsessive-compulsive — fear of loss of control, fear of loss in both physical and metaphysical realms (paradoxically, the fear of losing worthwhile thoughts), and the ultimate fear — fear over the loss of time when consumed by compulsive rituals; I live in a constant race with time to make up for the time lost to the dictates of the disease.

by Margaret Leng Tan, NY Times | Read more:
Image: Karen Barbour
repost ("...I would prefer not to wear holes in the carpet of my mind.")

First Month, Last Month and Everything in Between

In the summer of 2013 my job at an advertising agency in London was transferred to New York. I was excited; who wouldn’t grab the opportunity to work in one of the world’s most exciting cities? What’s more, for a short period my company even paid for an apartment for myself and a colleague while we explored different neighbourhoods and found our own place to stay.

When the time came to navigate my way through New York’s real estate market, however, I was shocked to be confronted with prices and practices even London’s letting agents would shy away from.

I had previously lived in four different flats and house shares in and around King’s Cross and Angel over a period of six years. My last flat was on a Camden council estate and was a two-bed converted into a three bed, for which I was paying £550 a month before bills.

I thought the capital’s rental market was fast and aggressive – decisions had to be made on the spot and deposits paid quickly in order to secure a place. But that was before I moved to New York. Compared to London it felt like the wild west: I was shown apartments that didn’t seem legal or safe, usually by agents who were utterly clueless.

Learning the jargon was the first hurdle. When I arrived I had no idea what “pre-war” or “rail-road” meant. Pre-war sounds classy; you imagine big parlours and arches – but in reality it means a large room split up to cram in more people. This means one of you will have a sunlit bedroom but the other gets a tiny room with no windows. We saw several pre-war apartments. Some landlords, in an effort to improve the second room, had added a window … looking into another room – not ideal for privacy.

Rail-road style apartments are worse. There is no corridor. Instead, one room leads directly into the next, the ones in the middle having two doors. That means walking through your roommate’s bedroom every time you need to use the bathroom or leave the house.

And if an advert does not have “two true bedrooms”, then prepare for the worst. In London two bedrooms means two bedrooms, each with four walls, a window and a door. I wasted so much time in New York being shown apartments where one of the rooms was in fact a corridor. Or it would have a window looking into the living room, or no window at all. Sometimes it would be no more than a glorified closet.

I learned that “walk-up” is also something to be wary of. It means no elevator, and stairs are no fun if you live in a high rise and have heavy shopping bags or a pile of laundry – New Yorkers don’t have washing machines in their homes. All this is before you even start thinking about the hassle of moving furniture in or out. I discovered that if an agency had taken professional photos, this usually meant something was wrong with the flat and they could not get it rented.

Eventually we found a place, and once we agreed to take it we were whisked to an office and told to put in an application. This could only be completed if we could give them a cheque for the first and last month’s rent, and our guarantor’s bank statement.

In New York the standard is for landlords to request evidence that you earn more than 40 times the monthly rent to prove you can comfortably afford it. I wasn’t even close. No one outside the US is allowed to be a guarantor, so I had to ask our company’s vice president to be mine.

There was no guarantee our application would be successful – multiple applications are accepted at once and you cannot be sure you will get back all of your money if you are unsuccessful. Although the deposit and rent are returned if you don’t secure the apartment, most realtors will keep some of the money to cover their admin charges, including the cost of background checks. I doubt very much that the background checks were even carried out.

by Alex Wolynski, The Guardian | Read more:
Image: Stan Honda/AFP/Getty

Robert Redford, Golden Boy

Robert Redford still does it for me. He did it for me when I first saw him in Butch Cassidy, he did it for me when he was washing Meryl Streep’s hair in Out of Africa. He did it for me in uniform in The Way We Were and with full hippie beard in Jeremiah Johnson. He’s classically handsome — the type of handsome on which you, your mom, your grandmother, and your best gay friend can all agree — with a flatness of expression that morphs sardonic when you least expect it. He has a storytime voice, the perfect level of tan, and haphazardly spaced highlights that betray a life lived en plein air. I love him for his palpable Westernness, his ease with open spaces, the scent of high altitude that seems to waft from him. He looks as good in jean cut-offs as he does in a well-tailored suit. And for nearly 40 years, he’s been Hollywood’s golden boy: likable and bankable, if a bit self-serious.

Redford belongs to the class of actors I think of in my head as the silver foxes: indigenous to the ‘60s and ‘70s, they’ve ripened before our eyes. Most of them have semi- or totally retired, some have passed away; all live in my memory both as their original, gorgeous selves and their well-lined, refined later-in-life iterations. Newman and Beatty, of course, but also De Niro and Hackman, Dustin Hoffman and Jack Nicholson, Jane Fonda and Julie Christie.

They’re not classic Hollywood, per se. They never had to deal with studio contracts. They got to use their real names and marry whom they pleased. They shunned publicity, or at least pretended to shun publicity as they posed for the cover of Life. They were a different type of star, in terms of interaction with the industry at large, but stars nonetheless — embodiments of what mattered to Americans at various cultural moments. And Redford, I realize now, was proof positive that beautiful American men could still exist amidst the turmoil of the age. Turns out he was a bit of a true liberal, but at the time, he had the looks of a jock, the demeanor of a respectable man, and just enough zest to titillate. I can’t quite decide whether he’s a good actor or a perfect star — which, if you think about it, is true of the most memorable of our idols.

Throughout his career, Redford’s split critics: maybe he could act, but could he act other than himself? And isn’t that the very hallmark of a star? David Thomson called him a waste, while Pauline Kael understood that his golden diffidence was part of his allure, what drew us back to watch him over and over again. He never gave himself over fully the way that his co-stars did — not because he wasn’t acting, but because that was the point. He was, and remains, too cool. Which is part of why his enduring popularity is so surprising — is it his looks that bring us back, again and again? Or is it the semi-masochistic desire to watch him maintain that aloofness?

Redford was born in Santa Monica, because of course he was. His father was a milkman but, in a page straight from the American Dream handbook, eventually became an accountant, moving his family to Van Nuys. There, Redford played on the high school baseball team and, if looks are to be believed, slayed the entire female population. A baseball scholarship to the University of Colorado followed, and I can just imagine him skamping all around Boulder, drinking beer and breaking hearts.

I mean, please. But he had better things to do than play beer pong, so he traveled Europe, took painting at Pratt, and eventually got caught up in acting, making his way into the live television scene in New York, which was thriving in the ‘50s. At some point between flipflopping his way through Europe and going to Pratt, he married Lola Van Wagenen, whose native Utah he would gradually make his own.

by Anne Helen Petersen, The Hairpin |  Read more:
Image: uncredited

Mads Berg
via:

What Do Clothes Say?

Where language falls short though, clothes might speak. Ideas, we languidly suppose, are to be found in books and poems, visualised in buildings and paintings, exposited in philosophical propositions and mathematical deductions. They are taught in classrooms; expressed in language, number and diagram. Much trickier to accept is that clothes might also be understood as forms of thought, reflections and meditations as articulate as any poem or equation. What if the world could open up to us with the tug of a thread, its mysteries disentangling like a frayed hemline? What if clothes were not simply reflective of personality, indicative of our banal preferences for grey over green, but more deeply imprinted with the ways that human beings have lived: a material record of our experiences and an expression of our ambition? What if we could understand the world in the perfect geometry of a notched lapel, the orderly measures of a pleated skirt, the stilled, skin-warmed perfection of a circlet of pearls?

Some people love clothes: they collect them, care for and clamour over them, taking pains to present themselves correctly and considering their purchases with great seriousness. For some, the making and wearing of clothes is an art form, indicative of their taste and discernment: clothes signal their distinction. For others, clothes fulfill a function, or provide a uniform, barely warranting a thought beyond the requisite specifications of decency, the regulation of temperature and the unremarkable meeting of social mores. But clothes are freighted with memory and meaning: the ties, if you like, that bind. In clothes, we are connected to other people and other places in complicated, powerful and unyielding ways, expressed in an idiom that is found everywhere, if only we care to read it.

If dress claims our attention as a mode of understanding, it’s because, for all the abstract and elevated formulations of selfhood and the soul, our interior life is so often clothed. How could we ever pretend that the ways we dress are not concerned with our impulses to desire and deny, the fever and fret with which we love and are loved? The garments we wear bear our secrets and betray us at every turn, revealing more than we can know or intend. If through them we seek to declare our place in the world, our confidence and belonging, we do so under a veil of deception.

Old, favoured clothes can be loyal as lovers, when newer ones dazzle then betray us, treacherous in our moments of greatest need. There is a naivety in the perilous ways we trust in clothes. Shakespeare knew this. King Lear grandly insists to ragged Poor Tom that: ‘Through tattered clothes great vices do appear;/ Robes and furred gowns hide all’ – even when his own opulence can no longer obscure his moral bankruptcy. Emerson too, mockingly corrects us when he writes: ‘There is one other reason for dressing well… namely that dogs respect it [and] will not attack you.’ (...)

We’re prone to disparage as crude the analogue of self with stuff, as though the substance of the soul might only be short-changed by the material things with which we seek to express it. And it is difficult not to read the diminution of dress in philosophy as part of a more general disdain for matters regarded as maternal, domestic or feminine. ‘Surface,’ wrote Nietzsche, is a ‘woman’s soul, a mobile, stormy film on shallow water.’ Relegating women to the shallows is, of course, to deny them depth, but the surface to which they are condemned is not without its own qualities: fluidity, responsiveness, sensitivity to a given moment or sensation. Female writers have always understood this. In Edith Wharton’s novel The House of Mirth (1905), Lily Bart concedes to herself the powerful truth of her passion for Lawrence Selden:
She was very near hating him now; yet the sound of his voice, the way the light fell on his thin, dark hair, the way he sat and moved and wore his clothes – she was conscious that even these trivial things were inwoven with her deepest life.
When we speak of things being ‘woven together’, we mean affinity, association, inseparability, but Wharton’s ‘inwoven’ intimates more: an intimacy so close that it is constitutive. Wharton’s contemporary Oscar Wilde quipped in The Picture of Dorian Gray (1890) that ‘it is only shallow people who do not judge by appearances. The true mystery of the world is the visible, not the invisible.’ With his dandyish green coats and carnations, Wilde teasingly nudges us towards the stark secularity of a new world in which divinity could be as readily located in dresses as in deities.

But Wharton’s insight goes beyond reference to a newly outward-facing modernity, and is more profound than the upturned paradox of a surface that could speak of the inner self. Lily is bound to Lawrence not simply by some romantic pledge of affection but in the particularities of his being, as though the tightest seam ran back and forth between her slow-gathered sense impressions (his voice, his hair, his clothes) and the interior life to which they seem to reach. As Lily is to Lawrence, we too are inwoven with the stuff of the world and the people to whom those things belong or refer. (...)

This is not to say that clothes are the self, but to suggest, exploratively, that our experiences of selfhood are contoured and adumbrated by many things, including clothes, and that the prejudices by which we disregard the concern for appearances or relegate dress to the domain of vanity, are an obstacle to a significant kind of understanding. As Susan Sontag says, contra Plato, perhaps there is ‘[no] opposition between a style one assumes and one’s “true” being… In almost every case, our manner of appearing is our manner of being.’

Perhaps we simply are in clothes. And in clothes, our various selves are subject to modification, alteration and wear. This happens in clumsy ways – the glasses you hope might lend you new seriousness, the reddened lips that mimic arousal – but also in innumerably subtle ways: the heel that cants the body, contracting your stride, the tie that stiffens your neck and straightens your spine. Some garments constrict and reshape us physically, but also, sometimes, emotionally. And there are garments we can feel, that itch and chafe, that make apparent the difference of their textures to that of the surface of our skin, as though we and they are not one. In these, we are alert to the experience of being in our bodies, in a way that seems at odds with the rest of the world gliding past, apparently immune to discomfort. In such garments, too, we are always alert to the ever-present physicality of our bodies.

By contrast, there are clothes we wear almost imperceptibly, that are light or diaphanous to the point of being hardly seen or felt, as though we are sheathed in air. There are clothes that we are so accustomed to that we go about our business with barely a thought to the bodies they encase. If the self is somehow experienced, then perhaps there are moments when we strive to be seen and others when we seek a certain kind of invisibility. We wax and wane in the things that we wear. In clothes, there are always possibilities for difference and transformation.

by Shahidha Bari, Aeon | Read more:
Image: via:

Who Is A Millennial?

Given that generational labels are at best approximate and especially mutable around the edges, it is of course impossible to fix a time frame on what age range defines millennials—let’s not even get into the tricky subject of who exactly we are speaking of when we use the word—but I will propose a rough guide to the characteristics common to the generations in order to better help you establish your place in our advertising-friendly cosmology.

• If your cohort is marked by self-obsession, a belief that you changed the world for the better and an inability to recognize that you got the best out of the economy, the environment and technology and you are going to die in comfort right before the negative effects that your reckless plunder on all of those things makes life unbearable for everyone who came after, you are a Boomer.

• If your cohort is marked by self-disgust, a belief that you never had a fair shot and an inability to shut up about how people used to talk to each other in person plus a vestigial obsession with shitty rock music from the Pacific Northwest, you are Generation X.

• If your cohort is marked by the belief that not enough people are making it about you, you are Generation Catalano.

• If your cohort is marked by self-obsession, a belief that you will change the world for the better and an inability to recognize that GIFs are not conversation, Tweeting is not marching, and quizzes that are almost comically transparent in their desire to turn you into a marketable commodity are not an actual ratification of how special you are, you are a millennial.

• If your cohort is marked by an astounding amount of potential and an already-notable lack of annoying self-regard you are whatever we would call the next generation. Unfortunately we are not going to come up with a name, because due to the way the Boomers fucked up the future for everyone you are going to spend most of your life running from fires and hiding from the killer robots that want to eat you. Sorry. Life isn’t fair (unless you’re a Boomer, in which case man did you really get away with one).

I hope that helps! But if you want something a little shorter so that you can share on social media here’s a brief takeaway that should settle the question at hand: If during the time it took to read this you started to get agitated about how long it had been since you last received praise from your supervisor, you are a millennial. Good for you!

by Alex Balk, The Awl |  Read more:
Image: The Awl

Are You Sure You Want to Unsubscribe From This Relationship?

Please select your reason for unsubscribing:

❏ This is temporary. I’ll be back!
❏ It’s not you; it’s me.
❏ I never signed up for this level of commitment.
❏ I met someone else.

If you chose ANY OF THESE, please explain in as many words as possible:

____________________________________________

____________________________________________.

Would you like to continue to receive calls and texts?

❏ Yes. We are both going to need some time to get closure.
❏ Only on my birthday/other major holidays.
❏ Not from you.
❏ Who is this? New phone.

What about our plans to go to Cabo?

❏ I’m sure as hell not going.
❏ You can still go, if you want to.
❏ I was drunk when we talked about that.

Was quitting smoking not good enough?

❏ No.
❏ It was only a start.
❏ It was never about the smoking, Alan.

Did you block me on Facebook or is there something wrong with my account?

❏ I blocked you.

Can I continue to use your Netflix log-in?

❏ No, that privilege has been revoked.
❏ Wait, I never said you could use my log-in.
❏ Fuck it. All right.

by Nick Bateman and Hallie Bateman, New Yorker |  Read more:
Image: Tim Robberts, Getty

Friday, May 20, 2016

Stevie Wonder

All LinkedIn with Nowhere to Go

In a jobs economy that has become something of a grim joke, nothing seems quite so bleak as the digital job seeker’s all-but-obligatory LinkedIn account. In the decade since the site launched publicly with a mission “to connect the world’s professionals to make them more productive and successful,” the glorified résumé-distribution service has become an essential stop for the professionally dissatisfied masses. The networking site burrows its way into users’ inboxes with updates spinning the gossamer dream of successful and frictionless advancement up the career ladder. Just add one crucial contact who’s only a few degrees removed from you (users are the perpetual Kevin Bacons in this party game), or update your skill set in a more market-friendly fashion, and one of the site’s 187 million or so users will pluck you from a stalled career and offer professional redemption. LinkedIn promises to harness everything that’s great about a digital economy that so far has done more to limit than expand the professional prospects of its user-citizens.

In reality, though, the job seeker tends to experience the insular world of LinkedIn connectivity as an irksome ritual of digital badgering. Instead of facing the prospect of interfacing professionally with a nine-figure user base with a renewed spring in their step, harried victims of economic redundancy are more likely to greet their latest LinkedIn updates with a muttered variation of, “Oh shit, I’d better send out some more résumés.” At which point, they’ll typically mark the noisome email nudge as “read” and relegate it to the trash folder.

Which is why it’s always been a little tough to figure out what LinkedIn is for. The site’s initial appeal was as a sort of self-updating Rolodex—a way to keep track of ex-coworkers and friends-of-friends you met at networking happy hours. There’s the appearance of openness—you can “connect” with anyone!—but when users try to add a professional contact from whom they’re more than one degree removed, a warning pops up. “Connecting to someone on LinkedIn implies that you know them well,” the site chides, as though you’re a stalker in the making. It asks you to indicate how you know this person. Former coworker? Former classmate? Fine. “LinkedIn lets you invite colleagues, classmates, friends and business partners without entering their email addresses,” the site says. “However, recipients can indicate that they don’t know you. If they do, you’ll be asked to enter an email address with each future invitation.”

You can try to lie your way through this firewall by indicating you’ve worked with someone when you haven’t—the equivalent of name-dropping someone you’ve only read about in management magazines. But odds are, you’ll be found out. I’d been confused, for instance, about numerous LinkedIn requests from publicists saying we’d “worked together” at a particular magazine. But when I clicked through to their profiles, I realized why they’d confidently asserted this professional alliance into being: the way to get to the next rung is to pretend you’re already there. If you don’t already know the person you’re trying to meet, you’re pretty much out of luck.

This frenetic networking-by-vague-association has bred a mordant skepticism among some users of the site. Scott Monty, head of social media for the Ford Motor Company, includes a disclaimer in the first line of his LinkedIn bio that, in any other context, would be a hilarious redundancy: “Note: I make connections only with people whom I have met.” It’s an Escher staircase masquerading as a career ladder.

On one level, of course, this world of aspirational business affiliation is nothing new. LinkedIn merely digitizes the core, and frequently cruel, paradox of networking events and conferences. You show up at such gatherings because you want to know more important people in your line of work—but the only people mingling are those who, like you, don’t seem to know anyone important. You just end up talking to the sad sacks you already know. From this crushing realization, the paradoxes multiply on up through the social food chain: those who are at the top of the field are at this event only to entice paying attendees, soak up the speaking fees, and slip out the back door after politely declining the modest swag bag. They’re not standing around on garish hotel ballroom carpet with a plastic cup of cheap chardonnay in one hand and a stack of business cards in the other.

LinkedIn does have some advantages over the sad old world of the perennially striving, sweating minor characters in Glengarry Glen Ross. After all, it doesn’t require a registration fee or travel to a conference center. Sometimes there are recruiters trolling the profiles on the site. It’s a kinder, gentler experience for the underemployed. It distills the emotionally fraught process of collapsing years of professional experience onto a single 8½ x 11 sheet of paper into the seemingly more manageable format of the online questionnaire. In the past year, the site has made the protocols of networking even more rote, allowing users to select from a list of “skills” and, with a few clicks, declare their proficiency. “You can add up to 50 relevant skills and areas of expertise (like ballet, iPhone and global business development),” chirps an infobox on the site.

A century or so ago, critics worried that the rise of scientific management in the industrial workplace would deskill the American worker; now, in the postindustrial order of social-media-enabled employment, skills (or, you know, quasi-skills) multiply while jobs stagnate. Sure, you probably won’t get hired at most places on the basis of your proficiency in ballet—but if you’re so inclined, you can spend some of your ample downtime on LinkedIn endorsing the iPhone skills of select colleagues and acquaintances.

by Ann Friedman, The Baffler |  Read more:
Image: J.D. King

Thursday, May 19, 2016

The Jefferson Bible

[ed. Things to do in retirement. See also: How Thomas Jefferson Created His Own Bible.]

The Life and Morals of Jesus of Nazareth, commonly referred to as the Jefferson Bible, was a book constructed by Thomas Jefferson in the later years of his life by cutting and pasting with a razor and glue numerous sections from the New Testament as extractions of the doctrine of Jesus. Jefferson's condensed composition is especially notable for its exclusion of all miracles by Jesus and most mentions of the supernatural, including sections of the four gospels which contain the Resurrection and most other miracles, and passages indicating Jesus was divine. (...)

Using a razor and glue, Jefferson cut and pasted his arrangement of selected verses from the King James Version of the books of Matthew, Mark, Luke, and John in chronological order, putting together excerpts from one text to those of another in order to create a single narrative. Thus he begins with Luke 2 and Luke 3, then follows with Mark 1 and Matthew 3. He provides a record of which verses he selected and of the order in which he arranged them in his "Table of the Texts from the Evangelists employed in this Narrative and of the order of their arrangement".

Consistent with his naturalistic outlook and intent, most supernatural events are not included in Jefferson's heavily edited compilation. Paul K. Conkin states that "For the teachings of Jesus he concentrated on his milder admonitions (the Sermon on the Mount) and his most memorable parables. What resulted is a reasonably coherent, but at places oddly truncated, biography. If necessary to exclude the miraculous, Jefferson would cut the text even in mid-verse." Historian Edwin Scott Gaustad explains, "If a moral lesson was embedded in a miracle, the lesson survived in Jeffersonian scripture, but the miracle did not. Even when this took some rather careful cutting with scissors or razor, Jefferson managed to maintain Jesus' role as a great moral teacher, not as a shaman or faith healer."

Therefore, The Life and Morals of Jesus of Nazareth begins with an account of Jesus’s birth without references to angels (at that time), genealogy, or prophecy. Miracles, references to the Trinity and the divinity of Jesus, and Jesus' resurrection are also absent from his collection.

No supernatural acts of Christ are included at all in this regard, while the few things of a supernatural nature that do remain include the receiving of the Holy Spirit, angels, Noah's Ark and the Great Flood, the Tribulation, the Second Coming, the resurrection of the dead, a future kingdom, and eternal life, Heaven, Hell and punishment in everlasting fire, the Devil, and the soldiers falling backwards to the ground in response to Jesus stating, "I am he."

Rejecting the resurrection of Jesus, the work ends with the words: "Now, in the place where He was crucified, there was a garden; and in the garden a new sepulchre, wherein was never man yet laid. There laid they Jesus. And rolled a great stone to the door of the sepulchre, and departed." These words correspond to the ending of John 19 in the Bible.

Purpose

It is understood by some historians that Jefferson composed it for his own satisfaction, supporting the Christian faith as he saw it. Gaustad states, "The retired President did not produce his small book to shock or offend a somnolent world; he composed it for himself, for his devotion, for his assurance, for a more restful sleep at nights and a more confident greeting of the mornings."

There is no record of this or its successor being for "the Use of the Indians," despite the stated intent of the 1804 version being that purpose. Although the government long supported Christian activity among Indians, and in Notes on the State of Virginia Jefferson supported "a perpetual mission among the Indian tribes," at least in the interest of anthropology, and as President sanctioned financial support for a priest and church for the Kaskaskia Indians, Jefferson did not make these works public. Instead, he acknowledged the existence of The Life and Morals of Jesus of Nazareth to only a few friends, saying that he read it before retiring at night, as he found this project intensely personal and private.

Ainsworth Rand Spofford, Librarian of Congress (1864–1894), stated: "His original idea was to have the life and teachings of the Saviour, told in similar excerpts, prepared for the Indians, thinking this simple form would suit them best. But, abandoning this, the formal execution of his plan took the shape above described, which was for his individual use. He used the four languages that he might have the texts in them side by side, convenient for comparison. In the book he pasted a map of the ancient world and the Holy Land, with which he studied the New Testament."

Some speculate that the reference to "Indians" in the 1804 title may have been an allusion to Jefferson's Federalist opponents, as he likewise used this indirect tactic against them at least once before, that being in his second inaugural address. Or that he was providing himself a cover story in case this work became public.

Also referring to the 1804 version, Jefferson wrote, "A more beautiful or precious morsel of ethics I have never seen; it is a document in proof that I am a real Christian, that is to say, a disciple of the doctrines of Jesus."

Jefferson's claim to be a Christian was made in response to those who accused him of being otherwise, due to his unorthodox view of the Bible and conception of Christ. Recognizing his rather unusual views, Jefferson stated in a letter (1819) to Ezra Stiles Ely, "You say you are a Calvinist. I am not. I am of a sect by myself, as far as I know."

by Wikipedia |  Read more:
Image: Hugh Talman, Smithsonian Institution

Marc Ribot

Why We Fight About Music

Art isn’t a competition, but one arises nevertheless when two fans on opposite sides of the aisle let loose with the world’s evergreen battle cry: My shit is better than your shit. Music, especially, is filled with warring fanbases trying to assert supremacy. Did anyone really love the Beatles if they didn’t insist, at one point, that they were definitely better than the Rolling Stones? The same goes for Tupac and Biggie, Oasis and Blur, Pavement and the Smashing Pumpkins, and so forth.

It’s always a little silly, this competition: No one can scientifically prove that “Gold Soundz” goes harder than “1979,” and any reasonable person would admit the difference comes down to our individual biases. If you’re the type to sit around and shoot the shit about pop culture, thinking about those biases is where the fun starts. That’s the subject of Steven Hyden’s new book, Your Favorite Band Is Killing Me, which traces those biases through 16 music rivalries. When Hyden (formerly of Grantland, The A.V. Club, and yes, Pitchfork) first thought about writing about music rivalries, he knew he didn’t just want to write about music. Starting with the Beatles vs. Stones or Kanye West vs. Taylor Swift was a way to discuss everything, not just whose shit is better.

“Artists often take on characteristics of larger ideas, like how the Cold War was a way for the U.S. and USSR to have a war without actually shooting each other,” he says, over the phone. “Rivalries are a way for people to have arguments about bigger ideas in a fairly harmless form.” Hyden is an effortless writer, and he draws clever connections between artists and cultural phenomena spanning decades. Nirvana and Pearl Jam isn’t just about Kurt Cobain and Eddie Vedder, for example; it’s about Chris Christie. Neil Young and Lynyrd Skynyrd started a North vs. South conversation that can be read in 2014’s #CancelColbert movement, the White Stripes and Black Keys have just as much to do with male friendship as they do revivalist two-piece blues-rock, and Billy Corgan acted a lot more like Richard Nixon than any Siamese Dream fan would think at first listen.

There’s a lot going on, but the read is illuminating and often hilarious. (Nor does he lapse into the modern, web 3.0 trap of glibly suggesting that this is like that—the web is organically woven, and the bar-room tone is just right.) The book’s breezy style pairs well with the intrinsically low stakes: Hyden is wise enough to know that declaring a winner is pointless (and so the book never does), but smart enough to discuss everything that might come with “winning.” In that regard, it doesn’t matter if he never definitively comes down for one side, because exploring the hypothetical is fun enough.

Pitchfork: What was the first rivalry that you had to include?

Steven Hyden: The first thing I wrote was the White Stripes/Black Keys chapter. You know, it's just sort of funny that the two most famous two-person blues-rock bands ended up in this pissing match in the media. It's inevitable that that happened, in a way. But as I started thinking about it, it really became a way for me to talk about friendship. To me, the dynamic between Jack White and Dan Auerbach just reminded me of the dynamic between a lot of men—when they are trying to relate to each other, and they should be able to relate to each other, and they can't, it turns into this competition. Even if you don't care about the White Stripes and the Black Keys, you can still read this and relate to this thing that a lot of men can relate to, which is the weirdness of talking to other men. That was the first one I had to write about, but certainly there are other rivalries that I had to put in the book, like the Beatles and Stones, and Tupac and Biggie. With those, I was a little reluctant to write about them, because they had been so discussed in so many different places that I wasn't sure if I could come up with anything. But at the same time, when you're writing a rivalries book, if you don't talk about those two rivalries in particular, people are going to throw their book in the garbage immediately.

The Tupac/Biggie chapter is interesting, because you avoid drawing some greater cultural lesson from it. It concludes on a more human note of, “This is a tragic death that didn’t need to happen.”

With those guys, it's the same thing that's happened with Kurt Cobain, where everything they did in their careers is now viewed as a prelude to their deaths. When we talk about Kurt Cobain, it's like all the music is just a precursor to his suicide. It's especially true of that "Unplugged" special; you can't hear that now without thinking about how he died. With Biggie and Tupac, I think the same thing is true. With their music, it seems like it's framed through the prism of how they died, which is unfortunate. I think that probably happens with every iconic musician who died young. But if you can remove that filter and just imagine how Biggie and Tupac’s records would sound now if they hadn't died, I think the messianic aspects that people project wouldn't necessarily be there. That's always hard to figure out: Does that make the music more resonant because of the backstory? Or does it take something away?

I always wished I could listen to Nirvana without the baggage of Kurt Cobain's death. Nevermind is a really fun record. But there's this sort of gloomy thing that's attached to it now that you can't really shake off, which is too bad. When Montage of Heck came out, I just remember thinking, "I wish I could listen to Nevermind as the record that some people didn't think was as good as Teenage Fanclub's Bandwagonesque," you know what I mean? [Ed. note: Bandwagonesque memorably beat out Nevermind on Spin’s 1991 year-end list.]

But, you know, the narrative of records—that becomes overwhelming even for new records. The reaction to Beyoncé's Lemonade—do people really listen to that as a record? Or is the ginormousness of what Beyoncé is—does that just overwhelm everything that she puts out now? The narrative that gets attached to records—I don't know if that's stronger now than it was before the internet. That's always hard to judge. But with the sheer quantity of media that exists, it really is overwhelming. With the cult of personality that exists around huge pop stars right now, it just creates this centripetal force of discussion that just sucks people in. I just don't know if it's even possible to hear what that record sounds like now. Maybe that record should be reviewed ten years from now by a person who wasn't reading any media at this moment. Maybe they can more accurately assess it than we're capable of in the present time.

by Jeremy Gordon, Pitchfork |  Read more:
Image: uncredited

Bill Clinton’s Big Moment: His Health, His Battle Plan for Trump, and What He’ll Do if Hillary Wins

At 69, Bill Clinton is helping Hillary Clinton run her 2016 presidential campaign with the goal of becoming...what, exactly? A kind of über-veep? A chaotic meddler-in-chief? On the trail and across Washington, already there is whispering and wondering: Who is Bill Clinton these days? And who does he intend to be if he—er, if his wife—wins the White House?

To be an ex-president is to live forever in the past. You write books and build museums to preserve your great moments, to commemorate a time when you led the free world. Crowds still gather and men in dark suits still hover protectively nearby. But mostly these are vestiges. You're a historic figure now, and that makes living in the present—or making the case for the future—a bit tricky. Bill Clinton knows this better than anyone.

On a cool spring morning, the 42nd president, almost 16 years removed from the White House, was standing on the blacktop outside a school in a blighted section of Oakland. It was the third and final day of the annual conference he hosts for college students, a powwow for the world's young thought-leaders-in-training held under the auspices of his Clinton Global Initiative. A few hundred of the students had gathered now to prettify some playgrounds, and Clinton—dressed in the politician's community-service-casual uniform of a blue pullover and stiff jeans—walked among them. Mostly, he posed for pictures. As he did, I watched one especially assertive student wade into the scrum and stride up to the former president.

“Hi, my name's Emma,” she said, and then explained that she had a question about the Middle East. Clinton's smile dimmed a bit, as if he were bracing for something. But Emma, it turned out, wasn't there for a debate—just a photo, albeit of a certain kind. “There's a really cool picture of you standing behind Rabin and Arafat,” she said, referring to the famous shot of Clinton pushing the Israeli and Palestinian leaders to shake on the Oslo Accords in 1993, “and I was wondering, Could my boyfriend and I re-create that picture with you?”

For a moment there, Clinton seemed almost let down, as if, having readied himself to consider the intractable dilemmas of the world, he was reduced to a prop—a wax figure in a historical re-enactment. Quickly, however, his grin returned as they struck the pose. Though Emma and her boyfriend didn't ask for one, he proffered a memory, shoehorned into a joke: “It was a lot harder to convince Rabin and Arafat to shake hands than it was to convince you two.”

Moments later, I approached Clinton with a question of my own—an actual one, about the politics of 2016, about his wife's fight to win his old job. I wanted to know what he made of the kids he was hanging out with that day—and if, considering Hillary's notorious struggles with young voters, he thought they were likely to support her over Bernie Sanders. “I don't know,” he told me, betraying no great affection for the question. “It hasn't occurred to me.”

I offered that most of the students I spoke with were pulling for Sanders. That seemed to goad the former president into an answer, and suddenly a stew of frustrations—about his wife's difficulty reaching young people, about Sanders's attacks on her—seemed to simmer over. “The thing that I believe is that unlike in many places, if we had a debate here, they would listen to both of them,” Clinton told me, his words quick and measured. “Most of these students are here because they believe that the best change comes about when people work together and actually do something. So I think they're much more likely to have their eyes and ears open to everybody and every possibility, which is all I would like for everyone.”

I pressed him about what he meant. Was he angry, I wondered, that people had seemingly long ago made up their minds about Hillary? About him, too? Was that fair? He looked at me, his eyes resolute. “I've already told you enough to read between the lines.”

A few weeks earlier, the good people of Bluffton, South Carolina—their minds open to the Clintons or not—were hustling down to a local gym on a Friday afternoon. A woman in medical scrubs led her little girl by the arm, hurrying her toward the doors before the space grew too crowded. They were there, the mother explained to her daughter, who had been dressed in her Sunday best, to glimpse a piece of “living history.”

Inside, the space was festooned with VOTE FOR HILLARY signs, but candidate Clinton would not be in attendance. It was late February, the day before South Carolina's Democratic primary, and her time was better spent elsewhere in the state, in bigger cities with more voters and greater numbers of TV cameras. Instead, the piece of living history that a few hundred of Bluffton's 15,000 citizens had come to see was her husband, the former president of the United States, who now was ambling to the podium.

People craned their necks and held their phones aloft, and Bill Clinton leaned into the microphone. But when he opened his mouth, words failed to tumble forth. Rather, his vocal cords produced a shivers-inducing rattle. He gathered himself. “I apologize for being hoarse,” he finally croaked. “I have lost my voice in the service of my candidate.”

That seemed the least of his maladies. Up close, his appearance was a shock. The imposing frame had shrunk, so that his blue blazer slipped from his shoulders, as if from a dry cleaner's hanger, and the collar of his shirt was like a loose shoelace around his neck. His hair, which long ago had gone white, was now as thin and downy as a gosling's feathers, and his eyes, no longer cornflower blue but now a dull gray, were anchored by bags so dark it looked like he'd been in a fight. He is not a young man anymore—he'll turn 70 in August—but on this afternoon, he looked ancient.

This is Bill Clinton, on the stump circa 2016. The extravagant, manic, globe-trotting nature of a post-presidency lived large—the $500,000 speeches, the trips aboard his billionaire buddies' private planes to his foundation's medical clinics across Africa—has given way to a more quotidian life spent trying to get his wife into the White House. And this time around, more so than in 2008, Clinton is cast in what even he regards as a supporting role. “He's able to go campaign in the places that, because of the schedule and the pressures on her, she can't get to,” John Podesta, Hillary's campaign chairman, told me.

And so Clinton travels to places like Bluffton on small, chartered planes—or takes the occasional commercial flight (albeit in first class with an aide always booked next to him to avoid chatty seatmates). More often than not, the ex-president finds himself staying in hotels with nothing resembling a presidential suite; he typically overnights in Holiday Inn Expresses and Quality Inns. His aides say he's the least prissy member of his small traveling party—caring only that his shower has good water pressure and that the TV has premium cable so that he might watch San Andreas or one of the Fast & Furious movies before he drifts off to sleep. When he wakes, he often makes coffee for himself in his room.

Of course, he still turns out crowds—especially in these hamlets unaccustomed to political royalty. But on that day in Bluffton, as Clinton began to talk, there wasn't much of the old oratorical genius on display. He recalled his college roommate, a Marine who had been stationed nearby; but what seemed like a quick geographical touch point soon spun into a rambling tale about the man's sister-in-law, who had a disabled daughter who now lives in Virginia. “I watched her grow up,” Clinton told the puzzled crowd. His attempts at eloquence—“We don't need to build walls; we need to build ladders of opportunity”—weren't his best, and when he delved into politically relevant topics, like terrorism, he sounded less like a man who used to receive daily intelligence briefings than like an elderly relative at the holiday table. “The people who did San Bernardino,” Clinton explained, “were converted over the social media.” All the while, his hands—those (with apologies to Donald Trump) truly giant instruments that he once used to punctuate his points—now shook with a tremor that he could control only by shoving them into his pants pockets or gripping the lectern as if riding a roller coaster. For more than half an hour, Clinton went on like this, losing more of the crowd's attention as each minute passed, until a few people actually got up from their chairs and tiptoed toward the exits.

Then a young man abruptly stood up, not to leave but to make his own speech. He wore a dark suit and sported a high-and-tight haircut, and, interrupting Clinton mid-sentence, he told the president that he was a Marine, “just like your college friend.” With that, the man began a lecture about buddies lost in Iraq and his concerns about the Department of Veterans Affairs. Clinton looked startled and unsure of himself but soon poked his way into the exchange. “What do you think should be done with the VA?” he asked, seemingly trying to coax the man to a better place. But Clinton's question sailed past, ignored. “... And the thing is,” the man continued, his voice rising, “we had four lives in Benghazi that were killed, and your wife tried to cover it up!” That's when a line was tripped and Clinton, meek and muted until now, suddenly sprang to life.

“Can I answer?” Clinton asked icily. The man raised his voice, but Clinton was now almost shouting and had the advantage of a microphone.

“This is America—I get to answer,” Clinton said, his shoulders thrown back and his eyes now alive. “I heard your speech. They heard your speech. You listen to me. I'm not your commander-in-chief anymore, but if I were, I'd tell you to be more polite and sit down!” As two sheriff's deputies began to hustle the disrupter out, Clinton beseeched the cops to wait. “Do you have the courage to listen to my answer?” he said to the man. “Don't throw him out! If he'll shut up and listen to my answer, I'll answer him.”

But it was too late: The guy was gone, and so Clinton gave his reply to the people who remained. It was a tour de force, a careful explanation not only of what had happened that night in Benghazi but also of the multiple investigations that had absolved his wife of any wrongdoing, as well as a history of past congressional investigations into similar attacks. With precision and clarity, Clinton pressed his case and won the crowd as only Clinton could. It felt as if 30 years had fallen away, and the people of Bluffton—who had come to see a star—roared louder and longer than they had all afternoon.

“You know,” Clinton said, a smile spreading across his face and his voice now honeyed with satisfaction, “I'm really sorry that young man didn't stay.”

Of course, there are flashes of greatness—moments that validate the notion that the supreme politician of his generation has still got it. But in many ways, Clinton's routine these days might be regarded as humbling—an epic comedown not just from his presidency but also from the global celebrity he's enjoyed in the decade and a half since it ended. Making matters worse, there's the sense, even among some friends and supporters, that Clinton's own capabilities have diminished, that the secondary role he now finds himself playing in his wife's campaign is perhaps the only role to which he's now suited. But for Clinton, whose life story is one of suffering and then overcoming (often self-inflicted) setbacks, the current presidential campaign offers him the chance—perhaps the final chance—at a form of redemption: to atone for past mistakes, to prove his doubters wrong, to return to the White House, and, above all else, to be of service. More than anything, Clinton, as his biographer David Maraniss has written, “loves to be needed as much as he needs to be loved.”

by Jason Zengerle, GQ |  Read more:
Image: Dina Litovsky