Thursday, May 26, 2016

Manpris

[ed. So, I learned a new term today: "Manpri" (also called Dude Capris). There seems to be quite a bit of angst involving the whole topic. See also: The Great Manpris Debate.]

We used to think that nothing could stump us. Show us a man trying to pass himself off as straight, for whatever reason, and we’d call him out faster than Ryan Seacrest could ask for a hair straightener.

Enter the Manpri.

There are two kinds of men who wear Manpris. Yard boys and gay hipsters. Ladies shouldn’t be hitting on either of these men anyway. Straight men don’t even know what Manpris are.

What are these ambiguous bottoms anyway? They are calf-length trousers, somewhere between a short and a pant. Your grandmother would probably call them pedal pushers. Regardless, they’re fucking fabulous. And they’re fucking gay.

by Anonymous, Straightmendont | Read more:
Image: Straightmendont

The Citizen-Soldier: Moral Risk and the Modern Military

I can’t say that I joined the military because of 9/11. Not exactly. By the time I got around to it the main U.S. military effort had shifted to Iraq, a war I’d supported though one which I never associated with al-Qaida or Osama bin Laden. But without 9/11, we might not have been at war there, and if we hadn’t been at war, I wouldn’t have joined.

It was a strange time to make the decision, or at least, it seemed strange to many of my classmates and professors. I raised my hand and swore my oath of office on May 11, 2005. It was a year and a half after Saddam Hussein’s capture. The weapons of mass destruction had not been found. The insurgency was growing. It wasn’t just the wisdom of the invasion that was in doubt, but also the competence of the policymakers. Then-Secretary of Defense Donald Rumsfeld had been proven wrong about almost every major post-invasion decision, from troop levels to post-war reconstruction funds. Anybody paying close attention could tell that Iraq was spiraling into chaos, and the once jubilant public mood about our involvement in the war, with over 70 percent of Americans in 2003 nodding along in approval, was souring. But the potential for failure, and the horrific cost in terms of human lives that failure would entail, only underscored for me why I should do my part. This was my grand cause, my test of citizenship. (...)

There’s a joke among veterans, “Well, we were winning Iraq when I was there,” and the reason it’s a joke is because to be in the military is to be acutely conscious of how much each person relies on the larger organization. In boot camp, to be called “an individual” is a slur. A Marine on his or her own is not a militarily significant unit. At the Basic School, the orders we were taught to write always included a lost Marine plan, which means every order given carries with it the implicit message: you are nothing without the group. The Bowe Bergdahl case is a prime example of what happens when one soldier takes it upon himself to find the war he felt he was owed—a chance to be like the movie character Jason Bourne, as Bergdahl explained on tapes played by the podcast Serial. The intense anger directed at Bergdahl from rank and file soldiers, an anger sometimes hard for a civilian public raised on notions of American individualism to comprehend, is the anger of a collective whose members depend on each other for their very lives directed toward one who, through sheer self-righteous idiocy, violated the intimate bonds of camaraderie. By abandoning his post in Afghanistan, Bergdahl made his fellow soldiers’ brutally hard, dangerous, and possibly futile mission even harder and more dangerous and more futile, thereby breaking the cardinal rule of military life: don’t be a buddy fucker. You are not the hero of this movie.

But a soldier doesn’t just rely on his squad-mates, or on the leadership of his platoon and company. There’s close air support, communications, and logistics. Reliable weapons, ammunition, and supplies. The entire apparatus of war—all of it ultimately resting on American industry and on the tax dollars that each of us pays. “The image of war as armed combat merges into the more extended image of a gigantic labor process,” wrote Ernst Jünger, a German writer and veteran of World War I. After the Second World War Kurt Vonnegut would come to a similar conclusion, reflecting not only on the planes and crews, the bullets and bombs and shell fragments, but also where those came from: the factories “operating night and day,” the transportation lines for the raw materials, and the miners working to extract them. Think too hard about the front-line soldier and you end up thinking about all that was needed to put him there.

Today, we’re still mobilized for war, though in a manner perfectly designed to ensure we don’t think about it too much. Since we have an all-volunteer force, participation in war is a matter of choice, not a requirement of citizenship, and those in the military represent only a tiny fraction of the country—what historian Andrew Bacevich calls “the 1 percent army.” So the average civilian’s chance of knowing any member of the service is correspondingly small.

Moreover, we’re expanding those aspects of warfighting that fly under the radar. Our drone program continues to grow, as does the special operations forces community, which has expanded from 45,600 special forces personnel in 2001 to 70,000 today, with further increases planned. The average American is even less likely to know a drone pilot or a member of a special ops unit—or to know much about what they actually do, either, since you can’t embed a reporter with a drone or with SEAL Team 6. Our Special Operations Command has become, in the words of former Lieutenant Colonel John Nagl, “an almost industrial-scale counterterrorism killing machine.”

Though it’s true that citizens do vote for the leaders who run this machine, we’ve absolved ourselves from demanding a serious debate about it in Congress. We’re still operating under a decade-old Authorization for Use of Military Force issued in the wake of 9/11, before some of the groups we’re currently fighting even existed, and it’s unlikely, despite attempts from Senators Tim Kaine (D-Va.) and Jeff Flake (R-Ariz.), that Congress will issue a new one any time soon. We wage war “with or without congressional action,” in the words of President Obama at his final State of the Union Address, which means that the American public remains insulated from considering the consequences. Even if they voted for the president ordering these strikes, there’s seemingly little reason for citizens to feel personally culpable when they go wrong.

It’s that sense of a personal stake in war that the veteran experiences viscerally, and which is so hard for the civilian to feel. The philosopher Nancy Sherman has explained post-war resentment as resulting from a broken contract between society and the veterans who serve. “They may feel guilt toward themselves and resentment at commanders for betrayals,” she writes, “but also, more than we are willing to acknowledge, they feel resentment toward us for our indifference toward their wars and afterwars, and for not even having to bear the burden of a war tax for over a decade of war. Reactive emotions, like resentment or trust, presume some kind of community—or at least are invocations to reinvoke one or convoke one anew.”

The debt owed them, then, is not simply one of material benefits. There’s a remarkable piece in Harper’s Magazine titled, “It’s Not That I’m Lazy,” published in 1946 and signed by an anonymous veteran, which argues, “There’s a kind of emptiness inside me that tells me that I’ve still got something coming. It’s not a pension that I’m looking for. What I paid out wasn’t money; it was part of myself. I want to be paid back in kind, in something human.”

That sounds right to me: “something human,” though I’m not sure what form it would take. When I first came back from Iraq, I thought it meant a public reckoning with the war, with its costs not just for Americans but for Iraqis as well. As time goes by, and particularly as I watch a U.S. presidential debate in which candidates have offered up carpet bombing, torture, and other kinds of war crimes as the answer to complex problems that the military has long since learned will only worsen if we attempt such simplistic and immoral solutions, I’ve given up on hoping that will happen anytime soon. If the persistence of U.S. military bases named after Confederate generals is any indication, it might not happen in my lifetime. The Holocaust survivor Jean Améry, considering Germany’s post-war rehabilitation, would conclude, “Society … thinks only about its continued existence.” Decades later Ta-Nehisi Coates, considering the difficulty, if not impossibility, of finding solutions for various historic tragedies, would write, “I think we all see our ‘theories and visions’ come to dust in the ‘starving, bleeding, captive land’ which is everywhere, which is politics.”

by Phil Klay, Brookings Institution |  Read more:
Image: Reuters

Elevated Bus Swallows Cars and Straddles Roads


Imagine hovering over city streets at 40 miles per hour, zooming past congested traffic and sidewalks filled with pedestrians. That’s how the engineering company Transit Explore Bus wants to transport people with its Land Airbus—an electric elevated bus that straddles roads on specially made tracks.

The company unveiled a cute mini model of the Land Airbus at the recent International High-Tech Expo in Beijing. In the video, you can see cars enter the mouth of the bus’s cavernous temporary tunnel and safely come out the other end. The proposed vehicle can span two roads and is elevated so that cars less than two meters (six-foot-seven) high can drive underneath, China Xinhua News reports in the video.

“The bus will save lots of road space,” Song Youzhou, chief engineer of the Land Airbus project, says in the video. “It has the same function as the subway, but costs only 16 percent of what subway costs. Manufacturing and construction are also much shorter than for the subway.”

by Lauren Young, Atlas Obscura |  Read more:
Image: YouTube

I Have Met the Enemy, and It Is the Airlines

Summer is upon us, and we are facing important travel decisions. Such as who to blame when we get stuck in interminable airport lines.

So many options. There’s the government, but how many times can you complain about Congress in the course of a lifetime? There’s the public — air traffic up 12 percent since 2011. But really, people, don’t blame yourself.

Let’s pick a rant that’s good for you, good for me, good for the lines in security: Make the airlines stop charging fees for checked baggage.

Seems simple, doesn’t it? Plus, if you do manage to make it to your flight, these are the same people who will be announcing there’s a $3 fee if you want a snack.

The largest airlines charge $25 for the first checked bag, thus encouraging people to drag their belongings through the airport, clogging the X-ray lines and slowing the boarding process as everybody fights to cram one last rolling duffel into the overhead compartment.

The idea that travelers should be hit by an extra charge for, um, having luggage began in 2008, when the cost of fuel went through the roof. We understood the airlines’ pain, sort of. Maybe. But now fuel prices have fallen into the cellar. The airlines are taking in stupendous profits — last year nearly $26 billion after taxes, up from $2.3 billion in 2010.

Yet the baggage fees are still with us. In fact, they’ve gone up by about two-thirds. Last year, the nation’s airlines made more than $3.8 billion off what I believe it is fair to call a scam. It’s also an excellent way to make your prices look lower than they really are when people surf for the cheapest ticket, a number that never includes details like the special fees for bags, food, canceling a reservation, booking by phone, sitting in a minimally more comfortable emergency row or, in some cases, requesting a pillow.

Shouldn’t the airlines offer up the baggage fee as a token of solidarity with their miserable passengers? The idea has come up. Homeland Security Secretary Jeh Johnson asked the airlines to “consider possibly” this modest bow to air travel sanity. Two U.S. senators, Edward Markey of Massachusetts and Richard Blumenthal of Connecticut, wrote a letter to the airlines asking them to just drop the fees during the high-traffic summer months.

We pause now for the sound of silence and crickets chirping.

The airlines have maximized profits by making travel as miserable as possible. The Boeing Company found a way to cram 14 more seats into its largest twin-engine jetliner by reducing the size of the lavatories. Bloomberg quoted a Boeing official as reporting that “the market reaction has been good — really positive.” We presume the market in question does not involve the actual passengers.

by Gail Collins, NY Times |  Read more:
Image: Robert Nickelsberg/Getty

Critique of Humanitarian Reason

Never have there been more refugees in the world than today: an estimated 45 million in total. So what's the current relationship between international law, emancipatory politics and the rights of the rightless?

On 16 February 2014, The New York Times Magazine ran an article entitled "Container City." "Container City" refers to the Kilis camp in southern Turkey housing 14,000 refugees from Syria. Protected by high gates and surrounded by barbed wire, Kilis from the outside shares features with many refugee camps all over the world that make them indistinguishable from prisons or criminal detention centres. Kilis houses its population in 2,053 identical containers, spread in neat rows. The pictures that accompany the article remind one of shipping containers at a harbour. Each container is a 23-by-10-foot trailer with three rooms and a colour TV with close to 1,000 channels, probably picking up programs from all the surrounding countries of the Mediterranean.

Yet there are some unique features of Kilis besides the cleanliness of its streets and the organization of proper electricity, water and sewage services, which led one Syrian resident to refer to it as "a five star hotel." There are schools in the camp, sex-segregated according to the wishes of the Syrians; three grocery stores where refugees can buy supplies with a credit card; a beauty salon and a barbershop where refugees get free haircuts and other services; art workshops and gymnastics classes. But despite all this: "Nobody likes living there [...I]t is hard for us," said Basheer Alito, the section leader who was so effusive in his praise for the camps and the Turks. "Inside, we're unhappy. In my heart, it's temporary, not permanent."

The Kilis refugee camp is by now one of hundreds in dozens of countries around the world. A report by the United Nations High Commissioner for Refugees notes that by mid-2014, the number of refugees worldwide stood at the highest level on record, namely at around 45 million; and with no end in sight to conflicts in places such as Syria, the Central African Republic and the Democratic Republic of the Congo, this number will only continue to increase. As the number of refugees has grown worldwide, not only has the number of camps grown as well, but the camps have ceased to be places where one held people temporarily; rather, they have become semi-permanent. The largest refugee camp in the world, Kenya's Dadaab, is 20 years old and houses 420,000 refugees. The Palestinian refugee camps in southern Lebanon are in many cases nearly 70 or 50 years old, depending on whether the refugee population was created in 1948 or 1967. The refugees who live in these camps, and who in some cases have spent their entire lives there, become PRSs, that is, those in a "protracted refugee situation."

Refugees, asylees, IDPs (internally displaced persons), PRSs, stateless persons: these are new categories of human beings created by an international state-system in turmoil, human beings who are subject to a special kind of precarious existence. Although they share with other "suffering strangers" the status of victimhood and become the objects of our compassion – or as the UNHCR report puts it, become "persons of concern" – their plight reveals the most fateful disjunction between so-called "human rights" – or "the rights of man", in the older locution – and "the rights of the citizen"; between the universal claims to human dignity and the specificities of indignity suffered by those who possess only human rights. From Hannah Arendt's famous discussion of the "right to have rights" in The Origins of Totalitarianism to Giorgio Agamben's homo sacer to Judith Butler's "precarious lives" and Jacques Rancière's call to "the enactment of rights", the asylum seeker, the stateless and the refugee have become metaphors as well as symptoms of a much deeper malaise in the politics of modernity.

Yet as political fatigue about internationalism has gripped the United States in the wake of the interventions in Afghanistan and Iraq, and President Obama's politics of caution in Syria has created further moral quagmires, we have moved from "the right to have rights" to the "critique of humanitarian reason." Didier Fassin, who for many years worked with Médecins Sans Frontières in a high capacity, and to whom we owe this term, defines it as follows: "Humanitarian reason governs precarious lives: the lives of the unemployed and the asylum seeker, the lives of sick immigrants and people with AIDS, the lives of disaster victims and victims of conflict – threatened and forgotten lives that humanitarian government brings into existence by protecting and revealing them." Subtitled "A Moral History of the Present", Fassin's felicitous book signals a more widespread retreat from the politics of human rights, which began shortly after the US invasion of Afghanistan and Iraq, to a denunciation of human rights, in the words of the Columbia historian Samuel Moyn, as an "antipolitics" that survived as a "moral utopia when political utopias died." Some sought to achieve, writes Moyn, in his provocatively titled book, The Last Utopia: Human Rights in History, "through a moral critique of politics the sense of pure cause that had once been sought in politics itself"; further, human rights substituted a "plausible morality for failed politics." Fassin himself is more careful and balanced than Moyn in his critique of human rights discourse and practice, but nonetheless both works and the success they have enjoyed document an important moment at least in the zeitgeschichte of the United States' recent political culture.

This intellectual and political disillusionment was heralded even before Moyn's 2010 book. In a trenchant article of 2004 entitled "Who is the subject of the rights of man?", written when the US wars in Afghanistan and Iraq were at their height, Jacques Rancière begins by noting how the Rights of Man, or in more contemporary language, Human Rights, which were rejuvenated by the dissident movements of Eastern Europe and the Soviet Union in the 1970s and '80s, became transformed in the first decade of the twenty-first century into "the rights of the rightless, of the populations hunted out of their homes and land and threatened by ethnic slaughter. They appeared more and more as the rights of the victims, the rights of those who were unable to enact any rights or even any claims in their name, so that eventually their rights had to be upheld by others, at the cost of shattering the edifice of International Rights, in the name of a new right to 'humanitarian interference' – which ultimately boiled down to the right to invasion." "Human rights, the rights of the rightless" became for Rancière the ideological scaffolding for "humanitarian reason" at best and for "humanitarian intervention" at worst.

This prevalent mood of disillusionment and cynicism among many concerning human rights and humanitarian politics is understandable; but it is not defensible. Developments in international law since 1948 have tried to give new legal meaning to "human dignity" and "human rights". Admittedly, these developments have in turn generated the paradoxes of "humanitarian reason", but the way to work through these paradoxes is not to turn against the jus gentium, the law of nations, of our world; instead, we need a new conceptualization of the relationship between international law and emancipatory politics; a new way of understanding how to negotiate the "facticity" and the "validity" of the law, including international human rights and humanitarian law, such as to create new vistas for the political.

by Seyla Benhabib, Eurozine | Read more:
Image: U.S. Dept. of State via Wikipedia

Wednesday, May 25, 2016

UDub Women Clinch National Golf Championship


Washington caps its first NCAA Women's title with some killer celebrations

[ed. The University of Washington Huskies women's golf team defeated UCLA on Tuesday and Stanford today for their first NCAA National Golf Championship. What a great effort, and a real nail-biter! Congratulations to everyone, especially Mary Lou Mulflur, their coach of 33 years.]  

The Big Uneasy

What’s roiling the liberal-arts campus?

At Oberlin, it started in December, when the temperatures ran high, although the weeping willows and the yellow poplars that had flared in the fall were bare already. Problems had a tendency to escalate. There was, to name one thing, the food fight: students had noted the inauthenticity of food at the school’s Afrikan Heritage House, and followed up with an on-site protest. (Some international students, meanwhile, complained that cafeteria dishes such as sushi and bánh mì were prepared with the wrong ingredients, making a mockery of cultural cuisine.) There was scrutiny of the curriculum: a student wanted trigger warnings on “Antigone.” And there was all the world outside. A year earlier, a black boy named Tamir Rice, carrying a pellet gun, was killed by a police officer thirty miles east of Oberlin’s campus, and the death seemed to instantiate what students had been hearing in the classroom and across the widening horizons of their lives. Class and race mattered. Power in a system would privilege its authors. After a grand jury declined to indict Rice’s shooter, the prosecutor called the death a “perfect storm of human error.”

Weeks passed. Finals came and went. The media turned its attention to the approaching Iowa caucus, while on campus an unease spread like a cold front coming off the lake. In mid-December, a group of black students wrote a fourteen-page letter to the school’s board and president outlining fifty nonnegotiable demands for changes in Oberlin’s admissions and personnel policies, academic offerings, and the like. “You include Black and other students of color in the institution and mark them with the words ‘equity, inclusion and diversity,’ ” it said, “when in fact this institution functions on the premises of imperialism, white supremacy, capitalism, ableism, and a cissexist heteropatriarchy.”

The letter was delivered by hand, but it leaked onto the Internet, and some of the more than seven hundred students who had signed it were hit with threats and hate speech online from anonymous accounts. The president, Marvin Krislov, rejected the letter’s stance, urging “collaboration.”

All across Oberlin—a school whose norms may run a little to the left of Bernie Sanders—there was instead talk about “allyship”: a more contemporary answer to the challenges of pluralism. If you are a white male student, the thought goes, you cannot know what it means to be, say, a Latina; the social and the institutional worlds respond differently to her, and a hundred aggressions, large and small, are baked into the system. You can make yourself her ally, though—deferring to her experience, learning from her accounts, and supporting her struggles. You can reach for unity in difference. (...)

During this academic year, schools across the country have been roiling with activism that has seemed to shift the meaning of contemporary liberalism without changing its ideals. At Yale, the associate head of a residence balked at the suggestion that students avoid potentially offensive Halloween costumes, proposing in an e-mail that it smothered transgressive expression. Her remarks were deemed insensitive, especially from someone tasked with fostering a sense of community, and the protests that followed escalated to address broader concerns. At Claremont McKenna, a dean sparked outrage when she sent an e-mail about better serving students—those of color, apparently—who didn’t fit the school’s “mold,” and resigned. In mid-November, a thousand students at Ithaca College walked out to demand the resignation of the president, who, they said, hadn’t responded aggressively enough to campus racism. More than a hundred other schools held rallies that week.

Protests continued through the winter. Harvard renamed its “house masters” faculty deans, and changed its law-school seal, which originated as a slaveholder’s coat of arms. Bowdoin students were disciplined for wearing miniature sombreros to a tequila-themed party. The president of Northwestern endorsed “safe spaces,” refuges open only to certain identity groups. At Wesleyan, the Eclectic Society, whose members lived in a large brick colonnaded house, was put on probation for two years, partly because its whimsical scrapbook-like application overstepped a line. And when Wesleyan’s newspaper, the Argus, published a controversial opinion piece questioning the integrity of the Black Lives Matter movement, some hundred and seventy people signed a petition that would have defunded the paper. Sensitivities seemed to reach a peak at Emory when students complained of being traumatized after finding “trump 2016” chalked on sidewalks around campus. The Trump-averse protesters chanted, “Come speak to us, we are in pain!,” until Emory’s president wrote a letter promising to “honor the concerns of these students.”

Such reports flummoxed many people who had always thought of themselves as devout liberals. Wasn’t free self-expression the whole point of social progressivism? Wasn’t liberal academe a way for ideas, good and bad, to be subjected to enlightened reason? Generations of professors and students imagined the university to be a temple for productive challenge and perpetually questioned certainties. Now, some feared, schools were being reimagined as safe spaces for coddled youths and the self-defined, untested truths that they held dear. Disorientingly, too, none of the disputes followed normal ideological divides: both the activists and their opponents were multicultural, educated, and true of heart. At some point, it seemed, the American left on campus stopped being able to hear itself think.

by Nathan Heller, New Yorker |  Read more:
Image: Oliver Munday 

The Monkees

Uncanny Valley

Morale is down. We are making plenty of money, but the office is teeming with salespeople: well-groomed social animals with good posture and dress shoes, men who chuckle and smooth their hair back when they can’t connect to our VPN. Their corner of the office is loud; their desks are scattered with freebies from other start-ups, stickers and koozies and flash drives. We escape for drinks and fret about our company culture. “Our culture is dying,” we say gravely, apocalyptic prophets all. “What should we do about the culture?”

It’s not just the salespeople, of course. It’s never just the salespeople. Our culture has been splintering for months. Members of our core team have been shepherded into conference rooms by top-level executives who proceed to question our loyalty. They’ve noticed the sea change. They’ve noticed we don’t seem as invested. We don’t stick around for in-office happy hour anymore; we don’t take new hires out for lunch on the company card. We’re not hitting our KPIs, we’re not serious about the OKRs. People keep using the word paranoid. Our primary investor has funded a direct competitor. This is what investors do, but it feels personal: Daddy still loves us, but he loves us less.

We get ourselves out of the office and into a bar. We have more in common than our grievances, but we kick off by speculating about our job security, complaining about the bureaucratic double-downs, casting blame for blocks and poor product decisions. We talk about our IPO like it’s the deus ex machina coming down from on high to save us — like it’s an inevitability, like our stock options will lift us out of our existential dread, away from the collective anxiety that ebbs and flows. Realistically, we know it could be years before an IPO, if there’s an IPO at all; we know in our hearts that money is a salve, not a solution. Still, we are hopeful. We reassure ourselves and one another that this is just a phase; every start-up has its growing pains. Eventually we are drunk enough to change the subject, to remember our more private selves. The people we are on weekends, the people we were for years.

This is a group of secret smokers, and we go in on a communal pack of cigarettes. The problem, we admit between drags, is that we do care. We care about one another. We even care about the executives who can make us feel like shit. We want good lives for them, just like we want good lives for ourselves. We care, for fuck’s sake, about the company culture. We are among the first twenty employees, and we are making something people want. It feels like ours. Work has wedged its way into our identities, and the only way to maintain sanity is to maintain that we are the company, the company is us. Whenever we see a stranger at the gym wearing a T-shirt with our logo on it, whenever we are mentioned on social media or on a client’s blog, whenever we get a positive support ticket, we share it in the company chat room and we’re proud, genuinely proud.

But we see now that we’ve been swimming in the Kool-Aid, and we’re coming up for air. We were lucky and in thrall and now we are bureaucrats, punching at our computers, making other people — some kids — unfathomably rich. We throw our dead cigarettes on the sidewalk and grind them out under our toes. Phones are opened and taxis summoned; we gulp the dregs of our beers as cartoon cars approach on-screen. We disperse, off to terrorize sleeping roommates and lovers, to answer just one, two more emails before bed. Eight hours later we’ll be back in the office, slurping down coffee, running out for congealed breakfast sandwiches, tweaking mediocre scripts and writing halfhearted emails, throwing weary and knowing glances across the table.

I skim recruiter emails and job listings like horoscopes, skidding down to the perks: competitive salary, dental and vision, 401k, free gym membership, catered lunch, bike storage, ski trips to Tahoe, off-sites to Napa, summits in Vegas, beer on tap, craft beer on tap, kombucha on tap, wine tastings, Whiskey Wednesdays, Open Bar Fridays, massage on-site, yoga on-site, pool table, Ping-Pong table, Ping-Pong robot, ball pit, game night, movie night, go-karts, zip line. Job listings are an excellent place to get sprayed with HR’s idea of fun and a 23-year-old’s idea of work-life balance. Sometimes I forget I’m not applying to summer camp. Customized setup: design your ultimate work station with the latest hardware. Change the world around you. Help humanity thrive by enabling — next! We work hard, we laugh hard, we give great high-fives. We have engineers in TopCoder’s Top 20. We’re not just another social web app. We’re not just another project-management tool. We’re not just another payment processor. I get a haircut and start exploring.

Most start-up offices look the same — faux midcentury furniture, brick walls, snack bar, bar cart. Interior designers in Silicon Valley are either brand-conscious or very literal. When tech products are projected into the physical world they become aesthetics unto themselves, as if to insist on their own reality: the office belonging to a home-sharing website is decorated like rooms in its customers’ pool houses and pieds-à-terre; the foyer of a hotel-booking start-up has a concierge desk replete with bell (no concierge); the headquarters of a ride-sharing app gleams in the same colors as the app itself, down to the sleek elevator bank. A book-related start-up holds a small and sad library, the shelves half-empty, paperbacks and object-oriented-programming manuals sloping against one another. It reminds me of the people who dressed like Michael Jackson to attend Michael Jackson’s funeral.

But this office, of a media app with millions in VC funding but no revenue model, is particularly sexy. This is something that an office shouldn’t be, and it jerks my heart rate way, way up. There are views of the city in every direction, fat leather loveseats, electric guitars plugged into amps, teak credenzas with white hardware. It looks like the loft apartment of the famous musician boyfriend I thought I’d have at 22 but somehow never met. I want to take off my dress and my shoes and lie on the voluminous sheepskin rug and eat fistfuls of MDMA, curl my naked body into the Eero Aarnio Ball Chair, never leave.

It’s not clear whether I’m here for lunch or an interview, which is normal. I am prepared for both and dressed for neither. My guide leads me through the communal kitchen, which has the trappings of every other start-up pantry: plastic bins of trail mix and Goldfish, bowls of Popchips and miniature candy bars. There’s the requisite wholesale box of assorted Clif Bars, and in the fridge are flavored water, string cheese, and single-serving cartons of chocolate milk. It can be hard to tell whether a company is training for a marathon or eating an after-school snack. Once I walked into our kitchen and found two Account Managers pounding Shot Bloks, chewy cubes of glucose marketed to endurance athletes.

Over catered Afghan food, I meet the team, including a billionaire who made his fortune from a website that helps people feel close to celebrities and other strangers they’d hate in real life. He asks where I work, and I tell him. “Oh,” he says, not unkindly, snapping a piece of lavash in two, “I know that company. I think I tried to buy you.”

I take another personal day without giving a reason, an act of defiance that I fear is transparent. I spend the morning drinking coffee and skimming breathless tech press, then creep downtown to spend the afternoon in back-to-back interviews at a peanut-size start-up. All of the interviews are with men, which is fine. I like men. I had a boyfriend; I have a brother. The men ask me questions like, “How would you calculate the number of people who work for the United States Postal Service?” and “How would you describe the internet to a medieval farmer?” and “What is the hardest thing you’ve ever done?” They tell me to stand in front of the whiteboard and diagram my responses. These questions are self-conscious and infuriating, but it only serves to fuel me. I want to impress; I refuse to be discouraged by their self-importance. Here is a character flaw, my industry origin story: I have always responded positively to negging.

My third interview is with the technical cofounder. He enters the conference room in a crisp blue button-down, looking confidently unprepared. He tells me — apologetically — that he hasn’t done many interviews before, and as such he doesn’t have a ton of questions to ask me. Nonetheless, the office manager slated an hour for our conversation. This seems OK: I figure we will talk about the company, I will ask routine follow-up questions, and at four they will let me out for the day, like a middle school student, and the city will absorb me and my private errors. Then he tells me that his girlfriend is applying to law school and he’s been helping her prep. So instead of a conventional interview, he’s just going to have me take a section of the LSAT. I search his face to see if he’s kidding.

“If it’s cool with you, I’m just going to hang out here and check my email,” he says, sliding the test across the table and opening a laptop. He sets a timer.

I finish early, ever the overachiever. I check it twice. The cofounder grades it on the spot. “My mother would be so proud,” I joke, feeling brilliant and misplaced and low, lower than low.

by Anna Wiener, N+1 |  Read more:
Image: Jennifer Murphy, Gold and Black Circles. 2007

Tuesday, May 24, 2016

Bob Almighty

Dylan turns 75 on 24th May. For millions of devotees like myself—many of whom consider him the world’s greatest living artist—it is a moment of celebration tinged with apprehension. Joan Baez, his most significant early anointer-disciple (Joan the Baptist), best expresses what might be described as “the Dylan feeling” in the excellent Martin Scorsese 2005 documentary when she says: “There are no veils, curtains, doors, walls, anything, between what pours out of Bob’s hand on to the page and what is somehow available to the core of people who are believers in him. Some people would say, ‘not interested,’ but if you are interested, he goes way, way deep.” I love this for lots of reasons but most of all because it captures not only the religious devotion that many who love him feel, but also the bemused indifference of the sane and secular who do not.

Of course, the first order of business when writing about Dylan is to urge readers to ignore writers who write about Dylan. We are like Jehovah’s Witnesses, forever tramping door to door with our clumsy bonhomie and earnest smudgy leaflets; in all honesty, you would be much better off seeking out the resonant majesty of the actual work. Indeed, you’ll be relieved—and possibly endeared—to hear that Dylan himself considers his disciples to be deranged. “Why is it when people talk about me they have to go crazy?” Dylan asked in a recent interview for Rolling Stone. “What the fuck is the matter with them?”

I should say in passing that I am only mildly afflicted by comparison. There are tens of thousands of Dylan fans who are in a far more advanced state of insanity. Fervent purveyors of set-lists and bootlegs and best-of-performances; the blue-faced blogging battalions; the tens of millions who watch YouTube footage of him changing the lyrics to a song here or performing an unreleased track there. Soon these poor folk will be sifting the brand new 6,000-piece literary archive of his ephemera (acquired in March by the University of Tulsa, Oklahoma, for a rumoured $60m) for clues as to his state of mind sequestered in the addenda to his legal contracts. There are already hundreds of “Dylanologists” who like to listen to individual instrumental tracks of his gazillion bootleg recordings—“stems” as they are called—so as to focus in on his rhythm guitar playing or keyboards. Then there are the serial show-goers stretching all the way back to the Gaslight Café in New York in 1962. There’s no other songwriter that comes anywhere near this kind of… what? Devotion, loyalty, study, analysis, contemplation, regard, fixation.

There are many answers as to why this might be (answers to Dylan’s own question) but the most straightforward is found in the lyrical texture and complexity of his early work. Like TS Eliot or Walt Whitman, his words mesmerise, occlude and invite interpretation. The whole crazy-devotional interpretative approach to his oeuvre began with people simply trying to decode the meaning of his songs—something approximate to parsing “The Waste Land” or “Leaves of Grass.” Over time, that inquisitorial dynamic spread beyond the art and on into the artist himself until it had intensified to the point of absurdity.

And yet not quite absurdity with regard to the songs. Because Dylan rewards meditation and repeated listening like no other. Very few of his stanzas succeed as on-the-page poetry—he is a performing artist—but when he sings, he imbues his words with a significance that is somehow rich with multivalent meanings, many of which feel just out of reach. This is something to do with the startling originality and range of his poetic imagination as expressed through the quality and skill of his word selection and the tone and timbre of his sung delivery. As Baez implies, the listener either wheels away wincing or they must be forever drawn in deeper and deeper seeking to further understand, savour and construe. (In this way, Dylan’s work is a bit like Ludwig van Beethoven in classical music: it’s all or nothing and you can’t have him on in the background.)

To put it another way, there are hundreds of Dylan’s lines that precisely capture or enact deeply personal human feelings that then turn out to be capacious enough to capture or enact entirely different human feelings decades later. Some of this effect is the accidental by-product of his staggering facility with the language, but a lot more than he pretends is consciously designed. Certainly, it’s why people began to study him in the first place. To quote the man himself: “What drives [us] to you is what drives [us] insane.”

But I don’t want to attempt to unpack the mighty genius of Dylan’s writing here; that’s a subject for another lifetime… Instead, by way of celebration and in an attempt to explain to non-believers, I want to offer up for consideration some other aspects of Dylan’s life and work that are not routinely considered: five qualities that I find inspiring and that I have come to admire since his 60th when I last wrote about him in a birthday context. (...)

Self Reliance

Dylan is famously indifferent to what his critics, audience or commentators think, say, feel or want. Actually, indifference is an understatement since it suggests a relationship—even if denied. Dylan’s attitude towards the press and public might be more accurately characterised as being something approximate to the attitude of Pluto as to whether humankind decides to classify it as a planet or not. Indeed, the last time I saw him—at the London shows in autumn 2015—I realised midway through that there was nobody in the Royal Albert Hall who was less interested in Dylan than the man himself. Which is probably why he was singing so much Frank Sinatra.

To my ear, these were the worst concerts I had seen him do for many years. (Contrary to popular perception, Dylan diehards are more acutely aware than the critics about how awful he can be; we know—we were there.) Why croon for two hours when you yourself were the man who rid the world of all this saccharine Sinatran slush the first time around? And, if you must croon, why not deploy your own back catalogue, which contains dozens of far more beautiful and sophisticated love songs? The point is that Dylan doesn’t care what anyone thinks—least of all his audience—and probably hasn’t since roughly 1965.

Those of you who know something of Dylan lore might dimly recall that this is when he "went electric" and fans started booing and hissing and screaming "Judas!" (In fact he'd been electric before he was folk.) But Dylan didn't just exasperate and lose his audience once. He's done the same more or less every five years: he annoyed folk fans with rock music; rock fans with country music; country fans with cover-song crooning (Self Portrait in 1970 was the first time around for the crooning-Bob); cover-song-crooning-lovers with a caustic bitter-sweet divorce album; bitter-sweet-Bob-lovers with Christian-gospel-rock; Christian-gospel-rock fans with a Zionist phase; the entire Live Aid world audience by using the moment to get drunk and draw attention away from Africa to the plight of American farmers; the remaining loyalists with a "comeback" that then subsided into two albums of finger-picking early blues covers featuring songs like "Froggie Went A Courtin'." And so on. And so on.

He only really stopped annoying people in 1997 when he released the first of his late masterpieces, Time Out Of Mind. And that’s only because from around that date onward, people finally realised that he was always going to do whatever the hell he liked. In 2009, incidentally, Dylan put out an album of Christmas carols, which in my estimation has strong claims to be the worst album released by any artist in the history of recorded time.

by Edward Docx, Prospect |  Read more:
Image: John Cohen/Getty Images

Monday, May 23, 2016

Luxottica


[ed. The prices of glasses are insane. Here's the reason. See also: At Warby Parker, a Sense of Exclusion in a Low Price.]

Which Rock Star Will Historians of the Future Remember?

Classifying anyone as the “most successful” at anything tends to reflect more on the source than the subject. So keep that in mind when I make the following statement: John Philip Sousa is the most successful American musician of all time.

Marching music is a maddeningly durable genre, recognizable to pretty much everyone who has lived in the United States for any period. It works as a sonic shorthand for any filmmaker hoping to evoke the late 19th century and serves as the auditory backdrop for national holidays, the circus and college football. It’s not “popular” music, but it’s entrenched within the popular experience. It will be no less fashionable tomorrow than it is today.

And this entire musical idiom is now encapsulated in one person: John Philip Sousa. Even the most cursory two-sentence description of marching music inevitably cites him by name. I have no data on this, but I would assert that if we were to ask the entire population of the United States to name every composer of marching music they could think of, 98 percent of the populace would name either one person (Sousa) or no one at all. There’s just no separation between the awareness of this person and the awareness of this music, and it’s hard to believe that will ever change.

Now, the reason this happened — or at least the explanation we’ve decided to accept — is that Sousa was simply the best at this art. He composed 136 marches over a span of six decades and is regularly described as the most famous musician of his era. The story of his life and career has been shoehorned into the U.S. education curriculum at a fundamental level. (I first learned of Sousa in fourth grade, a year before we memorized the state capitals.) And this, it seems, is how mainstream musical memory works. As the timeline moves forward, tangential artists in any field fade from the collective radar, until only one person remains; the significance of that individual is then exaggerated, until the genre and the person become interchangeable. Sometimes this is easy to predict: I have zero doubt that the worldwide memory of Bob Marley will eventually have the same tenacity and familiarity as the worldwide memory of reggae itself.

But envisioning this process with rock music is harder. Almost anything can be labeled “rock”: Metallica, ABBA, Mannheim Steamroller, a haircut, a muffler. If you’re a successful tax lawyer who owns a hot tub, clients will refer to you as a “rock-star C.P.A.” when describing your business to less-hip neighbors. The defining music of the first half of the 20th century was jazz; the defining music of the second half of the 20th century was rock, but with an ideology and saturation far more pervasive. Only television surpasses its influence.

And pretty much from the moment it came into being, people who liked rock insisted it was dying. The critic Richard Meltzer supposedly claimed that rock was already dead in 1968. And he was wrong to the same degree that he was right. Meltzer’s wrongness is obvious and does not require explanation, unless you honestly think “Purple Rain” is awful. But his rightness is more complicated: Rock is dead, in the sense that its “aliveness” is a subjective assertion based on whatever criteria the listener happens to care about.

This is why the essential significance of rock remains a plausible thing to debate, as does the relative value of major figures within that system (the Doors, R.E.M., Radiohead). It still projects the illusion of a universe containing multitudes. But it won’t seem that way in 300 years.

The symbolic value of rock is conflict-based: It emerged as a byproduct of the post-World War II invention of the teenager, soundtracking a 25-year period when the gap between generations was utterly real and uncommonly vast. That dissonance gave rock music a distinctive, nonmusical importance for a long time. But that period is over. Rock — or at least the anthemic, metaphoric, Hard Rock Cafe version of big rock — has become more socially accessible but less socially essential, synchronously shackled by its own formal limitations. Its cultural recession is intertwined with its cultural absorption. As a result, what we’re left with is a youth-oriented music genre that a) isn’t symbolically important; b) lacks creative potential; and c) has no specific tie to young people. It has completed its historical trajectory. Which means, eventually, it will exist primarily as an academic pursuit. It will exist as something people have to be taught to feel and understand.

I imagine a college classroom in 300 years, in which a hip instructor is leading a tutorial filled with students. These students relate to rock music with no more fluency than they do the music of Mesopotamia: It’s a style they’ve learned to recognize, but just barely (and only because they’ve taken this specific class). Nobody in the room can name more than two rock songs, except the professor. He explains the sonic structure of rock, its origins, the way it served as cultural currency and how it shaped and defined three generations of a global superpower. He shows the class a photo, or perhaps a hologram, of an artist who has been intentionally selected to epitomize the entire concept. For these future students, that singular image defines what rock was.

So what’s the image?

Certainly, there’s one response to this hypothetical that feels immediate and sensible: the Beatles. All logic points to their dominance. They were the most popular band in the world during the period they were active and are only slightly less popular now, five decades later. The Beatles defined the concept of what a “rock group” was supposed to be, and all subsequent rock groups are (consciously or unconsciously) modeled upon the template they naturally embodied. Their 1964 appearance on “The Ed Sullivan Show” is so regularly cited as the genesis for other bands that they arguably invented the culture of the 1970s, a decade when they were no longer together. The Beatles arguably invented everything, including the very notion of a band’s breaking up. There are still things about the Beatles that can’t be explained, almost to the point of the supernatural: the way their music resonates with toddlers, for example, or the way it resonated with Charles Manson. It’s impossible to imagine another rock group where half its members faced unrelated assassination attempts. In any reasonable world, the Beatles are the answer to the question “Who will be the Sousa of rock?”

But our world is not reasonable. And the way this question will be asked tomorrow is (probably) not the same way we would ask it today.

In Western culture, virtually everything is understood through the process of storytelling, often to the detriment of reality. When we recount history, we tend to use the life experience of one person — the “journey” of a particular “hero,” in the lingo of the mythologist Joseph Campbell — as a prism for understanding everything else. That inclination works to the Beatles’ communal detriment. But it buoys two other figures: Elvis Presley and Bob Dylan. The Beatles are the most meaningful group, but Elvis and Dylan are the towering individuals, so eminent that I wouldn’t necessarily need to use Elvis’s last name or Dylan’s first.

Still, neither is an ideal manifestation of rock as a concept.

It has been said that Presley invented rock and roll, but he actually staged a form of primordial “prerock” that barely resembles the post-“Rubber Soul” aesthetics that came to define what this music is. He also exited rock culture relatively early; he was pretty much out of the game by 1973. Conversely, Dylan’s career spans the entirety of rock. Yet he never made an album that “rocked” in any conventional way (the live album “Hard Rain” probably comes closest). Still, these people are rock people. Both are integral to the core of the enterprise and influenced everything we have come to understand about the form (including the Beatles themselves, a group that would not have existed without Elvis and would not have pursued introspection without Dylan).

In 300 years, the idea of “rock music” being represented by a two‑pronged combination of Elvis and Dylan would be equitable and oddly accurate. But the passage of time makes this progressively more difficult. It’s always easier for a culture to retain one story instead of two, and the stories of Presley and Dylan barely intersect (they supposedly met only once, in a Las Vegas hotel room). As I write this sentence, the social stature of Elvis and Dylan feels similar, perhaps even identical. But it’s entirely possible one of them will be dropped as time plods forward. And if that happens, the consequence will be huge. If we concede that the “hero’s journey” is the de facto story through which we understand history, the differences between these two heroes would profoundly alter the description of what rock music supposedly was.

If Elvis (minus Dylan) is the definition of rock, then rock is remembered as showbiz. Like Frank Sinatra, Elvis did not write songs; he interpreted songs that were written by other people (and like Sinatra, he did this brilliantly). But removing the centrality of songwriting from the rock equation radically alters it. Rock becomes a performative art form, where the meaning of a song matters less than the person singing it. It becomes personality music, and the dominant qualities of Presley’s persona — his sexuality, his masculinity, his larger‑than‑life charisma — become the dominant signifiers of what rock was. His physical decline and reclusive death become an allegory for the entire culture. The reminiscence of the rock genre adopts a tragic hue, punctuated by gluttony, drugs and the conscious theft of black culture by white opportunists.

But if Dylan (minus Elvis) becomes the definition of rock, everything reverses. In this contingency, lyrical authenticity becomes everything; rock is somehow calcified as an intellectual craft, interlocked with the folk tradition. It would be remembered as far more political than it actually was, and significantly more political than Dylan himself. The fact that Dylan does not have a conventionally “good” singing voice becomes retrospective proof that rock audiences prioritized substance over style, and the portrait of his seven‑decade voyage would align with the most romantic version of how an eclectic collection of autonomous states eventually became a place called “America.”

These are the two best versions of this potential process. And both are flawed.

There is, of course, another way to consider how these things might unspool, and it might be closer to the way histories are actually built. I’m creating a binary reality where Elvis and Dylan start the race to posterity as equals, only to have one runner fall and disappear. The one who remains “wins” by default (and maybe that happens). But it might work in reverse. A more plausible situation is that future people will haphazardly decide how they want to remember rock, and whatever they decide will dictate who is declared its architect. If the constructed memory is a caricature of big‑hair arena rock, the answer is probably Elvis; if it’s a buoyant, unrealistic apparition of punk hagiography, the answer is probably Dylan. But both conclusions direct us back to the same recalcitrant question: What makes us remember the things we remember?

In 2014, the jazz historian Ted Gioia published a short essay about music criticism that outraged a class of perpetually outraged music critics. Gioia’s assertion was that 21st‑century music writing has devolved into a form of lifestyle journalism that willfully ignores the technical details of the music itself. Many critics took this attack personally and accused Gioia of devaluing their vocation. Which is odd, considering the colossal degree of power Gioia ascribes to record reviewers: He believes specialists are the people who galvanize history. Critics have almost no impact on what music is popular at any given time, but they’re extraordinarily well positioned to dictate what music is reintroduced after its popularity has waned.

“Over time, critics and historians will play a larger role in deciding whose fame endures,” Gioia wrote me in an email. “Commercial factors will have less impact. I don’t see why rock and pop will follow any different trajectory from jazz and blues.” He rattled off several illustrative examples: Ben Selvin outsold Louis Armstrong in the 1920s. In 1956, Nelson Riddle and Les Baxter outsold “almost every rock ’n’ roll star not named Elvis,” but they’ve been virtually erased from the public record. A year after that, the closeted gay crooner Tab Hunter was bigger than Jerry Lee Lewis and Fats Domino, “but critics and music historians hate sentimental love songs. They’ve constructed a perspective that emphasizes the rise of rock and pushes everything else into the background. Transgressive rockers, in contrast, enjoy lasting fame.” He points to a contemporary version of that phenomenon: “Right now, electronic dance music probably outsells hip‑hop. This is identical to the punk‑versus‑disco trade‑off of the 1970s. My prediction: edgy hip‑hop music will win the fame game in the long run, while E.D.M. will be seen as another mindless dance craze.”

Gioia is touching on a variety of volatile ideas here, particularly the outsize memory of transgressive art. His example is the adversarial divide between punk and disco: In 1977, the disco soundtrack to “Saturday Night Fever” and the Sex Pistols’ “Never Mind the Bollocks, Here’s the Sex Pistols” were both released. The soundtrack to “Saturday Night Fever” has sold more than 15 million copies; it took “Never Mind the Bollocks” 15 years to go platinum. Yet virtually all pop historiographers elevate the importance of the Pistols above that of the Bee Gees. The same year the Sex Pistols finally sold the millionth copy of their debut, SPIN magazine placed them on a list of the seven greatest bands of all time. “Never Mind the Bollocks” is part of the White House record library, supposedly inserted by Amy Carter just before her dad lost to Ronald Reagan. The album’s reputation improves by simply existing: In 1985, the British publication NME classified it as the 13th‑greatest album of all time; in 1993, NME made a new list and decided it now deserved to be ranked third. This has as much to do with its transgressive identity as its musical integrity. The album is overtly transgressive (and therefore memorable), while “Saturday Night Fever” has been framed as a prefab totem of a facile culture (and thus forgettable). For more than three decades, that has been the overwhelming consensus.

But I’ve noticed — just in the last four or five years — that this consensus is shifting. Why? Because the definition of “transgressive” is shifting. It’s no longer appropriate to dismiss disco as superficial. More and more, we recognize how disco latently pushed gay, urban culture into white suburbia, which is a more meaningful transgression than going on a British TV talk show and swearing at the host. So is it possible that the punk‑disco polarity will eventually flip? Yes. It’s possible everyone could decide to reverse how we remember 1977. But there’s still another stage here, beyond that hypothetical inversion: the stage in which everybody who was around for punk and disco is dead and buried, and no one is left to contradict how that moment felt. When that happens, the debate over transgressions freezes and all that is left is the music. Which means the Sex Pistols could win again or maybe they lose bigger, depending on the judge.

by Chuck Klosterman, NY Times |  Read more:
Image: Sagmeister & Walsh

Sunday, May 22, 2016

The Train That Saved Denver

A decade ago, travelers arriving at Denver’s sprawling new airport would look out over a vast expanse of flat, prairie dog-infested grassland and wonder if their plane had somehow fallen short of its destination. The $4.9 billion airport—at 53 square miles, larger than Manhattan—was derided as being “halfway to Kansas,” and given the emptiness of the 23-mile drive to the city, it felt that way.

Last month, arriving visitors boarded the first trains headed for downtown, a journey that zips past a new Japanese-style “smart city” emerging from the prairie before depositing passengers 37 minutes later in a bustling urban hive of restaurants, shops and residential towers that only six years ago was a gravelly no man’s land—an entire $2 billion downtown neighborhood that’s mushroomed up around the hub of Denver’s rapidly expanding light rail system.

The 22.8-mile spur from the airport to downtown is the latest addition to a regional rail system that has transformed Denver and its suburbs. Using an unprecedented public-private partnership that combines private funding, local tax dollars and federal grants, Denver has done something no other major metro area has accomplished in the past decade, though a number of cities have tried. At a moment when aging mass transit systems in several major cities are capturing headlines for mismanagement, chronic delays and even deaths, Denver is unveiling a shiny new and widely praised network: 68 stations along 10 different spurs, covering 98 miles, with another 15 miles still to come. Even before the new lines opened, 77,000 people were riding light rail each day, making it the eighth-largest system in the country even though Denver is not in the top 20 cities for population. The effects on the region’s quality of life have been measurable and also surprising, even to the project’s most committed advocates. Originally intended to unclog congested highways and defeat a stubborn brown smog that was as unhealthy as it was ugly, the new rail system has proven that its greatest value is the remarkable changes in land use its stations have prompted, from revitalizing moribund neighborhoods, like the area around Union Station, to creating new communities where once there was only sprawl or buffalo grass.

“We are talking about a culture-transforming moment,” says Denver mayor Michael Hancock. “Light rail has really moved Denver into the 21st century.”

“Our adolescence is over, and we’ve matured to adulthood,” he adds.

How the $7.6 billion FasTracks project saved Denver from a dreaded fate locals call “Houstonization” is the story of regional cooperation that required the buy-in of businesspeople, elected officials, civil servants and environmentalists across a region the size of Delaware. Their ability to work collectively—and the public’s willingness to approve major taxpayer investments—has created a transit system that is already altering Denver’s perception of itself, turning an auto-centric city into a higher-density, tightly-integrated urban center that aims to outcompete the bigger, older coastal cities on the global stage.

by Colin Woodard, Politico | Read more:
Image: Mark Peterson/Redux Pictures