Saturday, December 26, 2015


Physics dad
via:

Like a Prayer


Even secular people need time out to meditate, reflect, and give thanks. Is prayer the answer?

My soul – if I have one, which is still up for debate – is an angry misfit type of soul. It’s not a soul that likes cashew cheese or people who talk about their spirit animals. My soul likes a nice yoga class as much as the next soul, but it wishes the blankets there weren’t so scratchy, and that they’d play better music, and that the lady across the room wouldn’t chat nervously through the whole goddamn thing like her soul has been snorting crystal meth all morning. My soul would like for all the other souls to shut the fuck up once in a while.

My soul is not necessarily allergic to spirituality or to religion itself. It just feels suspicious towards bossy, patriarchal gods dreamt up by bossy patriarchs. Not that my soul doesn’t recognise that it’s a product of its environment! My soul is the first to admit that if my mother weren’t agnostic and I weren’t raised Catholic and I didn’t have a premature existential crisis after watching Horton Hears a Who! (1970) when I was eight, I could just go to church like all the other people who don’t like cashew cheese or wind chimes or men in linen pants. Then I could file into a pew and fold my hands in prayer and ask forgiveness for being such an irritable jackass. Unfortunately, my soul has spent lots of time with the Lord, and my soul is just not that into Him.

I’m not alone on that front. In Religion for Atheists (2012), the philosopher Alain de Botton writes that although religions have a lot to offer – they ‘deliver sermons, promote morality, engender a spirit of community, make use of art and architecture, inspire travels, train minds and encourage gratitude at the beauty of spring’ – it can be hard for atheists to reap those benefits.

We might not need to know why we’re here, but most of us want to feel like we’re in touch with something bigger than our own fluctuating moods and needs, and that we’re pointed in the right direction. But prayer isn’t just a spiritual version of Google Earth. Beyond asking for guidance or expressing gratitude, it can be a way of nudging our intentions toward action. As Philip and Carol Zaleski explain in Prayer: A History (2005), ‘Prayer is speech, but much richer than speech alone. It is a peculiar kind of speech that acts, and a peculiar kind of action that speaks to the depths and heights of being.’

That sounds like a pretty tall order, until you consider how fundamental prayer has been to humankind since prehistoric times. There’s some evidence that Neanderthals buried their dead surrounded by flowers, and scholars have suggested that engraved bones from the site at Laugerie Basse in southwestern France depict humans engaged in prayer. Prayer has been used to ask for protection or rainfall, for inspiration, answers or healing, as well as in thanks or celebration or mourning. Prayer can communicate adoration or devotion, ecstasy or ‘mystical union’ according to the Zaleskis, who must be Jeff Buckley fans. But however prayer is used, it makes simple sense that it should feel more received than invented. So where does that leave those of us intent on inventing a prayer for ourselves out of thin air?

by Heather Havrilesky, Aeon |  Read more:
Image: Vilhelm Hammershoi

Should AI Be Open?

All this likewise indubitably belonged to history, and would have to be historically assessed; like the Murder of the Innocents, or the Black Death, or the Battle of Paschendaele. But there was something else; a monumental death-wish, an immense destructive force loosed in the world which was going to sweep over everything and everyone, laying them flat, burning, killing, obliterating, until nothing was left…Nor have I from that time ever had the faintest expectation that, in earthly terms, anything could be salvaged; that any earthly battle could be won or earthly solution found. It has all just been sleep-walking to the end of the night.
   ~Malcolm Muggeridge

H.G. Wells’ 1914 sci-fi book The World Set Free did a pretty good job predicting nuclear weapons:
They did not see it until the atomic bombs burst in their fumbling hands…before the last war began it was a matter of common knowledge that a man could carry about in a handbag an amount of latent energy sufficient to wreck half a city
Wells’ thesis was that the coming atomic bombs would be so deadly that we would inevitably create a utopian one-world government to prevent them from ever being used. Sorry, Wells. It was a nice thought.

But imagine that in the 1910s and 1920s, the period’s intellectual and financial elites had started thinking really seriously along Wellsian lines. Imagine what might happen when the first nation – let’s say America – got the Bomb. It would be totally unstoppable in battle and could take over the entire world and be arbitrarily dictatorial. Such a situation would be the end of human freedom and progress.

So in 1920 they all pool their resources to create their own version of the Manhattan Project. Over the next decade their efforts bear fruit, and they learn a lot about nuclear fission. In particular, they learn that uranium is a necessary resource, and that the world’s uranium sources are few enough that a single nation or coalition of nations could obtain a monopoly upon them. The specter of atomic despotism is more worrying than ever.

They get their physicists working overtime, and they discover a variety of nuke that requires no uranium at all. In fact, once you understand the principles you can build one out of parts from a Model T engine. The only downside to this new kind of nuke is that if you don’t build it exactly right, its usual failure mode is to detonate on the workbench in an uncontrolled hyper-reaction that blows the entire hemisphere to smithereens. But it definitely doesn’t require any kind of easily controlled resource.

And so the intellectual and financial elites declare victory – no one country can monopolize atomic weapons now – and send step-by-step guides to building a Model T nuke to every household in the world. Within a week, both hemispheres are blown to very predictable smithereens.

II.

Some of the top names in Silicon Valley have just announced a new organization, OpenAI, dedicated to “advanc[ing] digital intelligence in the way that is most likely to benefit humanity as a whole…as broadly and evenly distributed as possible.” Co-chairs Elon Musk and Sam Altman talk to Steven Levy:
Levy: How did this come about? […] 
Musk: Philosophically there’s an important element here: we want AI to be widespread. There’s two schools of thought—do you want many AIs, or a small number of AIs? We think probably many is good. And to the degree that you can tie it to an extension of individual human will, that is also good. […]
Altman: We think the best way AI can develop is if it’s about individual empowerment and making humans better, and made freely available to everyone, not a single entity that is a million times more powerful than any human. Because we are not a for-profit company, like a Google, we can focus not on trying to enrich our shareholders, but what we believe is the actual best thing for the future of humanity. 
Levy: Couldn’t your stuff in OpenAI surpass human intelligence? 
Altman: I expect that it will, but it will just be open source and useable by everyone instead of useable by, say, just Google. Anything the group develops will be available to everyone. If you take it and repurpose it you don’t have to share that. But any of the work that we do will be available to everyone. 
Levy: If I’m Dr. Evil and I use it, won’t you be empowering me? 
Musk: I think that’s an excellent question and it’s something that we debated quite a bit. 
Altman: There are a few different thoughts about this. Just like humans protect against Dr. Evil by the fact that most humans are good, and the collective force of humanity can contain the bad elements, we think it’s far more likely that many, many AIs will work to stop the occasional bad actors than the idea that there is a single AI a billion times more powerful than anything else. If that one thing goes off the rails or if Dr. Evil gets that one thing and there is nothing to counteract it, then we’re really in a bad place.
Both sides here keep talking about who is going to “use” the superhuman intelligence a billion times more powerful than humanity, as if it were a microwave or something. Far be it from me to claim to know more than Sam Altman about anything, but I propose that the correct answer to “what would you do if Dr. Evil used superintelligent AI” is “cry tears of joy and declare victory”, because anybody at all having a usable level of control over the first superintelligence is so much more than we have any right to expect that I’m prepared to accept the presence of a medical degree and ominous surname.

A more Bostromian view would forget about Dr. Evil, and model AI progress as a race between Dr. Good and Dr. Amoral. Dr. Good is anyone who understands that improperly-designed AI could get out of control and destroy the human race – and who is willing to test and fine-tune his AI however long it takes to be truly confident in its safety. Dr. Amoral is anybody who doesn’t worry about that and who just wants to go forward as quickly as possible in order to be the first one with a finished project. If Dr. Good finishes an AI first, we get a good AI which protects human values. If Dr. Amoral finishes an AI first, we get an AI with no concern for humans that will probably cut short our future.

Dr. Amoral has a clear advantage in this race: building an AI without worrying about its behavior beforehand is faster and easier than building an AI and spending years testing it and making sure its behavior is stable and beneficial. He will win any fair fight. The hope has always been that the fight won’t be fair, because all the smartest AI researchers will realize the stakes and join Dr. Good’s team.

Open-source AI crushes that hope. Suppose Dr. Good and her team discover all the basic principles of AI but wisely hold off on actually instantiating a superintelligence until they can do the necessary testing and safety work. But suppose they also release what they’ve got on the Internet. Dr. Amoral downloads the plans, sticks them in his supercomputer, flips the switch, and then – as Dr. Good himself put it back in 1963 – “the human race has become redundant.”

The decision to make AI findings open source is a tradeoff between risks and benefits. The risk is letting the most careless person in the world determine the speed of AI research – because everyone will always have the option to exploit the full power of existing AI designs, and the most careless person in the world will always be the first one to take it. The benefit is that in a world where intelligence progresses very slowly and AIs are easily controlled, nobody will be able to use their sole possession of the only existing AI to garner too much power.

Unfortunately, I think we live in a different world – one where AIs progress from infrahuman to superhuman intelligence very quickly, very dangerously, and in a way very difficult to control unless you’ve prepared beforehand.

by Scott Alexander, Slate Star Codex |  Read more:
Image: Ex Machina

Friday, December 25, 2015

Lynch Mob

[ed. See also: At Harvard, Feelings Trump Knowledge.]

A Depression-era Lebanon Valley College leader with the last name Lynch has found himself thrust into the middle of a roiling 21st-century debate on campus civil rights.

Students at the private college in Annville have demanded administrators remove or modify Dr. Clyde A. Lynch's last name, as it appears on a campus hall, due to the associated racial connotations.

The demand was made at a forum on campus equality issues held Friday, capping a week of demonstrations calling for changes at the predominantly white institution. (...)

In the days that followed, commenters on pennlive.com leapt to defend Lynch, who served as the college's president from 1932 to 1950 when he died in office, saying he's been unfairly dragged into the fray by this modern-day movement.

A commenter going by the screen name "10xchamps," who identified himself as a recent graduate of the college, said "Anyone with half a brain would know that the name has nothing to do with racial connotations. It's the last name of a very generous donor who probably helped fund many of these students."

According to its website, Lynch led the college through the Great Depression and World War II, helping to raise $550,000 for a new physical education building which was named for him following his death. (...)

In response, student activists who made the demand said they'd be willing to settle for adding his first name and middle initial to the building instead of removing it altogether. At Friday's forum they acknowledged no known links between Dr. Clyde A. Lynch and the practice of "Lynching" but said as is, the building and last name harken back to a period in American history when Blacks were widely and arbitrarily killed by public hangings and "Lynch Mobs."  (...)

"I will not longer watch NFL football when John Lynch announces. Or watch Jane Lynch on TV. Too upsetting," a commenter named "gmaven" quipped.

by Colin Deppen, Penn Live |  Read more:
Image: via:

Applied Fashion

[ed. I don't know... is it just me, or is there a certain apprehension around party season this year?]

Have you ever worn something completely wrong for the occasion? I don’t mean social-death wrong: we don’t live in such judgmental, Victorian times. I mean embarrassing-wrong: so wrong you want to go home and change, so wrong you wish the floorboards would open and let you descend into darkness. It may seem a quaint question to ask when the Naked Rambler has appeared in Britain’s Court of Appeal wearing nothing at all. And certainly, getting it really wrong is increasingly rare at a time when presidents appear in open-necked shirts and Downing Street advisers wander about in socks. But it can still, on occasion, happen.

“Occasion” is the key word: there are invitations to certain events which, once accepted, mean you have to play your part just as much as if you were an actor on stage. That includes wearing, more or less, the right costume. Parties, like plays, need to create an atmosphere, to weave a touch of magic, in order to take flight. They are fragile, airy confections, like spun sugar or candy floss; they hold their shape if all the ingredients come together, but if not, they collapse into a gritty pile. That, more than the attempt to exclude socially, is why the dress code still exists.

Dress codes on invitations tend to give men clear instructions: “black tie”, “lounge suits”. Both are unambiguous. For women they’re just the broadest of clues. Hence the phone-a-friend call asking “What are you going to wear?”, a question which lays bare the need of social animals to fit in with their tribe. I’ve always thought it would make a feminist point to turn up in a black tie one day – just the tie, not the full tuxedo – but I’d never have the nerve. (And incidentally, the one time it doesn’t look chic for a woman to turn up in a well-cut tux is when the dress code actually is black tie: it reads as protest or parody, rather than stylishness and wit.) Other dress codes I’ve come across include “dress to party”, “summer chic” and “dress up”, as well as the familiar, oxymoronic “smart casual”. None of them is specific, not even for men. But when decoded they all mean the same: “Be comfortable. No need to go over the top. But please make an effort, because we have.”

by Rebecca Willis, More Intelligent Life |  Read more:
Image: via:

Thursday, December 24, 2015

Why America Is Moving Left

In the late ’60s and ’70s, amid left-wing militancy and racial strife, a liberal era ended. Today, amid left-wing militancy and racial strife, a liberal era is only just beginning.

Understanding why requires understanding why the Democratic Party—and more important, the country at large—is becoming more liberal.

The story of the Democratic Party’s journey leftward has two chapters. The first is about the presidency of George W. Bush. Before Bush, unapologetic liberalism was not the Democratic Party’s dominant creed. The party had a strong centrist wing, anchored in Congress by white southerners such as Tennessee Senator Al Gore, who had supported much of Ronald Reagan’s defense buildup, and Georgia Senator Sam Nunn, who had stymied Bill Clinton’s push for gays in the military. For intellectual guidance, centrist Democrats looked to the Democratic Leadership Council, which opposed raising the minimum wage; to The New Republic (a magazine I edited in the early 2000s), which attacked affirmative action and Roe v. Wade; and to the Washington Monthly, which proposed means-testing Social Security.

Centrist Democrats believed that Reagan, for all his faults, had gotten some big things right. The Soviet Union had been evil. Taxes had been too high. Excessive regulation had squelched economic growth. The courts had been too permissive of crime. Until Democrats acknowledged these things, the centrists believed, they would neither win the presidency nor deserve to. In the late 1980s and the 1990s, an influential community of Democratic-aligned politicians, strategists, journalists, and wonks believed that critiquing liberalism from the right was morally and politically necessary.

George W. Bush wiped this community out. Partly, he did so by rooting the GOP more firmly in the South—Reagan’s political base had been in the West—aiding the slow-motion extinction of white southern Democrats that had begun when the party embraced civil rights. But Bush also destroyed centrist Democrats intellectually, by making it impossible for them to credibly critique liberalism from the right.

In the late 1980s and the 1990s, centrist Democrats had argued that Reagan’s decisions to cut the top income-tax rate from 70 percent to 50 percent and to loosen government regulation had spurred economic growth. When Bush cut the top rate to 35 percent in 2001 and further weakened regulation, however, inequality and the deficit grew, but the economy barely did—and then the financial system crashed. In the late ’80s and the ’90s, centrist Democrats had also argued that Reagan’s decision to boost defense spending and aid the Afghan mujahideen had helped topple the Soviet empire. But in 2003, when Bush invaded Iraq, he sparked the greatest foreign-policy catastrophe since Vietnam.

If the lesson of the Reagan era had been that Democrats should give a Republican president his due, the lesson of the Bush era was that doing so brought disaster. In the Senate, Bush’s 2001 tax cut passed with 12 Democratic votes; the Iraq War was authorized with 29. As the calamitous consequences of these votes became clear, the revolt against them destroyed the Democratic Party’s centrist wing. “What I want to know,” declared an obscure Vermont governor named Howard Dean in February 2003, “is why in the world the Democratic Party leadership is supporting the president’s unilateral attack on Iraq. What I want to know is, why are Democratic Party leaders supporting tax cuts?” By year’s end, Dean—running for president against a host of Washington Democrats who had supported the war—was the clear front-runner for his party’s nomination.

With the Dean campaign came an intellectual revolution inside the Democratic Party. His insurgency helped propel Daily Kos, a group blog dedicated to stiffening the liberal spine. It energized the progressive activist group MoveOn. It also coincided with Paul Krugman’s emergence as America’s most influential liberal columnist and Jon Stewart’s emergence as America’s most influential liberal television personality. In 2003, MSNBC hired Keith Olbermann and soon became a passionately liberal network. In 2004, The New Republic apologized for having supported the Iraq War. In 2005, The Huffington Post was born as a liberal alternative to the Drudge Report. In 2006, Joe Lieberman, the Democratic Party’s most outspoken hawk, lost his Democratic Senate primary and became an Independent. In 2011, the Democratic Leadership Council—having lost its influence years earlier—closed its doors.

By the time Barack Obama defeated Hillary Clinton for the Democratic presidential nomination in 2008, in part because of her support for the Iraq War, the mood inside the party had fundamentally changed. Whereas the party’s most respected thinkers had once urged Democrats to critique liberal orthodoxy, they now criticized Democrats for not defending that orthodoxy fiercely enough. The presidency of George W. Bush had made Democrats unapologetically liberal, and the presidency of Barack Obama was the most tangible result.

But that’s only half the story. Because if George W. Bush’s failures pushed the Democratic Party to the left, Barack Obama’s have pushed it even further. If Bush was responsible for the liberal infrastructure that helped elect Obama, Obama has now inadvertently contributed to the creation of two movements—Occupy and Black Lives Matter—dedicated to the proposition that even the liberalism he espouses is not left-wing enough.

by Peter Beinart, The Atlantic |  Read more:
Image: uncredited

Binge Reading Disorder

In 2008, Nicholas Carr wrote an article in the Atlantic called “Is Google Making Us Stupid?”—famous enough to merit its own Wikipedia page—in which he argues that the abundance of information the internet provides is diminishing our ability to actually comprehend what we read. Every article written about the article that I found mentioned this particular quote: “My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.”

Perhaps the reason Carr had to discard his flippers is because the sea just got too big and too populated for him to actually see anything. When you encounter so many sentences a day, even if they are well constructed, intelligent, and seemingly memorable, how do you actually remember one intelligent thought when a thousand others are clamoring for your attention?

A UC San Diego report published in 2009 suggests the average American’s eyes cross 100,500 words a day—text messages, emails, social media, subtitles, advertisements—and that figure was based on 2008 data. Data collected by the marketing company Likehack tells us that the average social media user “reads”—or perhaps just clicks on—285 pieces of content daily, an estimated 54,000 words. If that’s true, we are reading a novel slightly longer than The Great Gatsby every day.

Of course, the word “read” is rather diluted in this instance. You can peruse or you can skim, and it’s still reading. I spoke with writer and avid reader John Sherman about his online reading habits. “Sometimes, when I say I read an article,” said Sherman, “what I actually mean is I read a tweet about that article.” He is hardly alone in this. Using information collected from the data analysis firm Chartbeat, Farhad Manjoo writes at Slate that there is a very poor correlation between how far a reader scrolls down in an article and whether he or she shares the article on Twitter. In fact, people are more likely to tweet a link to an article if they have not read it fully. “There is so much content out there, capital c, and a lot of it overlaps,” Sherman said. “It takes less time to respond to an idea than a complete argument.”

It takes even less time to respond to an idea or argument with somebody else’s article. Have you read this? No, but that’s like what I read in this other piece. Perhaps nothing depicts this exchange better than a particular Portlandia skit, in which Fred Armisen and Carrie Brownstein rat-a-tat back and forth about what they’ve read, begin tearing the pages out of a magazine and stuffing them in their mouths, and when they run across the street to lunge for a Yellow Pages, they get hit by a car. “Hey, can’t you read?” yells the driver.

Reading is a nuanced word, but the most common kind of reading is likely reading as consumption: where we read, especially on the Internet, merely to acquire information. Information that stands no chance of becoming knowledge unless it “sticks.”

by Nikkitha Bakshani, TMN | Read more:
Image: Rok Hodej

Flying Business Class as a Millennial

I flew business class for the first time in my life last week. It was an overnight, 10-hour flight for a work trip.

STOP, do not click the comment button. I am not a luxurious person! I don’t own designer anything. I hail from a family of proud “Dr. Thunder” drinkers.

The thing is, going into this trip I was already exhausted. All the items on those Internet “self-care” guides—showering, going outside—had fallen right off my increasingly lengthy to-do list. I knew sitting upright with my elbow touching another human for longer than a standard workday would only wear me out further, that I would “wake up” from this flight more wrecked than possibly ever, and that I would immediately have to jump into several days of marathon interviews.

Also, I love sleep. I love it so much, in fact, that it's a wonder I’m not better at it. I can’t sleep in cars, or in most hotels, or even in my own bed whenever work is going badly, or when it’s going suspiciously well. I can’t sleep after I eat a big meal, or after I accidentally say “you too!” back to someone who wished me a happy birthday. And I definitely, without a doubt, can’t sleep on planes.

The prospect of all of these forces converging made my brain feel like it was going to liquefy and dribble out through my nostrils.

So when the one-word question—“Upgrade?”—popped up on the check-in computer at Washington Reagan, I thought I would honor the spirit of the airport’s namesake by at least looking into the best thing unfettered capitalism has ever visited on mankind: business class.

Don’t worry, my momma raised me right: When the ticket-checker told me the cost of upgrading, I played hardball.

“I dunnnooooo,” I said, “that’s a liiiiiiiiiittle pricey.”

“Let me know what you decide,” he said, turning back to his computer. I excused myself to Google Wall Street Journal stories about what constitutes a good deal when upgrading. The price he was quoting me was hundreds of dollars less.

“Okay fine I’ll take it.”

One credit-card swipe later (so easy!) the man's attitude toward me brightened considerably. “Okay, as a first-class passenger, you now have access to the Admiral lounge.”

“What’s that?”

“Just go in that little black elevator to a special room. It’s one of your perks.”

I did so. Inside the wood-paneled room are old people, guys who look like they could be start-up founders, and women who look like they could be actresses. ’Tis not an ordinary path that leads to the Admiral lounge.

People were having extremely quiet in-person conversations and extremely expletive-filled phone calls. My fellow Admirals gave me the side-eye, but I flashed my business-class boarding pass at them, Pretty Woman-style. (Except of course it looked just like a regular boarding pass so the effect was diminished somewhat.)

I spent my time sending decisive-sounding emails and chugging a free glass of wine. When they announced my flight, I got to wait in the “priority” line, rather than the clearly inferior “main cabin” line immediately to its right.

Below is a brief log from inside the aircraft:

by Olga Khazan, The Atlantic |  Read more:
Image: Mary Altaffer / AP

Magda Indigo, Poinsettia
via:

Wednesday, December 23, 2015

Children of the Yuan Percent: Everyone Hates China’s Rich Kids

[ed. They'll get ruthless soon enough. It's the nature of wealth.]

Emerging from a nightclub near Workers’ Stadium in Beijing at 1:30 a.m. on a Saturday in June, Mikael Hveem ordered an Uber. He selected the cheapest car option and was surprised when the vehicle that rolled up was a dark blue Maserati. The driver, a young, baby-faced Chinese man, introduced himself as Jason. Hveem asked him why he was driving an Uber—he obviously didn’t need the cash. Jason said he did it to meet people, especially girls. Driving around late at night in Beijing’s nightclub district, he figured he’d find the kind of woman who would be charmed by a clean-cut 22-year-old in a sports car.

When I heard this story from a friend who had also been in the car, I asked for the driver’s contact info. I introduced myself to Jason over WeChat, China’s popular mobile app, and asked for an interview. He replied immediately with a screen shot that included photos of women in various states of undress. “Best hookers in bj :),” he added. I explained there had been a misunderstanding, and we arranged to have coffee.

When we met at a cafe in Beijing’s business district, it was clear that Jason, whose surname is Zhang, was different from other young Chinese. He had a job, at a media company that produced reality TV shows, but didn’t seem especially busy. He’d studied in the U.S., but at a golf academy in Florida, and he’d dropped out after two years. His father was the head of a major HR company, and his mother was a government official. He wore a $5,500 IWC watch because, he said, he’d lost his expensive one. I asked him how much money he had. “I don’t know,” he said. “More than I can spend.” So this was it: I had found, in the wild, one of the elusive breed known in China as the fuerdai, or “second-generation rich.” (...)

It’s no surprise that most fuerdai, after summering in Bali and wintering in the Alps, reading philosophy at Oxford and getting MBAs from Stanford, are reluctant to take over the family toothpaste cap factory. Ping Fan, 36, who serves as executive deputy director of Relay, moved to Shanghai to start his own investment firm rather than work at his father’s real estate company in Liaoning province. He picked Shanghai, he said, “because it was far from my family.” After graduating from Columbia University, Even Jiang, 28, briefly considered joining her mother’s diamond import business, but they disagreed about the direction of the company. Instead, she went to work at Merrill Lynch, then returned to Shanghai to start a concierge service, inspired by the American Express service she used when living in Manhattan. Liu Jiawen, 32, whose parents own a successful clothing company in Hunan province, tried to start her own clothing line after graduating. “I wanted to show I could do it on my own,” she said. The company failed.

Along with riches, fuerdai often inherit a surplus of emotional trauma. The first generation of Chinese entrepreneurs came of age during a time that rewarded callousness. “They were the generation of the Cultural Revolution,” said Wang. “During that time, there was no humanity.” His grandfather, the principal of a middle school in Guizhou province, was humiliated by Red Guards. “They were raised cruelly—there was no mercy. It was survival of the fittest.” Many fuerdai have their parents’ same coldness, Wang said: “They’re really hard to be friends with.”

Zhang, the Uber driver, was sent to boarding school starting in kindergarten, even though his parents lived only a short distance from the school. Perhaps to compensate for their inattention, they gave him everything he wanted, including hundreds of toy cars. Last Christmas he bought himself the Maserati. “It’s like their childhood has not ended,” Wang said of his fellow rich kids. “Their childhood was not fully satisfied, so they always want to prolong the process of being children.” Thanks to China’s one-child policy, most fuerdai grew up without siblings. That’s why so many travel in packs on Saturday nights, Wang said. “They want to be taken care of. They want to be loved.”

For Zhang, partying is a way of staving off boredom. He used to go out clubbing five nights a week. “If I didn’t go, I couldn’t sleep,” he said. He doesn’t lack for companionship, he added. Two or three times a week, he’ll hire a high-end sex worker—a “booty call,” in his words—for $1,000 or more. Zhang prefers paying for sex to flirting with a girl under the pretense that he might date her. “This way is more direct,” he said. “I think this is a way of respecting women.” But some nights, sitting at home alone, he scrolls through the contacts on his phone only to reach the bottom without finding anyone he wants to call. When we first spoke, he said he had a girlfriend of three years who treated him well, but that he didn’t love her. “You’re the first person I’ve told that to,” he said.

Most fuerdai don’t talk about their problems so openly. “They have trust issues,” said Wayne Chen, 32, a second-generation investor from Shanghai. “They need a place to talk. They need a group.” Relay offers a setting in which they can speak honestly, without having to pretend. “It’s similar to a rehab center,” he said.

by Christopher Beam, Bloomberg | Read more:
Image: Ka Xiaoxi

Superman of Havana

The mayor’s son drew on his cigarette, thought back sixty years, paused, and made a chopping motion on his lower thigh—fifteen inches, give or take, from his groin to just above his knee. “The women said, ‘He has a machete.’”

The mayor’s son is in his seventies now, but he was a teenager back then, during the years of Havana’s original sin. He thought back to his father as a young man, a lotto numbers runner who rose to the mayoralty of the gritty Barrio de Los Sitios, in Centro Habana. His dad loved mingling with the stars that flocked to the capital, and he sometimes took his boy to meet them: Brando, Nat King Cole, and that old borrachón Hemingway. The mayor’s son once got blind drunk with Benny Moré, the famous Cuban crooner who had a regular gig at the Guadalajara.

But more revered than all the rest was the man of many names. El Toro. La Reina. The Man With the Sleepy Eyes. Outside Cuba, from Miami to New York to Hollywood, he was known simply as Superman. The mayor’s son never met the legendary performer, but everybody knew about him. The local boys talked about his gift. They gossiped about the women, the sex. “Like when you’re coming of age, reading your dad’s Playboys. That’s what the kids talked about,” he said. “The idea that this man was around in the neighborhood, it was mind-boggling in a way.”

Superman was the main attraction at the notorious Teatro Shanghai, in Barrio Chino—Chinatown. According to local lore, the Shanghai featured live sex shows. “If you’re a decent guy from Omaha, showing his best girl the sights of Havana, and you make the mistake of entering the Shanghai, you’ll curse Garcia and will want to wring his neck for corrupting the morals of your sweet baby,” Suppressed, a tabloid magazine, wrote in its 1957 review of the club.

After the revolution, the Shanghai shuttered. Many of the performers fled the country. Superman disappeared, like a ghost. No one knew his real name. There were no known photos of him. A man who was once famous well beyond Cuba’s shores—who was later fictionalized in The Godfather Part II and Graham Greene’s Our Man in Havana—was largely forgotten, a footnote in a sordid history.

In the difficult years that followed, people didn’t talk about those times, as if they never happened at all. “You didn’t want to make problems with the government,” the mayor’s son said. “People were afraid. People didn’t want to look back. Afterward, it was an entirely new story. It was like everything didn’t exist before. It was like Year Zero.”

And into that void, the story of Superman disappeared.

by Mitch Moxley, Roads and Kingdoms |  Read more:
Image: Michael Magers

Sting feat. Robert Downey Jr.


Daniel Craig
via:

Tuesday, December 22, 2015

Common, The Co-Living Startup

[ed. Paying more for less. Seems to be a 'common' theme in America these days.]

Common, a co-living startup from General Assembly co-founder Brad Hargreaves, is unveiling its first building today in Brooklyn’s Crown Heights. With more than four floors and 7,300 square feet of space, the building has 19 private bedrooms costing anywhere from $1,800 to $1,950. Along with the private rooms come four communal kitchens, a large dining room, work space and a roof deck.

The Common opening comes at a time when venture-backed companies like WeWork are piling into co-living as a way to use urban residential space more cost-efficiently and to attract Millennials, who are putting off marriage and families later and later. In New York, older tenant laws control the number of tenants that can be listed on a lease, and brokers often charge upwards of a month’s rent to find apartments. That makes it difficult for newcomers to find housing easily compared to other American cities. On the property owner side, Common’s pitch is that they can partner to purchase whole vacant buildings and turn them into stable, market-rate income streams while removing the hassles of leasing and property management.

Over the summer, Common partnered with a local New York City real estate developer to buy the Crown Heights building. They invested a little less than $1 million in remodeling the space.

They kept four suites or units intact, but opened up large dining and work areas. “The whole idea here is to use common areas and activate typically under-utilized space,” Hargreaves told me in a video tour via Skype.

Common built in several smart-home features, like Bluetooth door locks compatible with keycards, phones and the Apple Watch, and Nest thermostats. Through Hargreaves’ connections, they added mattresses from the startup Casper, along with furniture from Restoration Hardware and West Elm.

“Aesthetically, I would say it’s mid-century Modern with some Hudson Valley Americana built into it,” he said. “We wanted to evoke the neighborhood as well. A lot of the art is from Crown Heights and the furniture are things you would find in a traditional Brownstone.”

Services include free laundry, regular deliveries of coffee, tea and paper towels, and weekly cleanings in bathrooms and common areas. Utilities and Wi-Fi are baked into the price.

For the communal element, Common is bringing in Sunday potlucks and other kinds of event programming. Hargreaves partnered with his old General Assembly co-founder Matthew Brimer, who went on to create the morning dance party Daybreaker. They’re bringing Common residents to the next one.

by Kim-Mai Cutler, Techcrunch |  Read more:
Image: uncredited

How One Doctor Changed Football Forever

[ed. The article that inspired the movie Concussion to be released later this week.]

On a foggy, steel gray Saturday in September 2002, Bennet Omalu arrived at the Allegheny County coroner’s office and got his assignment for the day: Perform an autopsy on the body of Mike Webster, a professional football player. Omalu did not, unlike most 34-year-old men living in a place like Pittsburgh, have an appreciation for American football. He was born in the jungles of Biafra during a Nigerian air raid, and certain aspects of American life puzzled him. From what he could tell, football was rather a pointless game, a lot of big fat guys bashing into each other. In fact, had he not been watching the news that morning, he may not have suspected anything unusual at all about the body on the slab.

The coverage that week had been bracing and disturbing and exciting. Dead at 50. Mike Webster! Nine-time Pro Bowler. Hall of Famer. "Iron Mike," legendary Steelers center for fifteen seasons. His life after football had been mysterious and tragic, and on the news they were going on and on about it. What had happened to him? How does a guy go from four Super Bowl rings to...pissing in his own oven and squirting Super Glue on his rotting teeth? Mike Webster bought himself a Taser gun, used that on himself to treat his back pain, would zap himself into unconsciousness just to get some sleep. Mike Webster lost all his money, or maybe gave it away. He forgot. A lot of lawsuits. Mike Webster forgot how to eat, too. Soon Mike Webster was homeless, living in a truck, one of its windows replaced with a garbage bag and tape.

Omalu loved the brain. Of all the organs in the body, it was easily his favorite. He thought of it sort of like Miss America. Such a diva! So high-maintenance: It requires more energy to operate than any other organ. The brain! That was his love and that was his joy, and that’s why his specialty was neuropathology. (...)

Omalu stared at Mike Webster’s brain. He kept thinking, How did this big athletic man end up so crazy in the head? He was thinking about football and brain trauma. The leap in logic was hardly extreme. He was thinking, Dementia pugilistica? "Punch-drunk syndrome," they called it in boxers. The clinical picture was somewhat like Mike Webster’s: severe dementia—delusion, paranoia, explosive behavior, loss of memory—caused by repeated blows to the head. Omalu figured if chronic bashing of the head could destroy a boxer’s brain, couldn’t it also destroy a football player’s brain? Could that be what made Mike Webster crazy?

Of course, football players wear helmets, good protection for the skull. But the brain? Floating around inside that skull and, upon impact, sloshing into its walls. Omalu thought: I’ve seen so many cases of people like motorcyclists wearing helmets. On the surface is nothing, but you open the skull and the brain is mush.

So Omalu carried Mike Webster’s brain to the cutting board and turned it upside down and on its side and then over again. It appeared utterly normal. Regular folds of gray matter. No mush. No obvious contusions, like in dementia pugilistica. No shrinkage like you would see in Alzheimer’s disease. He reviewed the CT and MRI scans. Normal. That might have been the end of it. He already had a cause of death. But Omalu couldn’t let it go. He wanted to know more about the brain. There had to be an answer. People don’t go crazy for no reason. (...)

It was late, maybe midnight, when Bob Fitzsimmons, a lawyer working in a renovated firehouse in Wheeling, West Virginia, got a call from the Pittsburgh coroner’s office. It was not unusual for him to be at the office that late; he was having a bad week. He struggled to understand the man’s accent on the phone, jutted his head forward. "Excuse me? You need what?"

The brain. Permission from the Webster family to process Mike Webster’s brain for microscopic examination.

Oh brother was Fitzsimmons’s initial thought. As if the Webster case wasn’t already complicated enough.

Fitzsimmons had first met Webster back in 1997, when he showed up at his office asking for help untangling his messed-up life. Webster was a hulk of a man with oak-tree arms and hands the size of ham hocks. Fitzsimmons shook his hand and got lost in it, mangled fingers going every which way, hitting his palm in creepy places that made him flinch. It seemed like every one of those fingers had been broken many times over. Mike Webster sat down and told Fitzsimmons what he could remember about his life. He had been to perhaps dozens of lawyers and dozens of doctors. He really couldn’t remember whom he’d seen or when. He couldn’t remember if he was married or not. He had a vague memory of divorce court. And Ritalin. Lots of Ritalin.

"With all due respect, you’re losing your train of thought, sir," Fitzsimmons said to Webster. "You appear to have a serious illness, sir." Not a pleasant thing to tell anyone, and here was a hero, a famous football player Fitzsimmons once bowed to, as did all young guys worth the Terrible Towels they proudly waved in the 1970s. The Dynasty! The black and the gold! It fueled optimism here, up and down the rivers, mill towns held tight in the folds of the Allegheny Mountains. And here was Iron Mike himself.

As a personal-injury lawyer, Fitzsimmons thought what he saw in Webster was an obvious case of a man suffering a closed-head injury—the kind he’d seen plenty of times in people who had suffered through car crashes and industrial accidents. No fracture, no signs of physical damage to the skull, but sometimes severe psychiatric problems, memory loss, personality changes, aggressive behavior.

"Please help me," Mike Webster said.

It took Fitzsimmons a year and a half to hunt down all of Webster’s medical records, scattered in doctors’ offices throughout western Pennsylvania and West Virginia. He sent Webster for four separate medical evaluations, and all four doctors confirmed Fitzsimmons’s suspicion: closed-head injury as a result of multiple concussions.

Fitzsimmons filed the disability claim with the NFL. There are several levels of disability with the NFL, and Mike Webster was awarded the lowest one: partial, about $3,000 a month.

Fitzsimmons said, "Oh, please." He said if ever there was a guy who qualified for the highest, it was Mike Webster. The highest level was "total disability, football-related," reserved for those who were disabled as a result of playing the game. It would yield Webster as much as $12,000 a month. Fitzsimmons said to the NFL, "Four doctors—all with the same diagnosis!"

The NFL said no. Four doctors were not enough. They wanted Webster seen by their own doctor. So their own doctor examined Webster...and concurred with the other four: closed-head injury. Football-related.

The NFL pension board voted unanimously for partial disability anyway.

Fitzsimmons said, "You have got to be kidding me." He filed an appeal with the U.S. District Court in Baltimore, where the pension board is headquartered. The judge reversed the decision of the NFL pension board—the first time in history any such action had been taken against the NFL.

And yet still the NFL fought. They took the case to federal court. They said Mike Webster—who had endured probably 25,000 violent collisions during his career and now was living on Pringles and Little Debbie pecan rolls, who was occasionally catatonic, in a fetal position for days—they said Mike Webster didn’t qualify for full disability.

Mike Webster and Bob Fitzsimmons grew close during those days. In fact, Mike Webster clung to Fitzsimmons like a baby to his mamma. He took to sleeping in the parking lot, waiting for Fitzsimmons to show up for work. He would stay there all day, just watching, waiting, and when Fitzsimmons would go home, Mike Webster would go back to his truck and write him letters. Hundreds and hundreds of letters. "Dear Bob, Thank you for helping me. We’ve got to keep up the fight. We have to see this thing through." And then he would start talking about wars. And blood splattering. The letters would inevitably trail off into the mutterings of a madman.

And now he was dead.

Bob Fitzsimmons did not know what in the world to say, in 2002, to the man with the thick accent who called from the Pittsburgh coroner’s office, four days after Mike Webster died of a heart attack, asking to study Webster’s brain. Fitzsimmons was, in truth, grieving his client’s death deeply; Mike Webster had been living for nothing but the case, the appeal, the last victory against a multibillion-dollar entertainment industry that seemed to have used him, allowed him to become destroyed, and then threw him away like a rotten piece of meat.

And now he was dead.

"Yes," Fitzsimmons said. And he gave Omalu the brain.

by Jeanne Marie Laskas, GQ |  Read more:
Image: Nick Veasay