Tuesday, May 5, 2015

America’s Economy is a Nightmare of Our Own Making

For the past quarter-century I’ve offered in articles, books, and lectures an explanation for why average working people in advanced nations like the United States have failed to gain ground and are under increasing economic stress: Put simply, globalization and technological change have made most of us less competitive. The tasks we used to do can now be done more cheaply by lower-paid workers abroad or by computer-driven machines.

My solution—and I’m hardly alone in suggesting this—has been an activist government that raises taxes on the wealthy, invests the proceeds in excellent schools and other means people need to become more productive, and redistributes to the needy. These recommendations have been vigorously opposed by those who believe the economy will function better for everyone if government is smaller and if taxes and redistributions are curtailed.

While the explanation I offered a quarter-century ago for what has happened is still relevant—indeed, it has become the standard, widely accepted explanation—I’ve come to believe it overlooks a critically important phenomenon: the increasing concentration of political power in a corporate and financial elite that has been able to influence the rules by which the economy runs. And the governmental solutions I have propounded, while I believe them still useful, are in some ways beside the point because they take insufficient account of the government’s more basic role in setting the rules of the economic game.

Worse yet, the ensuing debate over the merits of the “free market” versus an activist government has diverted attention from how the market has come to be organized differently from the way it was a half-century ago, why its current organization is failing to deliver the widely shared prosperity it delivered then, and what the basic rules of the market should be. It has allowed America to cling to the meritocratic tautology that individuals are paid what they’re “worth” in the market, without examining the legal and political institutions that define the market. The tautology is easily confused for a moral claim that people deserve what they are paid. Yet this claim has meaning only if the legal and political institutions defining the market are morally justifiable.

Most fundamentally, the standard explanation for what has happened ignores power. As such, it lures the unsuspecting into thinking nothing can or should be done to alter what people are paid because the market has decreed it.

The standard explanation has allowed some to argue, for example, that the median wage of the bottom 90 percent—which for the first 30 years after World War II rose in tandem with productivity—has stagnated for the last 30 years, even as productivity has continued to rise, because middle-income workers are worth less than they were before new software technologies and globalization made many of their old jobs redundant. They therefore have to settle for lower wages and less security. If they want better jobs, they need more education and better skills. So hath the market decreed.

Yet this market view cannot be the whole story because it fails to account for much of what we have experienced. For one thing, it doesn’t clarify why the transformation occurred so suddenly. The divergence between productivity gains and the median wage began in the late 1970s and early 1980s, and then took off. Yet globalization and technological change did not suddenly arrive at America’s doorstep in those years. What else began happening then?

Nor can the standard explanation account for why other advanced economies facing similar forces of globalization and technological change did not succumb to them as readily as the United States. By 2011, the median income in Germany, for example, was rising faster than it was in the United States, and Germany’s richest 1 percent took home about 11 percent of total income, before taxes, while America’s richest 1 percent took home more than 17 percent. Why have globalization and technological change widened inequality in the United States to a much greater degree?

Nor can the standard explanation account for why the compensation packages of the top executives of big companies soared from an average of 20 times that of the typical worker 40 years ago to almost 300 times. Or why the denizens of Wall Street, who in the 1950s and 1960s earned comparatively modest sums, are now paid tens or hundreds of millions annually. Are they really “worth” that much more now than they were worth then?

Finally, and perhaps most significantly, the market explanation cannot account for the decline in wages of recent college graduates. If the market explanation were accurate, college graduates would command higher wages in line with their greater productivity. After all, a college education was supposed to boost personal incomes and maintain American prosperity.

To be sure, young people with college degrees have continued to do better than people without them. In 2013, Americans with four-year college degrees earned 98 percent more per hour on average than people without a college degree. That was a bigger advantage than the 89 percent premium that college graduates earned relative to non-graduates five years before, and the 64 percent advantage they held in the early 1980s.

But since 2000, the real average hourly wages of young college graduates have dropped. The entry-level wages of female college graduates have dropped by more than 8 percent, and those of male graduates by more than 6.5 percent. To state it another way, while a college education has become a prerequisite for joining the middle class, it is no longer a sure means for gaining ground once admitted to it. That’s largely because the middle class’s share of the total economic pie continues to shrink, while the share going to the top continues to grow.

A deeper understanding of what has happened to American incomes over the last 25 years requires an examination of changes in the organization of the market. These changes stem from a dramatic increase in the political power of large corporations and Wall Street to change the rules of the market in ways that have enhanced their profitability, while reducing the share of economic gains going to the majority of Americans.

This transformation has amounted to a redistribution upward, but not as “redistribution” is normally defined. The government did not tax the middle class and poor and transfer a portion of their incomes to the rich. The government undertook the upward redistribution by altering the rules of the game.

by Robert Reich, Salon |  Read more:
Image: uncredited

Deco Japan


K. Kotani, Songbook for "The Modern Song" (Modan Bushi), 1930
From: Deco Japan: Shaping Art and Culture, 1920-1945

Sunday, May 3, 2015

Rodrigo y Gabriela

The End of California?

In a normal year, no one in California looks twice at a neighbor’s lawn, that mane of bluegrass thriving in a sun-blasted desert. Or casts a scornful gaze at a fresh-planted almond grove, saplings that now stand accused of future water crimes. Or wonders why your car is conspicuously clean, or whether a fish deserves to live when a cherry tree will die.

Of course, there is nothing normal about the fourth year of the great drought: According to climate scientists, it may be the worst arid spell in 1,200 years. For all the fields that will go fallow, all the forests that will catch fire, all the wells that will come up dry, the lasting impact of this drought for the ages will be remembered, in the most exported term of California start-ups, as a disrupter.

“We are embarked upon an experiment that no one has ever tried,” said Gov. Jerry Brown in early April, in ordering the first mandatory statewide water rationing for cities.

Surprising, perhaps even disappointing to those with schadenfreude for the nearly 39 million people living in year-round sunshine, California will survive. It’s not going to blow away. The economy, now on a robust rebound, is not going to collapse. There won’t be a Tom Joad load of S.U.V.s headed north. Rains, and snow to the high Sierra, will eventually return.

But California, from this drought onward, will be a state transformed. The Dust Bowl of the 1930s was human-caused, after the grasslands of the Great Plains were ripped up, and the land thrown to the wind. It never fully recovered. The California drought of today is mostly nature’s hand, diminishing an Eden created by man. The Golden State may recover, but it won’t be the same place.

Looking to the future, there is also the grim prospect that this dry spell is only the start of a “megadrought,” made worse by climate change. California has only about one year of water supply left in its reservoirs. What if the endless days without rain become endless years?

In the cities of a changed California, brown is the new green. A residential lawn anywhere south of, say, Sacramento, is already considered an indulgence. “If the only person walking on your lawn is the person mowing it,” said Felicia Marcus, chairwoman of the State Water Resources Control Board, then maybe it should be taken out. The state wants people to convert lawns to drought-tolerant landscaping, or fake grass.

Artificial lakes filled with Sierra snowmelt will become baked-mud valleys, surrounded by ugly bathtub rings. Some rivers will dry completely — at least until a normal rain year. A few days ago, there was a bare trickle from the Napa, near the town of St. Helena, flowing through some of the most valuable vineyards on the planet. The state’s massive plumbing system, one of the biggest in the world, needs adequate snow in order to serve farmers in the Central Valley and techies in Silicon Valley. This year, California set a record low Sierra snowpack in April — 5 percent of normal — following the driest winter since records have been kept.

To Californians stunned by their bare mountains, there was no more absurd moment in public life recently than when James Inhofe, the Republican senator from Oklahoma who is chairman of the environment and public works committee, held up a snowball in February as evidence of America’s hydraulic bounty in the age of climate change.

You can see the result of endless weeks of cloudless skies in New Melones Lake, here in Calaveras County in the foothills east of the Central Valley, where Mark Twain made a legend of a jumping frog. The state’s fourth largest reservoir, holding water for farmers, and for fish downstream, is barely 20 percent full. It could be completely drained by summer’s end.

It’s a sad sight — a warming puddle, where the Stanislaus River once ran through it. At full capacity, with normal rainfall, New Melones should have enough water for nearly two million households for a year.

Even worse is the Lake McClure reservoir, impounding the spectral remains of the Merced River as it flows out of Yosemite National Park. It’s at 10 percent of capacity. In a normal spring, the reservoir holds more than 600,000 acre-feet of water. As April came to a close, it was at 104,000 acre-feet — with almost no snowmelt on the way. (The measurement is one acre filled to a depth of a foot, or 325,851 gallons.) That’s the surface disruption in a state that may soon be unrecognizable in places.
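For a sense of scale, here is a quick back-of-the-envelope conversion (an illustration only, using the article's own definition of the acre-foot as 325,851 gallons) of Lake McClure's late-April holdings into gallons:

\[
104{,}000 \ \text{acre-feet} \times 325{,}851 \ \tfrac{\text{gallons}}{\text{acre-foot}} \approx 3.4 \times 10^{10} \ \text{gallons}
\]

That is about 34 billion gallons on hand, against roughly 196 billion gallons (600,000 acre-feet) in a normal spring.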

The morality tale behind California’s verdant prosperity will most certainly change. In the old narrative, the evil city took water from powerless farmers. Swimming pools in greater Los Angeles were filled with liquid that could have kept orchards alive in the Owens Valley, to the north.

It was hubris, born in the words of the city’s chief water engineer, William Mulholland, when he opened the gates of the Los Angeles Aqueduct in 1913 with an immortal proclamation: “There it is. Take it.”

by Timothy Egan, NY Times |  Read more:
Image: Ken Light/Contact Press Images

Borb

Obsessed with Parkour

On the Thursday morning I arrive in London, my phone pings with a message from a teenage mother and school dropout named Shirley Darlington: "Kilburn tube station, 7pm."

I get there 10 minutes early, but about 20 women are already warming up, including the British movie actress Christina Chong and her sister Lizzi, a professional dancer.

Every Thursday, Shirley emails the 100 or so members of her all-female Parkour crew, revealing the secret location for that night's challenge. She keeps the venue a surprise so her crew never knows what to expect, and she keeps guys away because the biggest threat to Parkour - as even Parkour's all-male founders would agree - is testosterone.

"Young guys turn up, and lots of times all they want is the flash and not the fundamentals," says Dan Edwardes, the master instructor who gave Shirley her start. "They want to backflip off a wall and leap around on rooftops. With a group of lads, you'll get the show-off, the questioner, the giddy one. But in a women's group, there's none of that. It's very quiet. They get to it."

What I was after was even more fundamental than the fundamentals. I'd only come to Parkour by accident while chasing the secrets of the most remarkable athletes of our time - World War Two resistance fighters.

I'd become fascinated with the underground when I realised it wasn't made up of hardened soldiers. Often, they were civilians hiding behind enemy lines who had to live off the land while attacking Hitler's forces in gruelling hit-and-run operations.

Take Crete - on that small Greek island, farmers and shopkeepers were joined by British academics and poets who'd essentially been recruited just because they knew Ancient Greek. For the next four years, these misfits routinely covered ultra-marathon distances over mountain peaks and pulled off feats of strength and endurance that would stagger an Olympic athlete. I wanted to learn - what was their secret? And could I master it too?

One clue came from Samuel Gridley Howe, an American medic who joined the Greek Revolution in the early 1800s. Howe was amazed by the way Greek fighters seemed to bounce along the landscape, using so little effort that they barely needed food or rest. "A Greek soldier," Howe commented, "will march, or rather skip, all day among the rocks, expecting no other food than a biscuit and a few olives, or a raw onion, and at night, lies down content upon the ground with a flat rock for a pillow." (...)

To me, this sounded remarkably similar to what I knew of Parkour, the French street art of using the body's natural elastic recoil to leap and flow across the urban outback.

I wondered if Parkour's pioneers - seven French street kids who called themselves the "Yamakasi" and learned their basic moves from a survivor of colonial jungle fights in French-occupied Asia - might have rediscovered the same ancient athletic principle which allowed the Cretans to cover fantastic distances with remarkably little effort.

by Christopher McDougall, BBC | Read more:
Image: Ben Curwen

Saturday, May 2, 2015


Morpher Folding Helmet Technology
via:

The Rage of the Jilted Crowdfunder

“Thank you all very much,” Update 57 concluded, by way of goodbye. “Working on this project was the most ambitious and meaningful undertaking any of us have ever attempted. Getting to know all of you, and working to create some seriously cool technology, was one of the most rewarding things we’ve ever done. We are deeply and truly sorry that despite our best efforts, we were not able to get this machine across the finish line. Love, Gleb, Igor and Janet, Team ZPM.”

It had been three long years of gradual disappointment since the 1,500 or so supporters of ZPM Espresso — otherwise known as the PID-Controlled Espresso Machine project on the crowdfunding platform Kickstarter — each put a few hundred dollars, or some $370,000 in total, into the campaign, and eight months since the last communiqué from the project’s creators. Now, with Update 57 in January, ZPM Espresso announced that it was winding itself down. For the backers who expected a ZPM machine for their pledge, there would be neither fulfillment nor refunds. All accumulated moneys, the update said, had been spent on the nonrecoverable engineering costs involved in ZPM’s failed attempt to manufacture an inexpensive commercial-grade espresso machine for the home market.

Ian Woodhouse, the 44-year-old director of operations for a Toronto real estate developer, was one of ZPM’s earliest and most ardent backers. Three years on, though, no new blow could surprise him. The update represented exactly what he had long come to expect from the creators. It was evasive and opaque. There was no clear explanation for the company’s insolvency. Woodhouse was especially nettled by that valediction: “love.” What he wanted, he told me later, was not another update “signed ‘love.’ They always signed their updates ‘love.’ ” He could see that it seemed like a peculiar fixation, but the word was so disingenuous and cloying that it made him angry. “Notice,” he instructed me, “how I keep bringing up the ‘love’ thing.” It reminded him that what ought to have been a straightforward financial transaction had somehow left him feeling taken advantage of and betrayed.

Powerless to act individually, the backers began, in Update 57’s wake, to organize. Their first step was a Facebook group, “Ripped off by ZPM Espresso,” but Woodhouse feared it wasn’t private: The principals of ZPM, who seemed to him at once inept and robust in their malfeasance, might easily monitor the activity there. So Woodhouse, along with a few others among the more persistent backers, instead set up a private forum on the messaging service Slack, where they shared their sense of affliction and pondered legal or moral redress. The legal options were limited. Although Kickstarter’s terms of use stipulate that any creators unable to satisfy the terms of their agreement with their backers might be subject to legal action, no sane attorney would initiate a class-action suit on a contingency-fee basis against insolvent creators, and no sane backer would ante up the necessary legal fees.

The other alternative was a consumer-protection suit filed by a state attorney general, but for that to proceed, the backers would need evidence of actual fraud, which they spent an enormous amount of time and energy trying to uncover. As Woodhouse, who speaks in the tone of a reasonable man drawn hopelessly against his will into a vast conspiracy, put it to me, “We found out a whole bunch of interesting information.” He ticked off the names of four lawyers, an unpaid accountant and two Silicon Valley investors. He detailed what he had dug up in the corporate-filings section of the State of Georgia’s website, and on a shady-seeming portal seeking to bundle Chinese angel investments. By his estimations, ZPM had ultimately raised $1.2 million, all of it gone and unaccounted for. The ZPM founders, the backers discovered in their attempt to serve up (at the very least) a small-claims action, were lying low; they had left Atlanta and absconded to San Francisco. (...)

Since Kickstarter’s debut in 2009, campaigns on the platform have raised $1.4 billion for the creators of more than 80,000 projects. In the process, the company and its crowdfunding competitors have invented a new sort of economic relationship, and a corresponding frontier of Internet acrimony. Disgruntled crowdfunders are not your typical Internet-commenting degenerates: In ZPM’s case, they are affluent, well-educated professionals, working in New York and Los Angeles as systems analysts, TV directors, research physicians specializing in adrenal pathology — the sorts of people you would expect to write off their $250 donation as a gamble gone sour. Yet they found, for reasons that weren’t always clear to them, that they couldn’t. A professor in Columbia’s graduate school of architecture wrote, on the private Slack forum, that he considered Polyakov to have “neither humility nor shame.” He continued, “I also think it entirely appropriate that he never work in technology, finance, consulting or the coffee fields (sorry, that kills the barista career) again.”

The rancor is due, perhaps, to a fundamental confusion about what crowdfunding really is. On one hand, a backer is not a customer, because the product does not exist yet and may never; Kickstarter is constantly reminding its patrons that the platform is not a store. (On some level, backers must already know this, or else they wouldn’t be backers; if Ian Woodhouse had just wanted an inexpensive espresso machine, the top seller on Amazon retails for $86 and has thousands of five-star reviews.)

On the other hand, though, neither is a backer an investor, even if many of ZPM’s backers insisted they be treated as such. A Kickstarter pledge does not buy a portion of a company. Backers do not sit on the board; they are not enfranchised to review the company’s audited financials. Investors’ interests, at least ideally, are aligned with those of the company, whereas nothing in the crowdfunding relationship ties a backer to the company for the long term. Moreover, the last thing Kickstarter wants to deal with is S.E.C. regulations.

by Gideon Lewis-Kraus, NY Times |  Read more:
Image: Mark Mahaney

Friday, May 1, 2015

The Waterboys



[ed. More great Waterboys here and here and here.]

In Praise of Vulgar Feminism

Just prior to the publication of Kim Gordon’s memoir Girl in a Band, a characteristic controversy broke out on the internet. Among the people disparaged in the book is the young singer Lana Del Rey. “Today we have someone like Lana Del Rey,” Gordon writes, in summing up the fallen state of things (since the ’90s), “who doesn’t even know what feminism is, who believes it means women can do whatever they want, which, in her world, tilts toward self-destruction, whether it’s sleeping with gross older men or being a transient biker queen. . . . Naturally, it’s just a persona. If she really truly believes it’s beautiful when young musicians go out on a hot flame of drugs and depression, why doesn’t she just off herself?” Del Rey’s fans got wind of the insult and duly commenced to trash Gordon on Twitter, whom they had clearly never heard of. Gordon, for her part, retweeted the worst of the abuse.

Faced with a choice between the bassist of Sonic Youth and the nihilist nymphet Lana Del Rey and her army of Twitter defenders, the highbrow music fan knows whose side she’s on. And it’s not as if Gordon is wrong about Del Rey, whose embrace of American rock and roll myths, shot through with a cartoonish sense of female desire, really is infantile. The appeal of Kim Gordon is completely different. She came from the New York art world of the early ’80s, co-founded one of the most admired bands of all time with her boyfriend and eventual husband Thurston Moore, and has now written an honest memoir about the whole thing. She’s one of the most respected personalities in rock music, who somehow obtained a license in the world of male-dominated culture to combine the impossible—to be both sexy and smart, mature and attractive, a mother and an artist, confrontational and political and also eternally “cool.” How many women are able to do this in music or pop culture, or at all? Not many.

Which makes me think: Isn’t Del Rey, precisely through her disturbing, masochistic fantasies of rape, mental abuse, and violent sex, and on top of that her adolescent rejection of feminism (“feminism doesn’t interest me as an idea,” she’s told interviewers on several occasions), a better icon for our time? Don’t her words and lyrics say more about the contemporary position of women than the mature, self-confident, and in the end somewhat commonplace pronouncements of Kim Gordon?

Feminism and class (and taste) are interesting categories through which to approach Gordon’s memoir. The book, which describes in detail her acrimonious breakup with Moore and the disbanding of Sonic Youth, must have been very hard to write. The beginning and the end, especially, are full of details of the brutal breakup, with the band carrying on for several months of touring after the fact. This makes for disturbing reading, and Gordon handles it with a wry sense of humor. But I can’t let go of a feeling that everything falls into place rather too easily, and feminism, class, and taste have a lot to do with it. Again: Gordon’s memoir is compelling—even gripping—and well-written, with atmospheric images and disarming honesty. But throughout something feels not quite right. Gordon writes with an absolute sureness of self that enables her to reminisce with the same confidence about her art and musical output as the artistic circles she has lived in for the last thirty-odd years, and also to rebuke the likes of Del Rey, Courtney Love, and several other women, including Lydia Lunch and Madonna. The women Gordon likes and admires are Kathleen Hanna, Kim Deal, her friend and Free Kitten bandmate and fashion label co-founder Julie Cafritz, Chloe Sevigny, and Sofia Coppola—a classy, laconic bunch.

While I was reading Girl in a Band, I was also reading a new book in the 33 1/3 series on Hole’s 1994 album Live Through This, by the Australian author Anwen Crawford. Crawford hasn’t a negative thing to say about her subject and the album’s creator, Courtney Love. She challenges the persistent public hatred of Love and the accusation that she “killed” her husband and Nirvana frontman Kurt Cobain. She furthermore makes a formidable case for the album itself, presenting it as a manifesto of positive, alternative, grassroots feminism, a feminism that has nothing to do with positive adjustment, good taste, or middle-class-ness, and in which self-confidence is born of exclusion—for being a woman, for being queer, for living on the periphery. Given the mutual history of Gordon and Love, whose paths crossed constantly in the early ’90s, it’s tempting to use the coincidence of these two books’ publication to talk about that specific moment in history, a moment crucial for Sonic Youth, Nirvana, and Hole, and for the whole accession of “alternative” music to the mainstream in the United States. (...)

The differences of perception between Courtney Love and Kim Gordon were, and remain, profoundly a matter of taste, which is to say of class. Courtney Love never said that she came from a working class or poor background, and stressed a few times that she didn’t. (Love’s mother was a psychotherapist and her father was the first manager of the Grateful Dead.) But she was kitschy, exhibitionistic, shameless, and at the same time vulnerable and ready to show it. Love came from a “complicated” family background. She grew up without much attention, and was passed from relative to relative, and traveled as a teenager to the UK to follow around Liverpool bands Echo & the Bunnymen and The Teardrop Explodes. In the ’80s she worked as a jobbing actress and a stripper. Side by side, Gordon and Love represent mirror images of the Nineties—of music, femininity, feminism, and politics. If Gordon was tastefully highbrow, Love was lowbrow, “distasteful”: the disgraced widow, widely regarded as someone who was, if not directly responsible for her husband’s death, then at least insufficiently “helpful,” who was too mad, too freakish, too much of a selfish junkie careerist to look after her suffering husband. But that’s not how her fans saw her. (...)

In the end Gordon created a space in her music where irony toward her own experiences or masks could protect her from her fears. Love confronted her traumas in the opposite way. Her act wasn’t to hide from the menace; it was to become that menace herself. Her voice is not one of beauty, but it’s powerful: she’s giving everything she has, until she can’t speak anymore. It’s funny how Gordon dismisses Courtney as somebody exploiting suffering (as in “Doll Parts,” where Love compares herself to a dismembered, killed doll, yet the one “with the most cake”), as if it were impossible for a woman to fake it and get away with it. But at the same time, Courtney lived through it, through the hate and contempt of the people around her, and still managed to create compelling music. If she often seemed like an attention-craving jackass, it’s because she actually refused to think she should behave any differently from the way men in rock behave.

by Agata Pyzik, N+1 |  Read more:
Image: Album art from Hole's Live Through This

Twitter at the Crossroads

Twitter as we know it is over. While the early release of ugly revenue numbers sent the company’s stock spiraling Tuesday, the actual quarterly earnings report that followed that afternoon was even worse. Twitter is acquiring users more slowly, particularly on mobile. It is failing to monetize these users as well as expected. And it is tapping other companies like Google, with whom it will partner to take advantage of its DoubleClick ad-serving platform, for lifelines. As a consequence, the ultimate value of the social network’s nearly 300 million users is looking significantly lower than previously thought. Twitter is well aware of these factors. Its recent actions signal that it is trying to redefine its business, not as a service that monetizes its users, but as a crowdsourced media platform and advertising agency—a dangerous bet that is unlikely to pay off.

Twitter’s strength is being the pulse of the Internet, the place where news gets broken in 140-character messages, where important topics start trending the second they enter the collective hivemind, and where politicians and celebrities and thinkers of all stripes can make announcements without the bother of a press release or the filter of the media. Yet this has always made Twitter Janus-faced: Is it a real-time news aggregator or a social network? More importantly, how will it make money? The conventional wisdom was once that Twitter would monetize its users by showing them ads that are extremely relevant to them. It is now obvious that Twitter’s future does not lie in a Facebook-like model, but in something else entirely. Twitter sees its user base, whose growth is flattening, not as customers but as content producers. In which case, who are its customers?

On the earnings call, Twitter CEO Dick Costolo specifically attributed the quarter’s revenue shortfall ($436 million, short of a projected $440 million to $450 million) to the underperformance of some of Twitter’s new “direct response” advertising products. Examples include Twitter’s “mobile-app install” ads, which offer a direct link to install an advertiser’s app. While Twitter hasn’t mentioned any new direct-response strategies, Costolo said the company hopes to boost its advertising revenue by acquiring TellApart, “a leading marketing technology company providing retailers and e-commerce advertisers with unique cross-device retargeting capabilities.” This is a bad omen. When Google bought DoubleClick, it was buying DoubleClick’s utter dominance in the advertising sector, not its technology. TellApart doesn’t have that kind of dominance; Twitter’s purchase will get it TellApart’s technology and consumer profiles. Neither merits a headline announcement in an earnings report.

But the acquisition of TellApart does tell us something: that Twitter is now trying to monetize nonusers, people who don’t have Twitter accounts but might read it anyway. That’s because Twitter is running up against the limitations of its interface, which makes monetizing active Tweeters difficult. For starters, Twitter can’t collect useful data about its users the way Facebook can; its 140-character limit hampers it. And because Twitter is public, people keep less of their lives on it. Tweets are less valuable as a key to profiling users than they are as an attractor of eyeballs. I’ve previously discussed the unique advantages and disadvantages of the public-private hybrid discourse produced on Twitter, but the flip side is what that unique hybrid amounts to in terms of monetization: It’s mostly a downside.

by David Auerbach, Slate | Read more:
Image: Natalie Matthews-Ramo

Ramil Gilvanov
via:

The Saudi Royal Family Shakeup

The House of Saud, one of the world’s largest and richest royal families, experienced a quiet coup within its ranks shortly before dawn on Wednesday. King Salman canned his Crown Prince and appointed a tough security official as the new heir. He named as second-in-line to the throne a young son with limited experience. And he removed the world’s longest serving foreign minister, who was responsible for building the alliance between Riyadh and Washington under seven American Presidents since 1975.

A longer list of abrupt royal decrees was announced in an early-morning television bulletin. Senior princes were then assembled at a Riyadh palace to pledge loyalty to the new order of succession. The shakeup, which concentrates power in a conservative wing of the vast royal family, could shape policy in the world’s largest oil exporter for decades.

The apparent goal was to signal renewed vigor amid deepening turmoil in and around the country. Last month, Saudi Arabia mobilized a ten-nation coalition to intervene in neighboring Yemen’s war, to the south. That has not gone well. To the north, the Kingdom is also part of the U.S.-led coalition running daily air strikes against the Islamic State, which has defiantly held on to huge chunks of Iraq and Syria. And this week the government announced the arrests of ninety-three militants who were allegedly plotting against security targets, foreign residential compounds, and the U.S. Embassy. Most are Saudis.

The decrees were all the more startling because the Kingdom just went through a big transition in January, when King Abdullah died, after two decades in power. Usually, the Saudis move slowly and with consensus. Usually, age takes precedence, no matter the ailments of the senescent first generation of princes sired by the Kingdom’s founder, the warrior Abdulaziz Al-Saud. Sequence was honored even when lining up at royal events.

King Salman instead removed his youngest half-brother and turned the Kingdom decisively over to the next generation of princes, the founder’s grandsons. He also skipped over hundreds who had seniority among them. (The royal family has somewhere around seven thousand princes and princesses.) Salman turns eighty this year. The question is whether the new precedent of forsaking promises and leap-frogging royals might, in turn, be used against Salman’s appointees after he dies—and whether it might end up generating more uncertainty than stability.

by Robin Wright, New Yorker |  Read more:
Image: Hamad I Mohammed/Reuters via Landov 

Why Girls Love the Dadbod


[ed. See also: Dadbod: A New Word for a Timeless Physique]

In case you haven't noticed lately, girls are all about that dad bod. I hadn't heard about this body type until my roommate mentioned it. She used to be crazy over guys she claimed had the dad bod. After observing the guys she found attractive, I came to understand this body type well and was able to identify it. The dad bod is a nice balance between a beer gut and working out. The dad bod says, "I go to the gym occasionally, but I also drink heavily on the weekends and enjoy eating eight slices of pizza at a time." It's not an overweight guy, but it isn't one with washboard abs, either.

The dad bod is a new trend and fraternity boys everywhere seem to be rejoicing. It turns out that skipping the gym for a few brews last Thursday after class worked in their favor. While we all love a sculpted guy, there is just something about the dad bod that makes boys seem more human, natural, and attractive. Here are a few reasons that girls are crazy about the dad bod:

by Mackenzie Pearson, The Odyssey |  Read more:
Image: uncredited

Thursday, April 30, 2015

The Future of College?

On a Friday morning in April, I strapped on a headset, leaned into a microphone, and experienced what had been described to me as a type of time travel to the future of higher education. I was on the ninth floor of a building in downtown San Francisco, in a neighborhood whose streets are heavily populated with winos and vagrants, and whose buildings host hip new businesses, many of them tech start-ups. In a small room, I was flanked by a publicist and a tech manager from an educational venture called the Minerva Project, whose founder and CEO, the 39-year-old entrepreneur Ben Nelson, aims to replace (or, when he is feeling less aggressive, “reform”) the modern liberal-arts college.

Minerva is an accredited university with administrative offices and a dorm in San Francisco, and it plans to open locations in at least six other major world cities. But the key to Minerva, what sets it apart most jarringly from traditional universities, is a proprietary online platform developed to apply pedagogical practices that have been studied and vetted by one of the world’s foremost psychologists, a former Harvard dean named Stephen M. Kosslyn, who joined Minerva in 2012.

Nelson and Kosslyn had invited me to sit in on a test run of the platform, and at first it reminded me of the opening credits of The Brady Bunch: a grid of images of the professor and eight “students” (the others were all Minerva employees) appeared on the screen before me, and we introduced ourselves. For a college seminar, it felt impersonal, and though we were all sitting on the same floor of Minerva’s offices, my fellow students seemed oddly distant, as if piped in from the International Space Station. I half expected a packet of astronaut ice cream to float by someone’s face.

Within a few minutes, though, the experience got more intense. The subject of the class—one in a series during which the instructor, a French physicist named Eric Bonabeau, was trying out his course material—was inductive reasoning. Bonabeau began by polling us on our understanding of the reading, a Nature article about the sudden depletion of North Atlantic cod in the early 1990s. He asked us which of four possible interpretations of the article was the most accurate. In an ordinary undergraduate seminar, this might have been an occasion for timid silence, until the class’s biggest loudmouth or most caffeinated student ventured a guess. But the Minerva class extended no refuge for the timid, nor privilege for the garrulous. Within seconds, every student had to provide an answer, and Bonabeau displayed our choices so that we could be called upon to defend them.

Bonabeau led the class like a benevolent dictator, subjecting us to pop quizzes, cold calls, and pedagogical tactics that during an in-the-flesh seminar would have taken precious minutes of class time to arrange. He split us into groups to defend opposite propositions—that the cod had disappeared because of overfishing, or that other factors were to blame. No one needed to shuffle seats; Bonabeau just pushed a button, and the students in the other group vanished from my screen, leaving my three fellow debaters and me to plan, using a shared bulletin board on which we could record our ideas. Bonabeau bounced between the two groups to offer advice as we worked. After a representative from each group gave a brief presentation, Bonabeau ended by showing a short video about the evils of overfishing. (“Propaganda,” he snorted, adding that we’d talk about logical fallacies in the next session.) The computer screen blinked off after 45 minutes of class.

The system had bugs—it crashed once, and some of the video lagged—but overall it worked well, and felt decidedly unlike a normal classroom. For one thing, it was exhausting: a continuous period of forced engagement, with no relief in the form of time when my attention could flag or I could doodle in a notebook undetected. Instead, my focus was directed relentlessly by the platform, and because it looked like my professor and fellow edu-nauts were staring at me, I was reluctant to ever let my gaze stray from the screen. Even in moments when I wanted to think about aspects of the material that weren’t currently under discussion—to me these seemed like moments of creative space, but perhaps they were just daydreams—I felt my attention snapped back to the narrow issue at hand, because I had to answer a quiz question or articulate a position. I was forced, in effect, to learn. If this was the education of the future, it seemed vaguely fascistic. Good, but fascistic. (...)

Nelson’s long-term goal for Minerva is to radically remake one of the most sclerotic sectors of the U.S. economy, one so shielded from the need for improvement that its biggest innovation in the past 30 years has been to double its costs and hire more administrators at higher salaries.

The paradox of undergraduate education in the United States is that it is the envy of the world, but also tremendously beleaguered. In that way it resembles the U.S. health-care sector. Both carry price tags that shock the conscience of citizens of other developed countries. They’re both tied up inextricably with government, through student loans and federal research funding or through Medicare. But if you can afford the Mayo Clinic, the United States is the best place in the world to get sick. And if you get a scholarship to Stanford, you should take it, and turn down offers from even the best universities in Europe, Australia, or Japan. (Most likely, though, you won’t get that scholarship. The average U.S. college graduate in 2014 carried $33,000 of debt.)

Financial dysfunction is only the most obvious way in which higher education is troubled. In the past half millennium, the technology of learning has hardly budged. The easiest way to picture what a university looked like 500 years ago is to go to any large university today, walk into a lecture hall, and imagine the professor speaking Latin and wearing a monk’s cowl. The most common class format is still a professor standing in front of a group of students and talking. And even though we’ve subjected students to lectures for hundreds of years, we have no evidence that they are a good way to teach. (One educational psychologist, Ludy Benjamin, likens lectures to Velveeta cheese—something lots of people consume but no one considers either delicious or nourishing.) (...)

The Minerva boast is that it will strip the university experience down to the aspects that are shown to contribute directly to student learning. Lectures, gone. Tenure, gone. Gothic architecture, football, ivy crawling up the walls—gone, gone, gone. What’s left will be leaner and cheaper. (Minerva has already attracted $25 million in capital from investors who think it can undercut the incumbents.) And Minerva officials claim that their methods will be tested against scientifically determined best practices, unlike the methods used at other universities and assumed to be sound just because the schools themselves are old and expensive. Yet because classes have only just begun, we have little clue as to whether the process of stripping down the university removes something essential to what has made America’s best colleges the greatest in the world.

Minerva will, after all, look very little like a university—and not merely because it won’t be accessorized in useless and expensive ways. The teaching methods may well be optimized, but universities, as currently constituted, are only partly about classroom time. Can a school that has no faculty offices, research labs, community spaces for students, or professors paid to do scholarly work still be called a university?

If Minerva fails, it will lay off its staff and sell its office furniture and never be heard from again. If it succeeds, it could inspire a legion of entrepreneurs, and a whole category of legacy institutions might have to liquidate. One imagines tumbleweeds rolling through abandoned quads and wrecking balls smashing through the windows of classrooms left empty by students who have plugged into new online platforms.

by Graeme Wood, Atlantic |  Read more:
Image: Adam Vorhees

Who Gets to Wear Shredded Jeans?

Recently I scanned the statement of authenticity on a brand-new pair of good old bluejeans. Printed on the inside of the left pocket, beneath an equine insignia, an 1873 patent date and a boast of its status as “an American tradition, symbolizing the vitality of the West,” Levi Strauss & Company reissued its ancient invitation to inspect the dry goods: “We shall thank you to carefully examine the sewing, finish and fit.” The fit was slim, the sewing sound, the finish glamorously traumatized, as if intending homage to clothes Steve McQueen might have worn home from a bike crash.

A ragged extravagance of fraying squiggled from each knee, where an irregular network of holes was patched from behind by a white-cotton rectangle stretchier than sterile gauze. Knotted to a belt loop was a paper tag headed “Destruction,” explaining that these Levi’s, shredded to resemble “the piece you just can’t part with,” merited gentle treatment: “Be sure to take extra care when wearing and washing.” The process of proving the denim tough had endowed it with the value of lace.

These jeans sent a dual message — of armor, of swaddling — in the accepted doublespeak of distressed denim. Pre-washed bluejeans are now sold already on their last legs: ripped, blasted, trashed, wrecked, abused, destroyed, sabotaged, devastated and, in what may be a borrowing of aerospace jargon for drones obliterated by remote control, destructed. Below this disaster-headline language, the fine print babbles smoothly about the soft comfort of deep familiarity, as the textile historian Beverly Gordon observed in a paper titled “American Denim.” These are clothes that suit the Friday-evening needs of Forever 21-year-olds buttressing their unformed selves with ceremonial battle scars, and they also meet the Saturday-morning wants of grown-ups who, arrayed as if to hint at having been out all night, enliven the running of errands by wearing trousers that look and feel like an opiated hangover.

The mass clique of distressed denim exists in polar opposition to another school of bluejean enthusiasm: the dye-stained cult of raw denim. The denim purists — looking professional in unsullied indigo fresh off the shuttle loom, in their natural habitat of bare brick walls and old gnarled wood and other textures invested with magical thinking — are likely to meet the approval of strict good taste. As opposed to people who buy their jeans prefaded and abraded, with a thumb-wide key punch in the watch pocket and the sham phantom of a wallet’s edge in back. But sometimes good taste goes on holiday, to a music festival, for example, turned out in acid-streaked, bleach-stained, chaotically nasty cutoffs. This is the order of things. One point of beat-up bluejeans is to bother good taste, which is a muscular aesthetic stance, a canny market footing and an ambiguous moral position.

Some distressed denim is beauty-marked with subtle scuffs amounting to off-duty signs. Some is lavishly slashed into canvases for abstract craft work, with a fleeciness of bare threads asymmetrically outlined by stubby blue tufts, a kind of plumage for people treating a humble fiber as a vehicle for expressing splendor. There are bluejeans serially slit up the front, space striped as if by the shadows of window blinds in a film noir, and sometimes they are sold by shop assistants wearing jeans sliced to bare hamstrings, as if everyone’s bored of the old ways of constraining the sight and shape of the body. There is a place in Paris that gathers old bluejeans as raw material for reassembled jeans that will cost $1,450 a pair. Which would be a bargain if you believed the piece worthy of framing as a collage deconstructing aperture and entropy and the tensions of a labor-class fabric reworked as universal playwear. (...)

“Everything that was directly lived has receded into a representation,” the Situationist theorist Guy Debord wrote in “Society of the Spectacle.” He was describing a phenomenon now exemplified by new denim marketed as having been “aged to mimic look and feel of 11-year-old denim.” The product lets its buyers slip into the approximation of a lived-in skin and, by proximity, enhance their own personal histories.

The insolence of indecent denim has evolved into a prefab mannerism, a marker of “punk chic” or “grunge cool.” The holes can still reify a generation gap, I think, having heard a 35-year-old banker say that she cannot put on such jeans without imagining her parents’ disapproval: “You should have worn those dungarees all day long until you wore them out yourself.” But that purist’s objection misses the point. The patent insincerity of distressed denim is integral to its appeal. What to make, glancing around the waiting room, of the precision-shredded knees of a pair of plainly expensive maternity jeans promoted for their “rock ’n’ roll appeal”? No one supposes that a woman wearing an elasticized waistband to accommodate the fullness of her third trimester wiped out on her skateboard. The lie is not a lie but a statement of participation in a widespread fantasy. Contentedly pretending to be a dangerous bohemian, she is simply exercising the right to be her own Joey Ramone. We put on jeans with ruined threading in a self-adoring performance of annihilation.

by Troy Patterson, NY Times |  Read more:
Image: Mauricio Alejo