Sunday, July 31, 2016

Thursday, July 28, 2016


photo: markk, four boys

Wolverine Creek


[ed. I think I'll post a few old photos for a change. Here's one challenging project: trying to keep bears and idiot fisherpeople separated at Wolverine Creek, AK.]

photos: markk

Can the World Deal With a New Bank Crisis?

As Europe braces for the release of its bank stress tests on Friday, the world could be on the verge of another banking crisis. The signs are obvious to all. The World Bank estimates the ratio of non-performing loans to total gross loans in 2015 reached 4.3 percent. Before the 2009 global financial crisis, it stood at 4.2 percent.

If anything, the problem is starker now than then: There are more than $3 trillion in stressed loan assets worldwide, compared to the roughly $1 trillion of U.S. subprime loans that triggered the 2009 crisis. European banks are saddled with $1.3 trillion in non-performing loans, nearly $400 billion of them in Italy. The IMF estimates that risky loans in China also total $1.3 trillion, although private forecasts are higher. India’s stressed loans top $150 billion.

Once again, banks in the U.S., Canada, U.K., several European countries, Asia, Australia and New Zealand are heavily exposed to property markets, which are overvalued by historical measures. In addition, banks have significant exposure to the troubled resource sector: Lending to the energy sector alone totals around $3 trillion globally. Borrowers are struggling to service that debt in an environment of falling commodity prices, weak growth, overcapacity, rising borrowing costs and (in some cases) a weaker currency.

To make matters worse, the world’s limp recovery since 2009 is intensifying loan stresses. In advanced economies, low growth and disinflation or deflation are making it harder for companies to pay off what they owe. Many European firms are suffering from a lack of global competitiveness, exacerbated by the effects of the single currency.

Government efforts to revive growth -- largely through a targeted expansion of bank lending -- are having dangerous side effects. With safe assets offering low returns, banks have financed less creditworthy borrowers, especially in the shale oil sector and emerging markets. Abundant liquidity has inflated asset prices and banks have lent against this overvalued collateral. Low rates have allowed weak borrowers to survive longer than they should, which delays the necessary pain of writing off bad loans.

In developing economies, strong capital inflows, seeking higher returns or fleeing depreciating currencies, have contributed to a risky buildup in leverage. So have government policies encouraging debt-funded investment or consumption to stimulate aggregate demand.

What’s most worrying, though, is the fact that the traditional solutions to banking crises no longer seem available or effective.

To recover, banks need strong earnings, capital infusions, a process to dispose of bad loans and industry reforms. Yet today, banks’ ability to earn their way out of their problems and write off losses is limited.

by Satyajit Das, Bloomberg | Read more:
Image: none

Wednesday, July 27, 2016

Yo-Yo Ma, Stuart Duncan, Edgar Meyer, Chris Thile, Aoife O'Donovan


Here and Heaven

With a hammer and nails and a fear of failure we are building a shed
Between here and heaven, between the wait and the wedding
For as long as we both shall be dead to the world beyond the boys and the girls trying to keep us calm
We can practice our lines 'til we're deaf and blind to ourselves, to each other
Where it's fall not winter, spring not summer, cool not cold
And it's warm not hot, have we all forgotten that we're getting old

With an arrow and bow and some seeds left to sow we are staking our claim
On ground so fertile we forget who we've hurt along the way
And reach out for a strange hand to hold, someone strong but not bold enough to tear down the wall
'Cause we ain't lost enough to find the stars, ain't crossed, why align them, why fall hard not soft into
Fall not winter, spring not summer, cool not cold
Where it's warm not hot, have we all forgotten that we're getting old

Lyrics via:

Why It's Nearly Impossible To Stop This Amazon and eBay Scheme

Fred Ruckel was an advertising guy. At 25, he started his own agency, and over 12 years he developed commercials for the Super Bowl, Lay's and Pepsi. But he never considered himself a Mad Man. “I’ve always been an inventor,” he says. A tinkerer. An explorer. He was a guy with ideas but no time to pursue them. So in 2011, his wife, Natasha, gave him the gift of a lifetime: Quit your job, she said. She’d cover the bills while he built a new career. Ruckel immediately went to his business partner and said, “I’m out. I’m going to go change my life.”

He opened a production studio. He sank $30,000 into an app. He experimented. And on Valentine’s Day 2015, as his wife was playing the piano at home, he watched their cat, Yoda, discover a new toy: a rug under the couple's drum set. It had become rippled, and Yoda swatted at the resulting funny shapes. Ruckel knew: This was it.

He called it the Ripple Rug. It’s stupidly simple, as all great cat toys are: There’s a small rug, you see, and on top of that is another rug. The top rug, attached by Velcro, is full of holes. It’s designed to be a crumpled mess, with bulges and tunnels for cats to explore. Soon hundreds of yards of carpet and Ripple Rug designs cluttered Ruckel’s home. So in June 2015, the couple made another concession to invention: They left Manhattan, where they’d lived for 22 years, and moved upstate to a house they built as a future retirement home. There, Ruckel would truly have space to invent.

Ruckel hired a factory in Georgia and developed a way to make every Ripple Rug out of exactly 24 recycled bottles. The product debuted in September and went live on Amazon in December. Sales quickly spiked to $2,000 a day, and he became obsessed with the numbers. “Amazon is, without a doubt, Kickstarter on steroids,” he says. “It’s adrenaline. It’s like crack -- ahhhhhh, all day long.” There’s another word for this drug: validation. He was finally a successful inventor.

Then his brother-in-law called.

“Did you see that people are selling your Ripple Rug on eBay?” he said.

Ruckel looked. It was true. Lots of people were selling it -- and not used, either. They were selling new Ripple Rugs. “I’m like, ‘Oh, man, what is that? They copied my stuff!’” Ruckel says. “I’m thinking, Where are they getting it? Is one of the guys in my factory selling it on the side? So I called up the factory, and of course I looked like a jackass.”

The factory was innocent. But as Ruckel kept digging, he discovered the true cause. It’s an industry of people who transform themselves into uninvited middlemen -- either as a form of reseller or, depending on your perspective, a parasite. They steal brands’ marketing materials and make money off their products, creating all sorts of consequences for small retailers like Ruckel. And yet, these people also represent a difficult new reality for entrepreneurs: In the increasingly complex world of e-commerce, everything about a brand -- from its reputation to its pricing -- can be up for grabs.

To understand what’s happening, it’s helpful to visit a different site -- the place largely credited with launching this middleman army: It’s DSDomination.com. The site, which says it has had more than 140,000 users, launched in 2013 and spawned a universe of copycats. Over a cheery ukulele, a man in a video explains its offering: “DS Domination is the first and only platform of its kind that allows the average person to harness the power of multibillion-dollar companies like Amazon, eBay and Walmart at the push of a button,” he says, like an infomercial pitchman. “Using our unique platform, any user can create an income within minutes, simply by copy-and-pasting product information from one company to another.”

The internet is, of course, full of promises like this. Work from home! Get rich with no effort! If you have a few years of your life to waste, you can go down the mother of all rabbit holes trying to understand them. Suffice it to say: Most rely on something called MLM, or “multilevel marketing” -- pyramid schemes, basically. DS Domination does offer an MLM element, but its main service is something more distinctive: It’s called “Amazon-to-eBay arbitrage.” It sells software and strategies to make this possible.

A quick language lesson. Arbitrage means to take advantage of price differences between markets: Buy low in one place and sell high in another. DS stands for “drop shipping,” which means to sell a product and then have it shipped directly from the wholesaler or manufacturer. In this world that DS Domination sparked, the terms are used somewhat interchangeably. But both play a role in the cleverly complex transaction that enables someone to sell Ruckel’s Ripple Rug on eBay -- and, occasionally, make more money on it than Ruckel himself does.

To see how this works in real time, I go to eBay and buy a Ripple Rug. There are five listings for the product on this day, and I select one from a seller called AFarAwayGalaxy. The price is $49.51; on Amazon, Ruckel sells it for $39.99. So, how’d this listing get here? Almost certainly, the seller is using some kind of software -- made by DS Domination or a competitor -- that scans Amazon for its best-selling products. (They can also do this on large sites like Walmart’s, though most seem to focus on Amazon.) The software found the Ripple Rug, which, on the day in June I buy it, is ranked number 25 in cat toys. Then it copied everything in the Amazon listing and pasted it into an eBay listing -- amusingly, right down to the part of the product description that says, “Thank you for viewing our Amazon version of the Ripple Rug.”

The price is usually set between 5 and 15 percent over the Amazon price. When I make the purchase, the person behind AFarAwayGalaxy simply goes to Amazon and buys a Ripple Rug -- but instead of buying it for themselves, they designate it as a gift and have it shipped to me. Because I paid $9.52 above the Amazon price, that’s profit, which AFarAwayGalaxy can keep (minus PayPal and eBay fees). This seller has more than 11,000 items listed on eBay. That can quickly add up to real money.

After I place my order, I get an email from AFarAwayGalaxy: “This is to let you know we got it, processed it and have sent it on to the warehouse for shipping,” the note says. Of course, that leaves out a few details. The “warehouse” is actually Amazon’s fulfillment center, which is where Ruckel stocks his Ripple Rugs.

“That’s genius!” says David Bell, a professor at the University of Pennsylvania’s Wharton School, who studies e-commerce. He’d never heard of this scheme but laughed loudly when I explained it.

As it turns out, retail experts didn’t see this coming. In 1997, at the dawn of e-commerce, a New York University professor named Yannis Bakos wrote a well-regarded paper that predicted the internet would change pricing forever. Imagine the old days: You went to a store and had no idea what other stores charged for the same products -- which meant the store you were in could jack up the price. But once everyone could comparison shop online, Bakos reasoned, every site would likely have to offer the same price. And yet, it turns out, many shoppers don’t do the research. If they like eBay, they buy on eBay. Simple as that. Bell’s conclusion: “I think if you’re a small guy, you just have to accept the fact that the platform is the place where the product is going to be sold.” But which platform, and at which price? That’s hard to control.

Ruckel wasn’t feeling so laissez-faire. The more he understood what was happening, the angrier he got. At first, he was protective of his product. “Brand consistency is primo for me,” he says, and the eBay listings were often janky. But then he began seeing an uptick in returns and pieced together what was happening: Someone orders the Ripple Rug on eBay, but the product shows up in an Amazon box. The customer is confused, goes to Amazon, sees how much cheaper the product is there and feels ripped off. “Who are they immediately mad at?” Ruckel says. “The people at Ripple Rug, not some person from nowhere!”

The customer then returns the product, setting off a crazy series of events. Let’s say I want to return the Ripple Rug I just bought. I’d push the “return” button on eBay. AFarAwayGalaxy would then go to Amazon, acquire a return label (which is free for Amazon Prime customers) and send it to me. But because eBay sellers can set their own return policies, AFarAwayGalaxy reserves the right to charge customers a 20 percent “restocking fee” -- which in this case would come out to about $9.90 -- as well as a shipping fee. Meanwhile, Amazon would charge Ruckel a return fee and ship him the product so he could inspect it. Almost always, Ruckel says, returned products have been opened and are covered in cat hair -- making them impossible to sell again.

So, in total: I could have lost more than $10. Ruckel would lose $19.51 (that’s the $2.05 per unit it costs him to stock at Amazon’s warehouse, $12.06 in nonrefundable fees for Amazon to process a sale and $5.40 in return fees). And AFarAwayGalaxy, the only person in this transaction to never spend a dime, just made enough money for lunch.

The fees add up. Ripple Rugs have been returned to Amazon 219 times -- that’s nearly $8,000 in losses since December -- and while Ruckel can’t prove they were all arbitrage-related, he says that sales through his own website have yielded only one return. That’s why this whole thing makes him furious. He has appealed to eBay and Amazon, but arbitraging doesn’t appear to violate either platform’s rules. Amazon declined to comment for this story. An eBay spokesman told me, “We don’t specify where sellers obtain the products they sell.” Hitesh Juneja, DS Domination’s cofounder, says he has “a very good relationship with eBay.”

And so, Ruckel has tried taking his campaign to the arbitragers himself. He’s gotten into email arguments with them. He finds other anti-arbitrage sellers and swaps strategies. One of those people, Eric Wildermuth, who sells a line of children’s hats called Snuggleheads, came up with a particularly sneaky punishment: He bought his own hat from an eBay arbitrager for $27 -- and then, before the arbitrager could go to Amazon and make the purchase, Wildermuth changed his Amazon listing price to $199. Result: The arbitrager could either lose $172 on the sale or cancel the purchase, which would damage the arbitrager’s eBay ranking. Wildermuth repeated this about 10 times. “I got these frantic calls [from the arbitrager]. He said, ‘Please don’t do this,’” says Wildermuth. “He knew what I was doing. And I let out a string of expletives.”

This summer, Ruckel tried a new approach: He put his own product on eBay and titled it “All other eBay sellers are fake.” A few weeks later, he stumbled upon an eBay listing with a familiar title. “All other eBay sellers are fake,” it said. It wasn’t his, of course.

Someone had copied that, too. (...)

At its heart, none of this is new. In the Mycenaean period, no doubt, some clever ancient Greeks were arbitraging wine. Ticket scalpers are arbitragers. People have accused McDonald’s of arbitraging meat, selling the McRib only when pork prices drop. The difference today, however, is the breadth of commerce happening on just a handful of platforms. In exchange for the massive, unprecedented reach companies like Amazon and eBay provide, a product like Ripple Rug must relinquish some measure of control and identity. It is not a box on a shelf, carefully positioned and branded. It is a clickable subject line, a few photos and some text. And in this environment, it wouldn’t even occur to most customers to wonder: Who’s actually selling this?

by Jason Feifer, Entrepreneur |  Read more:
Image: uncredited

It’s Okay To Suck

For fidelity to mediocrity, no one beats Florence Foster Jenkins. The subject of a Stephen Frears-directed biopic starring Meryl Streep and Hugh Grant coming out in North America in August, Jenkins was a truly bad singer. But every year she held a recital at New York’s Ritz-Carlton Hotel and then, at age 76, she made it big: she played Carnegie Hall. “She was exceedingly happy in her work,” wrote Robert Bagar in a New York World-Telegram review of the sold-out show in 1944. “It is a pity so few artists are. And her happiness was communicated as if by magic to her listeners … who were stimulated to the point of audible cheering, even joyous laughter and ecstasy by the inimitable singing.”

What Jenkins lacked in pitch, rhythm and vocal range, she made up for in attitude: “People may say I can’t sing,” she said, “but no one can ever say I didn’t sing.”

Nadine Cooper can now say the same thing. And most bad singers can probably relate to her story more than to Jenkins’s triumphant career. When Cooper was 11 or 12, a teacher told her to stop singing because she was spoiling it for everyone else. So she stopped. As an adult, she tried to avoid worrying about her inability to sing, but sometimes she’d think, “I wish I could do that.” Occasionally, she’d go as far as “I should be able to do that,” which eventually evolved into “There should be somewhere for me to do that.”

But there wasn’t. No-audition choirs abound these days, and they are ideal for many singers, even the untrained and the unconfident. But for truly bad singers, they are too intimidating. Cooper didn’t want to be in a situation where she’d have to apologize every second note: “I’m pretty sure I’d still spoil it for everyone else.”

Her lack of confidence is understandable. What remains unfathomable is that people actually tell children not to sing. And I’ve heard countless tales of people told to just “mouth the words.” Many teachers and musical directors have no idea how to help bad singers, but need bodies to make their choirs seem successful so they make young people go through the humiliating charade of faking it. Maybe the deception is easier for parents to accept than not having their kids be part of the recital at all, but it is the most effective way to give someone a life-long fear of singing in public.

For most middle-class people, growing up is just another term for a narrowing of interests. As a kid, I was fortunate to try soccer, skiing, sailing, skeet shooting, football, baseball, basketball, cricket, canoeing, cycling, golf, tennis, horseback riding, hockey and many other activities. Some (basketball) I gave a hard pass to right away because I was short and slow; some (skiing) I loved but stopped doing for financial reasons; some (golf) just became too frustrating. Today, I play hockey, cycle and go on an annual canoe trip. (Much to my surprise, though, I’ve also taken up yoga, something I long said I would never do.)

Cutbacks at schools mean reductions in sports and arts programs while high costs—even sports such as hockey are now so expensive that many families can’t afford them—and limited parental time can mean the end to some pursuits. We also launch our own self-selection process. Most kids don’t need to be the best player on the team to enjoy soccer and parents encourage their offspring to keep playing because the benefits, including exercise and working with teammates, are considerable. But we soon drop the activities we don’t enjoy and go harder on the ones we do—and the correlation between what we enjoy and what we’re good at is strong.

Adults routinely ask kids who’ve just played a game, “Did you score?” So both adults and other kids send us the message that anything we’re good at is more fun and more worthwhile than anything we’re bad at. Being the last one picked for a team can sour us on a sport, but the streamlining of interests can also be brutal in non-athletic pursuits. Eventually our parents stop putting our masterpieces on the fridge.

One way or another, we learn early on that it’s not okay to be bad. And we often carry this “knowledge” into old age. So we’re always impressed when adults take up activities they’d never tried before or wished they’d never given up. If the theories are right about the benefits of aging brains learning new stuff, then we may be staving off dementia later in life (and if the theories are wrong, at least we learned some cool new stuff). Sometimes we turn out to have a talent for it; sometimes we don’t and that’s okay because it’s actually good for us to do things we’re bad at.

Wanting to sing, Cooper asked Bernie Bracha, a choir director in Nottingham, England, about starting a Tone Deaf Choir. Although Cooper considered herself tone deaf, Bracha, an educational consultant and former music teacher, said that was unlikely and balked at the idea of such a choir. While bad singing is common, congenital amusia, the scientific term for tone deafness, is not—perhaps as rare as 1.5 percent of the population. Cooper, who has a chemical engineering degree from Cambridge and worked as a management consultant and planning manager before becoming an entrepreneur, did an online test: she wasn’t tone deaf. So she went back to Bracha and suggested a Tuneless Choir instead. This time, the choir leader agreed to hold one workshop to see if there was enough interest.

Sixty people showed up. The media, including the BBC, soon followed. And now at least 120 people—they range in age (early twenties to late seventies) as well as socio-economic background; about one fifth are men—show up at each session, held every second Thursday. The singers arrive early to socialize and then at 8, Bracha leads them through some warm-up exercises followed by about five songs, a break for tea and nibbles and then another half hour of singing. They clap after every song—and laugh when one falls apart.

Cooper sees a huge difference between joining a regular no-audition choir, in which the majority can sing, and being part of the Tuneless Choir. “There’s no pressure,” she insists, “just pleasure.” And before long, she was receiving franchising enquiries from other English communities. In May, a choir in Sutton Coldfield near Birmingham launched with 86 participants. Cooper now has three franchisees and expects to have ten by the end of the year, making the Tuneless Choir a full-time job.

While she can list the benefits of singing in a choir—including the release of endorphins, the social activity, the easing of anxiety and depression and the strengthening of the immune system—Cooper’s motivation is simpler. “I just enjoy it. It’s fun because we have a laugh and don’t take things seriously. It’s fun because it’s mischievous,” she told me during a Skype conversation. She sees it as a form of rebellion and a chance to stick it to everyone who told her not to sing. “I’m not going to let people stop me from doing this.”

The story of the Tuneless Choir set me to thinking about why we do only those things we’re good at and, more important, whether that’s a good thing. I naively thought the answer would be straightforward and emailed my friend Alex Russell this two-part question: “Why is it good for us to do things we suck at? And why is it bad for us to only do things we are good at?”

Russell is a clinical psychologist (and, full disclosure, I helped him write a book on parenting a few years ago). To my surprise, he replied that we’d need to discuss it over a beer. When we sat down in a pub around the corner from his office in Toronto, he told me he’d been thinking about my question all day and had even discussed it with colleagues in his practice because it is so central to their work. “It taps into everything we’re doing,” he said. “It is The Question, maybe.”

by Tim Falconer, Hazlitt |  Read more:
Image: uncredited and via:

Tuesday, July 26, 2016

Death by Prefix? The Paradoxical Life of Modernist Studies

[ed. I think I have a headache. Can someone tell me if I do? (or don't?)]

Like many scholars of modernism, I’m often asked two questions: What is modernism? And why is modernist studies, it seems, all the rage right now? I don’t have a good, succinct answer to either question — and I’ve no doubt frustrated plenty of friends because of that — but the reasons why I don’t are pretty telling.

There’s a familiar response to the question of what modernism is — dense and difficult language, myth and allusion, formal experimentation, and so on — and I regularly use it when introducing the term to undergraduates. But this answer feels rather disingenuous: that sense of the term cohered and reigned only for a small, recent window of time in a history of “modernism” that dates back more than a millennium. Which is to say, unlike fields marked by the relatively neater boundaries of centuries, nations, or languages, modernist studies, for most of its roughly century-long academic history, has failed to form a consensus on the nature of its titular object. Every field loves to ponder its own shifting borders, but how would I feel if I asked a colleague in postcolonial literature or in African-American studies to sketch her field for me, and she responded, “Well, it’s complicated, so let me just fall back on what people more or less agreed on a half-century ago”?

And yet, this definitional uncertainty helps explain the thriving and transforming contemporary field that is typically called the New Modernist Studies, which has been documented, analyzed, and disparaged in a number of places by this point (for example, here, here, and here). What remains unresolved — at once exciting and haunting — is a central paradox in the field. Scan the program of any recent conference of the Modernist Studies Association, the titles of articles published in Modernism/modernity, or the monographs published in the field (at least a half-dozen presses have initiated series in modernist studies in the past decade, with more coming), and one will similarly find “modernism” endlessly modified by prefixes. From Transpacific to Mediterranean, Pragmatic to Revolting, Digital to Slapstick, hardly a region, concept, technology, category of being, or historical movement has been excluded as a possible type of modernism. No one could claim to know even half of the field at this point, much less a plausible totality. Donald Rumsfeld’s Orwellian phrase “known unknowns” echoes mercilessly as I try just to keep up with the publications in my own subfields (comparative and global) of modernist studies.

Why these expansions by way of proliferating prefixes? In part, it’s because of the robust dissatisfaction with that old, familiar notion of modernism. And in a way, the current climate has returned us to the historical moment many of us study — a moment when “modernism” meant everything and, potentially, nothing. In the first half of the 20th century, “modernism” pointed unevenly to a new mode of writing, to new appliances and technologies, and to the rebellious priests excommunicated by Pius X, who in 1910 made clergy swear an “Oath Against Modernism.” We have finally dismissed the myth that the figures we most often call “modernist” did not use that term. Rather, they didn’t use it consistently, or they found it already overused or insufficiently descriptive.

The original readers of Laura Riding and Robert Graves’s A Survey of Modernist Poetry (1927), one of the first major studies to consecrate the term “modernism” as a literary concept, no doubt were as bewildered as we are by the ubiquity of “modernism” across many spheres of culture. And Riding and Graves themselves were ambivalent about the term’s strictures and prospects. Indeed, as early as 1924, the Fugitive poet and future dean of New Criticism John Crowe Ransom lamented that no working poet could “escape” from the rigid doctrines of a modernism associated with Ezra Pound, T. S. Eliot, and F. S. Flint — that poets must write to their standards now or risk remaining unknown. And one of the primary critical figures in modernist history, Edmund Wilson, didn’t even use the term in his milestone study, Axel’s Castle (1931); he called it “symbolism” instead.

How did a term that meant so much — or, again, so little — come to single out a literary aesthetic found in selective (not all) works by figures like Eliot, Pound, James Joyce, and Virginia Woolf? No single critic or book could claim responsibility. Instead, it was a series of successful anthologies and widely used syllabi that gained traction and exposure in the expanding college classrooms of the 1960s and 1970s that mostly delimited what is now the “old modernism.” In the 1980s, postmodernist critics pounced on this usefully rigid sense of “modernism” to name and describe the foundation of an elitist, often racist right-wing politics. Colleagues who have worked in this field since this time have told me chilling stories of the days when modernism was blacklisted and when no publisher, no search committee, no journal editor wanted to hear the term (unless perhaps prefaced by an obscenity).

The New Modernist Studies, which dates roughly from the mid-1990s, was born of the vigorous responses to these attacks. Modernism reinvented itself and expanded to include feminist, lowbrow, popular, ethnic, and other forms that had been derided at one point. But that did not obviate the fact that there was a good deal of truth in the postmodernist attacks. The political histories of figures like Eliot, Pound, Wyndham Lewis, and D. H. Lawrence are littered with everything from pro-Mussolini radio broadcasts to ugly anti-Semitic rants to violent misogynistic fantasies. These old modernisms, however, were variously buried, repackaged, or dismissed as aberrant, leaving modernist scholars free to transform their field rapidly and immensely. The new modernisms, unhinged from defined temporal, geographical, and formal restrictions, started gobbling up new texts and new sites that other fields (Victorian literature, aestheticism, postmodernism) had once claimed. Scholars in adjacent fields pushed back, and once again, modernism responded: Gertrude Stein was both a modernist and a postmodernist. Problem solved.

And thus the paradox: The old “modernism” is still pragmatically and strategically valuable for the New Modernist Studies. To characterize modernism in the old, familiar way, even if convenient, is to buy into a host of assumptions that are now fully discredited: that modernity originated in a certain moment in European history, or that Charles Baudelaire founded a movement that had no other possible roots, or that formal innovation is the genuine marker of the “new” in literary history (even Eliot himself doubted that last one), and so on. Instead, there are hundreds of modernisms, and as long as the particular invocation of the term points to some time period, authors, site, or aesthetics once associated with the term “modernism,” no one doubts its validity. A colleague in the field recently remarked to me that “modernism” now has enough cachet and critical purchase that a formulaic, fill-in-the-blank title (_____ Modernism) is already more than half a step to a book contract.

Which is to say that while there is no consensus on what “modernism” means, the term carries significant conceptual and professional weight.

by Gayle Rogers, LARB |  Read more:
Image: Ezra Pound

David Hockney, Dog Days, 1996
via:

Appetite for Destruction

[ed. Sorry, I have to say I generally find nominating conventions pretty boring, despite their attempts at soaring rhetoric. Too scripted... like watching the Academy Awards. The GOP convention was certainly different in that respect (and not in a good way) - a slow motion, hallucinatory train wreck, made worse (and more frightening) the longer it went on, capped off finally by The Donald himself in full apocalyptic mode. It makes the Democratic convention seem almost too polite by comparison: too careful, too introspective, almost too logical (maybe we should call it the Spock Party). Both have blood on their hands if we're talking about corporate manipulation and influence (but at least the Dems try to balance that out with support, or at least lip service, for the beleaguered lower and middle classes and marginalized racial and social groups). Btw... I didn't see a single red, white and blue piece of clothing all night, sequined or otherwise. Bunch of communists.]

Hell, yes, it was crazy. You rubbed your eyes at the sight of it, as in, "Did that really just happen?"

It wasn't what we expected. We thought Donald Trump's version of the Republican National Convention would be a brilliantly bawdy exercise in Nazistic excess.

We expected thousand-foot light columns, a 400-piece horn section where the delegates usually sit (they would be in cages out back with guns to their heads). Onstage, a chorus line of pageant girls in gold bikinis would be twerking furiously to a techno version of "New York, New York" while an army of Broadway dancers spent all four days building a Big Beautiful Wall that read winning, the ceremonial last brick timed to the start of Donald's acceptance speech...

But nah. What happened instead was just sad and weird, very weird. The lineup for the 2016 Republican National Convention to nominate Trump felt like a fallback list of speakers for some ancient UHF telethon, on behalf of a cause like plantar-wart research. (...)

The Republican Party under Trump has become the laughingstock of the world, and it happened in front of an invading force of thousands of mocking reporters who made sure that not one single excruciating moment was left uncovered.

So, yes, it was weird, and pathetic, but it was also disturbing, and not just for the reasons you might think. Trump's implosion left the Republican Party in schism, but it also created an unprecedented chattering-class consensus and a dangerous political situation.

Everyone piled on the Republicans, with pundits from George Will to David Brooks to Dan Savage all on the same side now, and nobody anywhere seeming to worry about the obvious subtext to Trump's dumpster-fire convention: In a two-party state, when one collapses, doesn't that mean only one is left? And isn't that a bad thing?

Day two of the Republican National Convention in Cleveland, a little after 6:30 p.m. Roll has been called, states are announcing their support for the Donald, and the floor is filled with TV crews breathlessly looking for sexy backdrops for the evolving train wreck that is the Republican Party.

Virtually every major publication in America has run with some version of the "Man, has this convention been one giant face-plant, or what?" story, often citing the sanitized, zero-debate conventions of the past as a paradise now lost to the GOP.

"The miscues, mistakes [and] mishandled dissent," wrote Elizabeth Sullivan in Cleveland's Plain Dealer, "did not augur well for the sort of smoothly scripted, expertly choreographed nominating conventions our mainstream political parties prefer."

The odd thing is that once upon a time, conventions were a site of fierce debates, not only over the content of the party platform but even the choice of candidates themselves. And this was regarded as the healthy exercise of democracy.

It wasn't until the television era, when conventions became intolerably dull pro-forma infomercials stage-managed for the networks to consume as fake shows of unity, that we started to measure the success of conventions by their lack of activity, debate and new ideas.

A Wyoming delegate named Rick Shanor shakes his head as he leans against a wall, staying out of the way of the crews zooming to and fro. He insists dissent is always part of the process, and maybe it's just that nobody cared before.

"It's beautiful," he says. "You've got to have the discourse. You've got to have arguments about this and that. That's the way we work in the Republican Party. We yak and yak, but we coalesce."

The Republican Convention in Cleveland was supposed to be the site of revolts and unprecedented hijinks on the part of delegates. But on the floor of Chez LeBron, a.k.a. the Quicken Loans Arena, a.k.a. the "Q," it's the journalists who are acting like fanatics, buttonholing every delegate in sight for embarrassed quotes about things like Melania's plagiarism flap.

"The only safe place to stand is, like, in the middle row of your delegation," one delegate says, eyeing the media circling the edges of the floor like a school of sharks. "If you go out to get nachos or take a leak, they come after you."

A two-person crew, a camera and a coiffed on-air hack, blows through a portion of the Washington state delegation, a bunch of princely old gentlemen in zany foam tree-hats. The trees separate briefly, then return to formation.

Meanwhile, the TV crew has set up and immediately begun babbling still more about last night's story, Melania Trump's plagiarism, which Esquire's Charlie Pierce correctly quipped was a four-hour story now stretching toward multiple days.

Nearby, watching the reporters, one delegate from a Midwestern state turns to another.

"This is like a NAMBLA convention," he says with a sigh. "And we're the kiddies."

Outside, it's not much better.

The vast demilitarized zone set up between the Q and anywhere in the city that contains people is an inert, creepy place to visit. Towering metal barricades line streets cleansed of people, with the only movement being the wind blowing the occasional discarded napkin or pamphlet excerpt of The Conservative Heart (the president of the American Enterprise Institute's hilarious text about tough-love cures for poverty first littered the floor of the Q, then the grounds outside it).

Thus the area around the convention feels like some other infamous de-peopled landscapes, like Hitler's paintings, or downtown New Orleans after Katrina. You have to walk a long way, sometimes climbing barriers and zigzagging through the multiple absurd metal mazes of the DMZ, to even catch a glimpse of anyone lacking the credentials to get into this most exclusive of clubs: American democracy.

by Matt Taibbi, Rolling Stone |  Read more:
Image: Victor Juhasz

Monday, July 25, 2016

Neoliberalism Is a Political Project

Eleven years ago, David Harvey published A Brief History of Neoliberalism, now one of the most cited books on the subject. The years since have seen new economic and financial crises, but also new waves of resistance, which themselves often target “neoliberalism” in their critique of contemporary society.

Cornel West speaks of the Black Lives Matter movement as “an indictment of neoliberal power”; the late Hugo Chávez called neoliberalism a “path to hell”; and labor leaders are increasingly using the term to describe the larger environment in which workplace struggles occur. The mainstream press has also picked up the term, if only to argue that neoliberalism doesn’t actually exist.

But what, exactly, are we talking about when we talk about neoliberalism? Is it a useful target for socialists? And how has it changed since its genesis in the late twentieth century?

Bjarke Skærlund Risager, a PhD fellow at the Department of Philosophy and History of Ideas at Aarhus University, sat down with David Harvey to discuss the political nature of neoliberalism, how it has transformed modes of resistance, and why the Left still needs to be serious about ending capitalism.

Neoliberalism is a widely used term today. However, it is often unclear what people refer to when they use it. In its most systematic usage it might refer to a theory, a set of ideas, a political strategy, or a historical period. Could you begin by explaining how you understand neoliberalism?

I’ve always treated neoliberalism as a political project carried out by the corporate capitalist class as they felt intensely threatened both politically and economically towards the end of the 1960s into the 1970s. They desperately wanted to launch a political project that would curb the power of labor.

In many respects the project was a counterrevolutionary project. It would nip in the bud what, at that time, were revolutionary movements in much of the developing world — Mozambique, Angola, China etc. — but also a rising tide of communist influences in countries like Italy and France and, to a lesser degree, the threat of a revival of that in Spain.

Even in the United States, trade unions had produced a Democratic Congress that was quite radical in its intent. In the early 1970s they, along with other social movements, forced a slew of reforms and reformist initiatives which were anti-corporate: the Environmental Protection Agency, the Occupational Safety and Health Administration, consumer protections, and a whole set of things around empowering labor even more than it had been empowered before.

So in that situation there was, in effect, a global threat to the power of the corporate capitalist class and therefore the question was, “What to do?” The ruling class wasn’t omniscient but they recognized that there were a number of fronts on which they had to struggle: the ideological front, the political front, and above all they had to struggle to curb the power of labor by whatever means possible. Out of this there emerged a political project which I would call neoliberalism.

Can you talk a bit about the ideological and political fronts and the attacks on labor?

The ideological front amounted to following the advice of a guy named Lewis Powell. He wrote a memo saying that things had gone too far, that capital needed a collective project. The memo helped mobilize the Chamber of Commerce and the Business Roundtable.

Ideas were also important to the ideological front. The judgement at that time was that universities were impossible to organize because the student movement was too strong and the faculty too liberal-minded, so they set up all of these think tanks like the Manhattan Institute, the Heritage Foundation, the Olin Foundation. These think tanks brought in the ideas of Friedrich Hayek and Milton Friedman and supply-side economics.

The idea was to have these think tanks do serious research and some of them did — for instance, the National Bureau of Economic Research was a privately funded institution that did extremely good and thorough research. This research would then be published independently and it would influence the press and bit by bit it would surround and infiltrate the universities.

This process took a long time. I think now we’ve reached a point where you don’t need something like the Heritage Foundation anymore. Universities have pretty much been taken over by the neoliberal projects surrounding them.

With respect to labor, the challenge was to make domestic labor competitive with global labor. One way was to open up immigration. In the 1960s, for example, Germans were importing Turkish labor, the French Maghrebian labor, the British colonial labor. But this created a great deal of dissatisfaction and unrest.

Instead they chose the other way — to take capital to where the low-wage labor forces were. But for globalization to work you had to reduce tariffs and empower finance capital, because finance capital is the most mobile form of capital. So finance capital and things like floating currencies became critical to curbing labor.

At the same time, ideological projects to privatize and deregulate created unemployment. So, unemployment at home and offshoring taking the jobs abroad, and a third component: technological change, deindustrialization through automation and robotization. That was the strategy to squash labor.

It was an ideological assault but also an economic assault. To me this is what neoliberalism was about: it was that political project, and I think the bourgeoisie or the corporate capitalist class put it into motion bit by bit.

I don’t think they started out by reading Hayek or anything, I think they just intuitively said, “We gotta crush labor, how do we do it?” And they found that there was a legitimizing theory out there, which would support that.

by David Harvey, Jacobin |  Read more:
Image: Direitos Urbanos

Fitz and the Tantrums

Vetements and the Cult of the Fashion Victim

There is a common element in a lot of the street style photos over the past couple of seasons or so: Vetements. The Paris-based brand, which was launched in 2014 by brothers Demna and Guram Gvasalia and friends, is undeniably the “it” brand at the moment amongst industry insiders (or wannabe insiders, or “fashion victims,” as fashion blogger BryanBoy aptly dubbed them recently) with $1000 to spend on a sweatshirt. The sweatshirts that say Thrasher on them are Vetements. The yellow DHL-logoed t-shirts are Vetements. The poncho-like rain jackets that say Vetements on the back of them are obviously Vetements, as well.

Maybe more interesting than the rapid proliferation of Vetements fans (read: fashion victims) is what these $1000+ sweatshirts (which are made of 80% cotton, 20% polyester) and $330+ DHL t-shirts stand for. As we all know, fashion is in a weird, unstable place at the moment. The future of the fashion calendar is up in the air. Sales growth is low. Consumer fatigue is growing, and widespread economic woes certainly do not help. With this in mind, we have seen an array of attempts by brands to weather the storm. Speeding up the runway-to-retail timeline and making collections shoppable instantaneously is one way brands are coping. Playing on consumers’ desires to own “it” items – a longstanding principle in luxury fashion – is another. And this is where Vetements plays a role (in addition to falling in the former camp, as the brand recently announced that it is changing up its own runway show schedule).

Status Matters

Status matters to consumers. It is the reason fashion houses can charge $2000 for a basic nylon or laminated canvas bag that is covered in logos or $100+ for a licensed fragrance. Sure, Vetements does not sell bags or fragrances but it does fit neatly into this same notion, nonetheless, as its garments have risen to industry “it” items. And its founders appear to understand what drives luxury shopping to an extent: "In order to make people want something, you need to make scarcity. The real definition of luxury is something that is scarce. Every single piece in our collection is going to be a limited item... We don't restock and we don't reproduce -- if it's sold out, it's sold out,” the brothers recently noted.

As Vogue’s Sarah Mower wrote of the brand last October, “Demna Gvasalia himself learned the ropes at Maison Martin Margiela, before setting up Vetements and getting on with proving that there can be a different way of doing things.” Yet, if we consider the aforementioned notion, Vetements is not actually doing anything completely revolutionary. At its core, the brand is tapping into fashion fans’ desires to show that they are worthy, that they are in the know, that they have something exclusive, that they are cool. These individuals are essentially taking the coveted “it” bag of the season and wearing it as a sweatshirt. In this way, the Vetements’ method (at least when it comes to the brand’s most coveted items) and the resulting fan fury over those garments is not anything new.

The statement sweatshirt is not coming completely out of left field. In some circles, statement sweatshirts or t-shirts rival the “it” bag. Ask Supreme die-hards. Or look at the Givenchy fans, who were walking around in Rottweiler sweatshirts not too long ago. (Note: such garments really helped Givenchy, which was for many, many decades known primarily for couture, make its mark in the ready-to-wear market and boost its profits and visibility).

With this in mind, it is not surprising to see Vetements offering Thrasher sweatshirts for $1,000 and, more importantly, to see people actually buying them. And let’s be clear: we are not talking about die-hard skateboarders here - they probably already own the $35.95 version from Tactics Boardshop that predates the Vetements one. No, we are talking primarily about fashion girls and Kanye West clones, who are happy to spend $1k on a trendy sweatshirt that will send a message to their friends and to other fashion insiders/fans. (...)

I’ll spare you the bit about the sped-up fashion cycle because by now I am sure you have read at least 12 articles dissecting the rapidity of the current fashion model. There is one very interesting aspect to the recurring discussion about the sped-up nature of fashion, however: the argument that the speed of it all has left designers with less time to be as creative as they’d like, and that the result is fashion that lacks depth (Raf Simons, for instance, has sounded off on this exact point. "There is no more thinking time," he said this past fall). But it is not limited to designers; the cycle has created a larger feeling that fashion is simply more superficial. It's not personal. It's just business. And in many cases, it really does go both ways.

In theory, this should not be a problem, as the majority of consumers (and of course, there are exceptions!) are not necessarily interested in fashion in anything more than a purely superficial way. Most are not buying based on cut or construction or a deep love or appreciation for the brand – this is true even for high fashion shoppers. They are buying into a brand’s image at the present moment, buying based on what makes them look good - both in a physical way and, probably more significantly, in a status type of way. With this in mind, many fashion fans – from the Vetements-wearing fashion victims to our fast fashion-shopping friends – are not buying based on quality. They are buying to keep up appearances. They are buying to cement themselves into the zeitgeist. This is not a novel concept. It is just happening with more rapidity than before.

Buying – regardless of a garment’s price point – is one of the easiest ways to gain status. It does not require learning or accomplishing anything. It allows the buyer to be part of something cool without expending anything more than money. And that is convenient because in the current landscape of things, which can probably be aptly characterized by the fact that most people don’t want to read anything longer than a text, ease reigns supreme. So, why wouldn’t that carry over into the fashion industry? It is, after all, one of the most immediate reflections of the time in which we are living.

In short: Shoppers now – just like shoppers in the past – aim to maintain the appearance of status, and a Vetements sweatshirt will give them that for a few seasons.

The Upside of All of This

What does this say about the state of the fashion industry at the moment? A number of things.

by TFL, The Fashion Law |  Read more:
Image: Le21eme.com

Diana Rigg’s Enduring Appeal


[ed. Sometimes you fall down a rabbit hole doing this blog stuff. I don't get HBO so wasn't aware that Diana Rigg is still going strong on Game of Thrones. When I was young, Mrs. Peel in The Avengers was the first woman to make me realize that the other sex might have qualities that (for some strange reason) seemed irresistibly and uncomfortably interesting.]

To me, she was and always will be Emma Peel, the brainy, fiercely courageous, impossibly sexy, black-leather clad British secret agent she portrayed in the popular 1960s TV show, “The Avengers,” who captivated and haunted me from the time I first watched her as a little girl in Brooklyn and could never outgrow or forget or leave behind.

“Mrs. Peel, we’re needed.”

How I lived for those words each week. They were a staple of the show, that moment when her partner, the dapper John Steed, would turn to her for help in solving a chilling crime, or to confront a sinister set of bad guys. She would appear, ever so elegant in her Mod outfits—form-fitting hip-huggers, white boots, topped with a jaunty beret or Carnaby hat, the newsboy cap that was all the rage.

Most often, she wore her hair loose—it was thick and lustrous, a deep shade of auburn, and she was constantly brushing it off her face, one of her many gestures I sought to emulate.

Mine was no garden-variety girlhood crush. It was a full-fledged obsession.

by Lucette Lagnado, WSJ |  Read more:
Images via: here and here

Kookaburra
via:

Sunday, July 24, 2016

Joanna Lumley and the New Ab Fab


[ed. I went to a conference once where Joanna Lumley was the keynote speaker (an Oil Industry and Science conference no less!). She was/is an ardent and articulate advocate for the environment. Who'd have known? I've been (even more of) a fan ever since.]

Back in 1992, a certain Bill Clinton was about to become president, the world’s first text message was sent, the Queen suffered her famed annus horribilis and Absolutely Fabulous aired for the first time, giving us Brits our very own big-haired, fast-living icon at a time when being big-haired and fast-living was only permitted if you were a middle-aged man. Fast-forward to today: a different Clinton is hoping to return to the White House and Absolutely Fabulous is preparing its comeback, this time to the big screen, with her majesty Patsy Stone staggering on, still drinking, swearing, smoking, and unable to remember her real age. It almost makes one wonder if the last quarter of a century actually happened.

Stone’s arrival in the world was a revelation. A hilarious, satirical icon who accurately represented Generation Y. A sexually free, Harvey Nicks-addicted, Stoli-Bolly legend, clutched to the nation’s heart. To get a sense of just how much has happened since Patsy was presented to us, bear in mind that computers were the size of houses, there was no Tinder, email was confined to academics and a woman enjoying a pint was deemed radical enough to earn the nickname “ladette”. If Jennifer Saunders’ Eddie was loved, Pats was adored. No less the woman who embodied her, Joanna Lumley – already a great British acting stalwart.

And, 24 years on, here she is sitting before me in a Soho hotel room in a pair of her granddaughter’s baby-blue Converse. (“They’ve been passed on to me because she’s grown out of them,” confesses Lumley.) The only reason I’m not blown away by her glamour and velvet voice is because both are already so very familiar. For Lumley is a renaissance woman with legions of what she terms “parallel lives”. Her CV encompasses actor, model, Bond girl, TV presenter, journalist, traveller, political campaigner and advocate for over 80 charities. She has become a national treasure here in the UK, where she is currently attempting to change the face of the capital with her controversial pedestrian Garden Bridge across the Thames, and a “daughter of Nepal” after her stalwart championing of Gurkha rights. (Born in Kashmir, her father was a major in the 6th Gurkha Rifles.)

Under normal circumstances, I pride myself on being the voice of objective journalism. However, in this case, I feel honour-bound to report that 70-year-old Ms Lumley kicks ass. There is a Joanna Lumley Research Fellowship at Oxford, while Woman’s Hour named her as one of its 100 most powerful women in Britain. She is basically saving the world, while refusing to make a song and dance about it.

Lumley is, in short, ab fab – which is, of course, why we’re here. Saunders’ hotly awaited Absolutely Fabulous: The Movie is released next week. As I write, no-one has seen it, but I suspect it will delight existing fans of the show while also appealing to a new generation of millennials.

If Ms Clinton needs help taking on Donald Trump, I’m starting to think she could do worse than putting in a call to Ms Lumley…

Everyone idolises Pats. Do you?

I adore Patsy. I think what people, particularly women, see in Ab Fab is that women have friends, they stick with friendships and have the same sort of worries. Eddie worries about whether she’s going to look right, be out of fashion or not be recognised. And women – particularly professional women – have anxieties. Patsy doesn’t have any worries. She doesn’t really care. Most of us mind whether we’ve behaved properly or hurt anybody’s feelings. Patsy never thinks about these things ever, almost like a child. If it doesn’t work, let it go.

Is that liberating?

I like doing her, but I’m really not like her. Well, obviously it’s the same carcass that contains us both. So she’s mine, but I’m not hers, if you know what I mean?

How old is Patsy now?

I just pick ages out because she doesn’t really know how old she is. But Edina is about 65, and I think Patsy is easily 80. She’s pickled, she’s in formaldehyde, and she’s also smoked like a kipper, so she’s kind of undieable.

I hear she was very much your creation.

Well, she didn’t exist. When Jennifer wrote the pilot, she was thinking of maybe a Fleet Street hack, but I didn’t know that. I went along with just lines on the page. I didn’t know Edina’s character. I’d never met Jennifer. So it was suck it and see. We invented a backstory where I could bring in things I knew about, like modelling in the Sixties.

The original series was so much about the Nineties. How did you update it for 2016?

The world has changed and strangely enough caught up with the Ab Fab women because in those days, it was shocking – women drinking too much, staying out, not caring, doing stuff like that. Social media didn’t exist. Hello! had only just started [in 1988]. And now the world is much more sensitive. People take offence at the smallest things, which in those days were just funny. In the future, it’s going to be harder to write anything. And this idea of casting: you’ve got to be the character or you shouldn’t be cast as it. In the old days, we could all dress up and be anybody, but now, you have to be that person, which means a gay person has to be played by a gay actor. You go, “Well, this is the whole point of acting kicked out the window. We all pretend to be other people. We pretend to be older, we pretend to be this or that, we pretend to be different nationalities, we put on accents.” (...)

Do you care about the film’s reception?

A lot of people feel they own Ab Fab and know what they want it to be. And so they might say, “What I wanted was…” What I want this film to be is like a glass of champagne on a summer evening. Just go in, laugh, see who’s in it. It’s a fabulous story made by stunning people. It’s funny! It’s divine!

by Hannah Betts, The Stylist | Read more:
Image: Tony Wilson

Verizon to Pay $4.8 Billion for Yahoo’s Core Business

[ed. Great. Verizon... whose plans are apparently to "combine Yahoo's operations with AOL".  Sounds like a winning strategy to me, how about you? Too bad we didn't get Comcast or AT&T.  And how about that Marissa Mayer? Talk about failing up. This might be of no consequence to most people but I hope to hell they don't screw up Tumblr (wishful thinking, I know). Seems like the best of the web always devolves into the worst possible (corporate) outcome.]

Yahoo was the front door to the web for an early generation of internet users, and its services still attract a billion visitors a month.

But the internet is an unforgiving place for yesterday’s great idea, and on Sunday, Yahoo reached the end of the line as an independent company.

The board of the Silicon Valley company agreed to sell Yahoo’s core internet operations and land holdings to Verizon for $4.8 billion, according to people briefed on the matter, who were not authorized to speak about the deal before the planned announcement on Monday morning.

After the sale, Yahoo shareholders will be left with about $41 billion in investments in the Chinese e-commerce company Alibaba, as well as Yahoo Japan and a small portfolio of patents.

That’s a pittance compared with Yahoo’s peak value of more than $125 billion, reached in January 2000.

Verizon and Yahoo declined to comment about the deal.

Founded in 1994, Yahoo was one of the last independently operated pioneers of the web. Many of those groundbreaking companies, like the maker of the web browser Netscape, never made it to the end of the first dot-com boom.

But Yahoo, despite constant management turmoil, kept growing. Started as a directory of websites, the company was soon doing much more, offering searches, email, shopping and news. Those services, which were free to consumers, were supported by advertising displayed on its various pages.

For a long time, the model worked. It seemed like every company in America — and across much of the world — wanted to reach people using the new medium, and ad revenue poured in to Yahoo.

In the end, the company was done in by Google and Facebook, two younger behemoths that figured out that survival was a continuous process of reinvention and staying ahead of the next big thing. Yahoo, which flirted with buying both companies in their infancy, watched its fortunes sink as users moved on to apps and social networks.

Verizon, one of the nation’s biggest telecommunications companies, plans to combine Yahoo’s operations with AOL, a longtime Yahoo competitor acquired by Verizon last year. The idea is to use Yahoo’s vast array of content and its advertising technology to offer more robust services to Verizon customers and advertisers. Bloomberg first reported the price of the Verizon deal.

Marissa Mayer, who was hired as Yahoo’s chief executive four years ago but failed to turn around the company, is not expected to stay after the deal closes. But she is due to receive severance worth about $57 million, according to Equilar, a compensation research firm. All told, she will have received cash and stock compensation worth about $218 million during her time at Yahoo, according to Equilar’s calculations.

by Vindu Goel and Michael J. de la Merced, NY Times |  Read more: 
Image: Paul Sakuma/Associated Press

ExxonMobil Vows Lenient Treatment For Any Species That Surrenders Voluntarily


IRVING, TX—Addressing the world’s plant and animal life directly during a press event Friday, officials from ExxonMobil vowed to bestow lenient treatment on any species that surrendered to the corporation voluntarily. “I want every bird, fish, marine mammal, and all other flora and fauna to know that any among them who willingly submit to us now without putting up further resistance can expect to be shown a degree of mercy,” said company CEO Rex Tillerson, who added that wildlife will be given a 60-day window to accept the multinational energy conglomerate’s terms and turn themselves in at one of ExxonMobil’s corporate offices. “It is important you understand that your situation is completely hopeless. However, if you end this struggle now and give yourself up to us of your own will, I guarantee you will be spared and treated with a level of dignity, with only a modest punishment. This is far more than I can say for those species who refuse our generous proposal.” Tillerson also offered a substantial reward to any species who provides information about other remaining holdouts.

via: The Onion

Pushing and Pulling Goals

This is a distinction I’ve always found helpful.

A pulling goal is when you want to achieve something, so you come up with a plan and a structure. For example, you want to cure cancer, so you become a biologist and set up a lab and do cancer research. Or you want to get rich, so you go to business school and send out your resume.

A pushing goal is when you have a plan and a structure, and you’re trying to figure out what to do with it. For example, you’re studying biology in college, your professor says you need to do a research project to graduate, and so you start looking for research to do. You already know the plan – you’re going to get books, maybe use a lab, do biology-ish things, and end up with a finished report which is twenty pages double-spaced. All you need to figure out is what you’re going to select as the nominal point of the activity. There’s something perversely backwards about this – most people would expect that the point of a research project is to research some topic in particular. But from your perspective the actual subject you’re researching is almost beside the point. The point is to have a twenty page double-spaced report on something.

School and business are obvious ways to end up with pushing goals, but not every pushing goal is about satisfying somebody else’s requirements. I remember in college some friends set up an Atheist Club. There was a Christian Club, and a Buddhist Club, so why shouldn’t the atheists get a club too? So they wrote the charter, they set a meeting time, and then we realized none of us knew what exactly the Atheist Club was supposed to do. The Christian Club prayed and did Bible study; the Buddhist Club meditated; the Atheist Club…sat around and tried to brainstorm Atheist Club activities. Occasionally we came up with some, like watching movies relevant to atheism, or having speakers come in and talk about how creationism was really bad. But we weren’t doing this because we really wanted to watch movies relevant to atheism, or because we were interested in what speakers had to say about creationism. We were doing this because we’d started an Atheist Club and now we had to come up with a purpose for it.

Sometimes on Reddit’s /r/writing I see people asking “How do you come up with ideas for things to write about?” and I feel a sort of horror. So you want to write a novel, but…you don’t have anything to write about? And you just sit there thinking “Maybe it should be about romance…no, war…no, the ennui of the working classes…or maybe hobbits.” I can understand this in theory – you want to be A Writer – but it still weirds me out.

You may have noticed I don’t really like pushing goals. Part of it is an irrational intuition that they’re dishonest in some way that’s hard to explain. It usually ends up with me trying to figure out what to do my biology research project on, and I think “well, I can’t think of anything I really want to research, so maybe I should just do whatever is easiest”. But if I do whatever is easiest, I feel really bad, and worry maybe I have some kind of obligation to research something important that I care about. So I get my brain tangled up trying to figure out how much easiness I can get away with, then feeling bad for asking the question, then trying to come up with something important I honestly want to do, which doesn’t exist since I wasn’t doing a biology research project the month before my professor assigned it to me and so clearly I am only doing it to satisfy the requirement.

Another part of it is that it’s often a sign something has gone wrong somewhere. In the example of the Atheist Club, that thing might have been starting the club in the first place. But assuming that we genuinely want to start the club, then the presence of a pushing goal means we don’t understand why we wanted to start the club. If we wanted to start it because we wanted to hang out with other atheists, then that offers a blueprint for a solution to the problem – instead of planning all these movies and speakers, we should just hang out. If we did it because we thought it was important for atheism to be more visible on campus, then again, that offers a blueprint for a solution – spend our sessions trying to improve atheism’s campus visibility. If we just sit there saying “I guess we have an Atheist Club now, better think of something to do at meetings”, then it seems like something important hasn’t been fully examined.

by Scott Alexander, Slate Star Codex |  Read more:
Image: none

Saturday, July 23, 2016

Beth Hart

The Substitute

An email blast goes out from the director of composition to all the adjuncts, graduate students, and temporary-contract full-time instructors who teach writing at a large state university. The director of composition is chipper about a professional development opportunity at a neighboring institution, and he’ll reserve space for any of the teacher-persons who want to go, the fee paid by the English Department. The invitation is for a workshop with a nationally recognized composition scholar, and the teachers can carpool. For teachers, professional development means a day without teaching, but a day spent talking about teaching, and this is a welcome and often productive change. Almost as an afterthought, the director of composition ends the email by saying that anyone who teaches on that day and would like to attend the workshop should arrange for a substitute to teach their classes.

If you’ve never heard of substitute teachers for college classes, that’s because they don’t exist. A substitute is a teacher-person who goes from school to school, and from class to class, to sit in when the regular teacher has taken a day off. Since the students who attend high school, middle school, or elementary school are required by law to sit in those classrooms, the job of the substitute is to watch the kids, end of story. The teacher calls their principal and says they won’t be coming in that day and the principal goes down the list until someone agrees to substitute. It’s never been the teacher’s responsibility.

Professional development is important for teachers: professors go to conferences to stay relevant and to make contacts. It used to be that the university would pay for this, and that’s still mostly true, though it’s almost never the case for the kinds of teacher-persons who received the director of composition’s email. The director of composition recognized that these were teacher-persons who needed professional development for different reasons. The adjuncts, the graduate students, and the temporary-contract full-time instructors were all in the same boat, valued for their semi-pro status, and the director of composition would escort them to the nationally recognized scholar like Scouts on their first camping trip. No one believes that the workshop attendees will ever achieve the heights of the national scholar, no matter what their potential, and in the meantime, they’re still expected to adhere to university policy and to be sure the classes are being taught.

The director of composition should have known more than anyone that it would be next to impossible for composition teachers to get substitutes for their classes, due to the number of them attending the workshop and the number of classes those teachers regularly taught. You see, there’s no substitute pool for them to draw from, so they have to ask unpaid favors of each other, because they certainly can’t ask the tenured professors, who have minimal contact with contingent faculty as they come and go from year to year. Despite the good intentions of the director of composition, the contingent teachers have been reminded that they can’t pretend to claim the perks taken for granted by tenured professors and also do their jobs effectively. (...)

When a student tells me they want to be a teacher I suggest they try substitute teaching. It doesn’t cost anything, they don’t need a license, and there’s no prep. They only work the days they want to work, and if they are even semi-competent, the schools will keep calling them back. For my own part, I had good experiences as a substitute teacher. It let me know that I was able to go into a classroom where I was greeted with extremely low expectations, yet I could trick the students into learning despite themselves. Anyone who thinks they want to teach will get a pretty quick sense of whether they still want to teach after they’ve tried substituting.

As sensible as this sounds, this suggestion, that a student should try substituting, is almost always greeted by the student with disdain. We tend to think of ourselves as better than substitutes, even those of us with no teaching experience. And so asking someone to be a substitute for a college class only works if that teacher is already in a position that comes with a degree of disrespect. By comparison, instead of asking “Will you be the substitute for my classes?” a contingent teacher might try out some of these equivalent phrases on their colleagues: “Will you babysit my child? Will you be a server at my wedding? Will you pick up my mom from the airport?” The only reason I need a substitute for my classes is because the director of composition told me I did, and the only reason the director of composition told me I did was because the university administration, a.k.a. the numbers people, told the director that I did.

I’m in a position that comes with disrespect, though my students don’t really know it, because whether I go to a workshop for faculty development or not, I’m a professional. Sometimes, the students who turn their noses up at the suggestion that they try substitute teaching will say, “I don’t want to teach high school. I want to do what you do. I want to teach college.”

I tell them that if they teach in the public secondary or elementary schools they’ll have better job security and will likely be paid more. They might have to deal with student discipline or with standardized tests, but maybe that’s not such a big deal. Maybe it just is. It’s teaching after all. It’s a calling.

Blind faith in the capitalist meritocracy makes it surprising for a student to hear that a professor with a Ph.D. can make less than a teacher with a bachelor’s degree, someone they might think of as more like a smart mom than a scholar. But at the end of the school year that teacher won’t be let go simply because her contract has run out. She’ll almost automatically be kept on the payroll, and she’ll be given a raise. At public secondary or elementary schools it’s easier to keep someone around than to do a new job search, so even without tenure there’s almost always job security. At institutions of higher education, however, they pretty much always run job searches, even when they have someone they want to keep around. A professor on a temporary contract might be asked to reapply for the job they already have, or their renewable contract could be replaced with one that is nonrenewable, because the rules can change without warning.

by John Minichillo, McSweeney's |  Read more:
Image: via:

The Unified Theory of Deliciousness

My first restaurant, Momofuku Noodle Bar, had an open kitchen. This wasn’t by choice—I didn’t have enough money or space to put it farther away from the diners. But cooking in front of my customers changed the way I look at food. In the early years, around 2004, we were improvising new recipes every day, and I could instantly tell what was working and what wasn’t by watching people eat. A great dish hits you like a Whip-It: There’s momentary elation, a brief ripple of pure pleasure in the spacetime continuum. That’s what I was chasing, that split second when someone tastes something so delicious that their conversation suddenly derails and they blurt out something guttural like they stubbed their toe.

The Momofuku Pork Bun was our first dish that consistently got this kind of reaction. It was an 11th-hour addition, a slapped-together thing. I took some pork belly, topped it with hoisin sauce, scallions, and cucumbers, and put it inside some steamed bread. I was just making a version of my favorite Peking duck buns, with pork belly where the duck used to be. But people went crazy for them. Their faces melted. Word spread, and soon people were lining up for these buns.

That became my yardstick: I’d ask, “Is this dish good enough to come downtown and wait in line for? If not, it’s not what we’re after.” A chef can go years before getting another dish like that. We’ve been lucky: Hits have come at the least expected time and place. I’ve spent weeks on one dish that ultimately very few people would care about. And then I’ve spent 15 minutes on something that ends up flooring people like the pork bun.

Believe me, nobody is more surprised about this than I am. Cooking, as a physical activity, doesn’t come naturally to me. It never has. To compensate for my lack of dexterity, speed, and technique, I think about food constantly. In fact, I’m much stronger at thinking about food than I am at cooking it. And recently I started seeing patterns in our most successful dishes that suggested our hits weren’t entirely random; there’s a set of underlying laws that links them together. I’ve struggled to put this into words, and I haven’t talked to my fellow chefs about it, because I worry they’ll think I’m crazy. But I think there’s something to it, and so I’m sharing it now for the first time. I call it the Unified Theory of Deliciousness.

This probably sounds absolutely ridiculous, but the theory is rooted in a class I took in college called Advanced Logic. A philosopher named Howard DeLong taught it; he wrote one of the books that directly inspired Douglas Hofstadter to write Gödel, Escher, Bach. The first day, he said, “This class will change your life,” and I was like, “What kind of asshole is this?” But he was right. I would never pretend to be an expert in logic, and I never made it all the way through Gödel, Escher, Bach. But the ideas and concepts I took away from that class have haunted me ever since.

DeLong and Hofstadter both found great beauty in what the latter called strange loops—occasions when mathematical systems or works of art or pieces of music fold back upon themselves. M. C. Escher’s drawings are a great, overt example of this. Take his famous picture of two hands drawing each other; it’s impossible to say where it starts or ends. When you hit a strange loop like this, it shifts your point of view: Suddenly you aren’t just thinking about what’s happening inside the picture; you’re thinking about the system it represents and your response to it.

It was only recently that I had a realization: Maybe it’s possible to express some of these ideas in food as well. I may never be able to hear them or draw them or turn them into math. But I’ll bet I can taste them. In fact, looking back over the years, I think a version of those concepts has helped guide me to some of our most popular dishes.

My first breakthrough on this idea was with salt. It’s the most basic ingredient, but it can also be hellishly complex. A chef can go crazy figuring out how much salt to add to a dish. But I believe there is an objectively correct amount of salt, and it is rooted in a counterintuitive idea. Normally we think of a balanced dish as being neither too salty nor undersalted. I think that’s wrong. When a dish is perfectly seasoned, it will taste simultaneously like it has too much salt and too little salt. It is fully committed to being both at the same time.

Try it for yourself. Set out a few glasses of water with varying amounts of salt in them. As you taste them, think hard about whether there is too much or too little salt. If you keep experimenting, you’ll eventually hit this sweet spot. You’ll think that it’s too bland, but as soon as you form that thought, you’ll suddenly find it tastes too salty. It teeters. And once you experience that sensation, I guarantee it will be in your head any time you taste anything for the rest of your life.

It’s a little bit like the famous liar’s paradox, which we studied in DeLong’s class. Here’s one version of it: “The following sentence is true. The preceding sentence is false.” As soon as you accept the first sentence, you validate the second sentence, which invalidates the first sentence, which invalidates the second, which validates the first, and on and on.

Most people won’t ever notice this sensation; they’ll just appreciate that the food tastes good. But under the surface, the saltiness paradox has a very powerful effect, because it makes you very aware of what you’re eating and your own reaction to it. It nags at you, and it keeps you in the moment, thinking about what you’re tasting. And that’s what makes it delicious.

This was an important realization for me, because it seemed like I’d discovered an unequivocal law. And I figured if I could find one, there had to be more—a set of base patterns that people inherently respond to. So then the challenge became discovering those patterns and replicating them in dish after dish. If you could do that, you’d be like the Berry Gordy of cooking; you’d be able to crank out the hits.

by David Chang, Wired |  Read more:
Image: Kiernan Monaghan & Theo Vamvounakis