Saturday, March 14, 2015

Fractured Israel

Deeply divided and in a foul mood, Israelis are headed toward what seems like a referendum on their long-serving, silver-tongued prime minister, the hard-line Benjamin Netanyahu.

But with so many of them having despaired of peace talks with the Palestinians, the focus is mostly on Netanyahu's personality, his expense scandals and the soaring cost of living.

And as no candidate is likely to win big in the wild jumble of Israel's political landscape, the outcome of the March 17 election could well be a joint government between Netanyahu and his moderate challenger Isaac Herzog. It would be an irony, because the animosity between the two is overwhelming.

Much has changed in the world since Netanyahu first became prime minister in 1996, but Israel remains stuck with the question of what to do with the highly strategic, biblically resonant, Palestinian-populated lands it captured almost a half-century ago.

Israelis know it is their existential issue, but it seems almost too complex for a democracy. After decades of failed peace talks under every sort of government, the whole festering thing has become such a vexation that politicians seem to fear it, and voters look away.

When he called the early election in November, Netanyahu seemed a shoo-in, but somewhere things went wrong. Notorious around the world for American-accented eloquence in the service of a tough stance, he is extraordinarily divisive at home, where he has been prime minister for the past six years and for nine years in total.

His speech last week before the U.S. Congress, urging a tougher deal on Iran's nuclear program than the one he believes is taking shape, was typical: He impressed some Israelis, while infuriating others who sensed a political ploy.

Polls show his nationalist Likud Party running slightly behind Herzog's Labor Party, rebranded the Zionist Union in a bid for nationalist votes. There are scenarios in which Herzog - improbably mild-mannered in a high-decibel land - becomes prime minister. And that would change the music: Herzog is a conciliator genuinely interested in ending the occupation of lands captured in the 1967 war.

Some things to watch for:

Israel Is Nearly Ungovernable

Despite its reputation for plucky unity, the country is badly fragmented - and that's reflected in parliament under the proportional representation system.

Combined, the two big parties get far less than half the vote. Then one finds a nationalist party appealing to Russian speakers, another for secular liberals and two for the squeezed middle class. A united list represents the one-fifth of citizens who are Arabs and is itself divided between communist, nationalist and Islamist factions. There are four religious parties, for Jews of European versus Middle Eastern descent and for varying degrees of nationalism.

The schisms are real, reflecting a society so diverse that at times it seems to be flying apart. The discourse is of one's rival destroying the country, through stupidity or evil. A TV debate between the main candidates other than Netanyahu and Herzog quickly degenerated into shouted accusations of fascism, criminality and treason.

by Dan Perry, AP |  Read more:
Image: Matanya via Flickr and Wikipedia

Friday, March 13, 2015

Count Five

Shitphone: A Love Story

Last month my fourth iPhone in six years was, in medical terms, crashing. The screen, which had pulled away from its glue, was behaving strangely. The charging port, no matter how thoroughly I cleaned it, only occasionally took power. Repair would be expensive, especially considering that my contract would be up in about six months. Buying a newer iPhone would mean spending $650 up-front, spending $450 with a new two-year contract or amortizing the price with my carrier’s new early upgrade plan. I felt trapped, as every smartphone owner occasionally does, between two much more powerful entities that take me, an effectively captive chain-buying contract iPhone user, for granted. I began to take offense at the malfunctioning iPhone’s familiarity. Our relationship was strained and decreasingly rational. I was on a trip and away from home for a few weeks, out of sorts and out of climate, slightly unmoored and very impatient.

And so the same stubborn retail-limbic response that prevented me from avoiding this mess in the first place — by buying an AppleCare insurance plan — activated once more, and I placed an order I had been thinking about for months: One BLU Advance 4.0 Unlocked Dual Sim Phone (White), $89.99 suggested retail (but usually listed lower), $76.14 open-box with overnight shipping. 1,829 customer reviews, 4.3 stars. “This isn’t the best phone out there, but it is by far the best phone for only around $80–90,” wrote Amazon reviewer Anne.

Yes, Anne, sounds perfect, let’s do it. Shitphone would be delivered the next day.

I’ve been living happily in an electronics shitworld long enough that I’ve begun evangelizing for it. My last television, a Hisense pulled from the storeroom of a North Carolina Walmart by an employee who didn’t know it was there, is a simple and vibrant LED TV with bad sound. My stereo is built around an Insignia receiver (Best Buy house label) that powers speakers from a company called Micca ($55.60, 347 customer reviews, 4.7 stars) and it sounds… pretty good! My router is made by TP-LINK ($18.99, 575 customer reviews, 4.3 stars), and keeps me online about as reliably as my Netgear did. I bought my mother a neat little Baytek Bluetooth speaker for $26.99 (54 customer reviews, 4.6 stars), which she loves, even if its programmed voice draws out the “ess” in “Connected SuccESSSfully” in a way that suggests a strictly mechanical familiarity with English. I impulse-buy off-brand earbuds with mixed results and derive great satisfaction from discovering good ones. I bought a used MacBook for work but use a $280 Chromebook whenever possible. It is my aspirational shit-top, and I consider this situation a failure. Mainstream laptops are far enough along in the commoditization process that, for the purposes of browsing and emailing and chatting and dealing with photos — a near-totality of my computer usage — almost anything available will do. The top-selling laptop on Amazon is a $250 Asus that runs Windows 8. It would suit my needs nicely. We’ll see what happens when the Mac dies.

Off-brand electronics are, like their branded counterparts, interesting for a limited amount of time: The highest-end branded version of a product offers a chance to taste the luxurious future of technology; the shitworld version lets you preview a more practical future — the future most of the global electronics-buying public will actually enjoy. Take the Jambox, a small and dazzlingly expensive prism of speakers and battery and wireless radios that plays music from nearly any phone at a respectable volume; it was a sensation for a few years after its introduction in 2010. By 2013, off-brand speakers were making major inroads online, allowing shoppers like me to feel like we were somehow gaming the system (this requires, of course, a narrow and convenient definition of “the system”). The year after, Amazon, America’s primary portal to consumer electronics shitworld (and recently one of its proud citizens), released its own version of the Jambox concept under the pointedly dull name “AmazonBasics Portable Bluetooth Speaker” (731 customer reviews, 4.4 stars). Soon, basic picnic-ready wireless speakers may become an undistinguished, disposable part of many consumers’ lifestyles, like USB sticks or batteries — a point at which branded versions are a minority sustained only by those consumers looking for Bluetooth speakers that signify luxury, style, or taste. Off-brand electronics are alluring only when they feel like deals — that is, only as long as there are more popular branded alternatives which they can imply are overpriced. They’re interesting, in other words, for as long as they make the buyer feel smart.

Homogeneity is what you should expect from shitphones, because it’s what you get. Buy a BLU or an Unnecto or a Posh Mobile or a Prestigio or a Yezz or an InFocus or an iRulu and you can expect similar boxes of parts, sorted by price point. The guts will likely be low-to-mid-range hardware from MediaTek, which is mostly invisible in the U.S. market but is the second largest supplier of mobile phone systems-on-chip in the world. This means the phones will share not just specifications but quirky features: even some of the cheapest phones let you use two SIM cards, for example, and many of them have an FM radio. The shells, which must fit around MediaTek’s core technology, stick to a few basic styles: For the bigger phones, seamless rectangles of a particular thickness; for the small ones, round-back thick-bezel handsets that evoke the iPhone 3G. For cheaper phones, you’ll get Android 4.2.x. For a few more dollars, Android 4.4.x. As is the case with major brands, shitphones with the latest version of Android, 5.0, are just becoming available.

Premium branded phones are the culmination of decades of research in wireless technology, computing, materials, and design. Shitphones are the culmination of decades of research in wireless technology, computing, materials, and design — minus a year or two. Shitphones are generally not actually shitty. They are, if you isolate them from the distorting effect of highly competitive preference-driven smartphone retail and marketing, the absence of which helps keep them so cheap, marvels of engineering and execution, assembled with precision and care and able to accomplish tasks that a half-dozen years ago would have been inconceivable for a portable device. iPhones are really just shitphones from the future.

This is what commoditization feels like: genuine novelty rapidly reduced to thankless anonymity. The iPhone and its high-end competitors benefited for years as the most visible and functional instance of a profoundly and globally novel product. To be one of the pioneering brands at the beginning of a new technological era — to sell someone his first magical hand device — is to apply a temporary multiplier to everything from brand recognition to loyalty to profit. But their brands, now, are just temporary protective spells cast against the inevitable. As we approach the 10-year anniversary of the release of the iPhone, the category it blew up is starting to feel familiar. By now, an American who purchased a smartphone on contract in 2009 has not just bought but discarded at least three devices, and as smartphones mature, that is the reality of their use: to improve is to disappear just a little more. Aren’t we all just emailing and Instagramming and Facebooking and Snapchatting and WhatsApping and Angry-Birdsing anyway?

by John Herrman, Medium |  Read more:
Image: Hocus-Focus/Getty

Protection Without a Vaccine

Last month, a team of scientists announced what could prove to be an enormous step forward in the fight against H.I.V.

Scientists at the Scripps Research Institute said they had developed an artificial antibody that, once in the blood, grabbed hold of the virus and inactivated it. The molecule can eliminate H.I.V. from infected monkeys and protect them from future infections.

But this treatment is not a vaccine, not in any ordinary sense. By delivering synthetic genes into the muscles of the monkeys, the scientists are essentially re-engineering the animals to resist disease. Researchers are testing this novel approach not just against H.I.V., but also Ebola, malaria, influenza and hepatitis.

“The sky’s the limit,” said Michael Farzan, an immunologist at Scripps and lead author of the new study.

Dr. Farzan and other scientists are increasingly hopeful that this technique may be able to provide long-term protection against diseases for which vaccines have failed. The first human trial based on this strategy — called immunoprophylaxis by gene transfer, or I.G.T. — is underway, and several new ones are planned.

“It could revolutionize the way we immunize against public health threats in the future,” said Dr. Gary J. Nabel, the chief scientific officer of Sanofi, a pharmaceutical company that produces a wide range of vaccines.

Whether I.G.T. will succeed is still an open question. Researchers still need to gauge its safety and effectiveness in humans. And the prospect of genetically engineering people to resist infectious diseases may raise concerns among patients.

“The reality is we are touching third rails, and so it’s going to take some explanation,” said Dr. David Baltimore, a Nobel Prize recipient and virologist at Caltech who is testing I.G.T. against a number of diseases.

Conventional vaccines prompt the immune system to learn how to make antibodies by introducing it to weakened or dead pathogens, or even just their molecular fragments. Our immune cells produce a range of antibodies, some of which can fight these infections.

In some cases, these antibodies provide strong defenses. Vaccinations against diseases such as smallpox and measles can lead to almost complete protection.

But against other diseases, conventional vaccines often fail to produce effective antibodies. H.I.V., for example, comes in so many different strains that a vaccine that can protect against one will not work against others.

I.G.T. is altogether different from traditional vaccination. It is instead a form of gene therapy. Scientists isolate the genes that produce powerful antibodies against certain diseases and then synthesize artificial versions. The genes are placed into viruses and injected into human tissue, usually muscle.

The viruses invade human cells with their DNA payloads, and the synthetic gene is incorporated into the recipient’s own DNA. If all goes well, the new genes instruct the cells to begin manufacturing powerful antibodies.

by Carl Zimmer, NY Times |  Read more:
Image: John Hersey

Okomeya
h/t YMFY
[ed. Musubi, sake and rice. Simple.]

Jonas Wood 
via:

Ryo Takemasa
via:

Thursday, March 12, 2015

Opportunity Gap

[ed. See also: Richer and Poorer.]

The event is billed as a lecture on a new book of social science. But the speaker visiting Cambridge’s Lesley University this Monday night sounds like a political candidate on the hustings. Robert D. Putnam — Harvard political scientist, trumpeter of community revival, consultant to the last four presidents — is on campus to sound an alarm. "What I want to talk to you about," he tells some 40 students and academics, is "the most important domestic challenge facing our country today. I want to talk about a growing gap between rich kids and poor kids."

Two decades ago, Putnam shot to fame with "Bowling Alone," an essay-turned-best-selling-book that amassed reams of data to chart the collapse of American community. His research popularized a concept known as "social capital." The framework, used in fields like sociology and economics, refers to social networks and the norms of reciprocity and trust they create. "He’s one of the most important social scientists of our time," says Gary King, director of Harvard’s Institute for Quantitative Social Science, because of his ability to blend scientific rigor with popular appeal.

But tonight Putnam sets the science aside, at least to start. He opens his Cambridge talk with a story. It’s about two young women, Miriam and Mary Sue. Their families, he says, both originally came from the same small Ohio town. Miriam, who had well-educated parents, went off to an ultra-elite East Coast university. Mary Sue, the daughter of high-school graduates who never held a steady job, ended up on a harrowing path of abuse, distrust, and isolation.

Removing a sheet of paper from a folder — the notes from an interview that one of his researchers conducted with Mary Sue — Putnam reads off the particulars. Mary Sue’s parents split up when she was 5. Her mother turned to stripping, leaving Mary Sue alone and hungry for days. Her only friend until she went to school was a mouse who lived in her apartment. Caught selling pot at 16, she spent time in juvenile detention, flunked out of high school, and got a diploma online. Mary Sue wistfully recalls the stillborn baby she had at 13. She now dates an older man with two infants born to two different mothers.

"To Mary Sue," Putnam says, "this feels like the best she can hope for."

He pauses. "Honestly, it’s hard for me to tell the story."

Miriam is Putnam’s own granddaughter. Mary Sue (a pseudonym) is almost exactly the same age. And the backdrop to this tale is the professor’s hometown of Port Clinton, once an egalitarian community where people looked after all kids, regardless of their backgrounds. In Putnam’s telling, Port Clinton now symbolizes the class disparities that have swept the country in recent decades — a "split-screen American nightmare" where the high-school lot contains one kid’s BMW parked beside the jalopy in which a homeless classmate lives.

"In Port Clinton now, nobody thinks of Mary Sue as one of ‘our kids,’" Putnam says. "They think she’s somebody else’s kid — let them worry about her."

At 74, the professor is embarking on a campaign with one basic goal: getting educated Americans to worry about the deteriorating lives of kids like Mary Sue. It kicks into high gear this week with the publication of his new book, Our Kids: The American Dream in Crisis (Simon & Schuster). The basic argument: To do well in life, kids need family stability, good schools, supportive neighbors, and parental investment of time and money. All of those advantages are increasingly available to the Miriams of the world and not to the Mary Sues, a disparity that Putnam calls "the opportunity gap."

Ever since the Occupy Wall Street movement emerged in 2011, much public discussion has focused on the unequal distribution of income in today’s America. Traditionally, though, that kind of inequality hasn’t greatly concerned Americans, Putnam writes. What they have worried about is a related, though distinct, issue: equality of opportunity and social mobility. Across the political spectrum, Putnam writes, Americans historically paid lots of attention to the prospects for the next generation: "whether young people from different backgrounds are, in fact, getting onto the ladder at about the same place and, given equal merit and energy, are equally likely to scale it."

by Marc Parry, Chronicle of Higher Education |  Read more:
Image: Bryce Vickmark

Cherry Blossoms



[ed. University of Washington, March 12, 2015.]
photo: markk

Seth Avett and Jessica Lea Mayfield

Ultrasound Therapies Target Brain Cancers and Alzheimer’s Disease

From imaging babies to blasting apart kidney stones, ultrasound has proved to be a versatile tool for physicians. Now, several research teams aim to unleash the technology on some of the most feared brain diseases.

The blood-brain barrier, a tightly packed layer of cells that lines the brain's blood vessels, protects it from infections, toxins, and other threats but makes the organ frustratingly hard to treat. A strategy that combines ultrasound with microscopic blood-borne bubbles can briefly open the barrier, in theory giving drugs or the immune system access to the brain. In the clinic and the lab, that promise is being evaluated.

This month, in one of the first clinical tests, Todd Mainprize, a neurosurgeon at the University of Toronto in Canada, hopes to use ultrasound to deliver a dose of chemotherapy to a malignant brain tumor. And in some of the most dramatic evidence of the technique's potential, a research team reports this week in Science Translational Medicine that they used it to rid mice of abnormal brain clumps similar to those in Alzheimer's disease, restoring lost memory and cognitive functions. If such findings can be translated from mice to humans, “it will revolutionize the way we treat brain disease,” says biophysicist Kullervo Hynynen of the Sunnybrook Research Institute in Toronto, who originated the ultrasound method. (...)

Safely and temporarily opening the blood-brain barrier is a long-sought goal in medicine. About a decade ago, Hynynen began exploring a strategy combining ultrasound and microbubbles. The premise is that ultrasound causes such bubbles to expand and contract, jostling the cells forming the blood-brain barrier and making it slightly leaky.

That could help cancer physicians such as Mainprize deliver chemotherapy drugs into the brain. Hynynen also hypothesized that the brief leakage would rev up the brain's inflammatory response against β-amyloid—the toxic protein that clumps outside neurons in Alzheimer's and may be responsible for killing them. Disposing of such debris is normally the role of the microglia, a type of brain cell. But previous studies have shown that when β-amyloid forms clumps in the brain, it “seems to overwhelm microglia,” Bacskai says. Exposing the cells to antibodies that leak in when the blood-brain barrier is breached could spur them to “wake up and do their jobs,” he says. Some antibodies in blood may also bind directly to the β-amyloid protein and flag the clumps for destruction. (...)

This week, neuroscientist Jürgen Götz of the Queensland Brain Institute in St. Lucia, Australia, and his Ph.D. student Gerhard Leinenga report that they have built on Hynynen and Aubert's protocol, using a different mouse model of Alzheimer's. After injecting these animals with a solution of microscopic bubbles, they scanned an ultrasound beam in a zigzag pattern across each animal's entire skull, rather than focusing on discrete areas as others have done. After six to eight weekly treatments, the team tested the rodents on three different memory tasks. Alzheimer's mice in the control group, which received microbubble injections but no stimulation, showed no improvement. Mice whose blood-brain barriers had been made permeable, in contrast, saw “full restoration of memory in all three tasks,” Götz says.

by Emily Underwood, Science |  Read more:
Image: Emmanuel Thevenot/Lab of Isabelle Aubert/ Sunnybrook Research Institute

Wednesday, March 11, 2015

Barack and Me

I couldn’t sleep for shit.

Friday night had turned into Saturday morning, and I was staring at the ceiling in a hotel room in Washington, D.C., only blocks from the White House, recovering from my third hot shower of the night. The fever that had developed from an 11-hour Amtrak trip down the East Coast a day earlier hadn’t left my body, and the only way I knew how to deal with the chills was to take hot showers and hope for the best.

But that wasn’t the real reason for my insomnia and this body-zapping panic: I would be speaking to the president of the United States of America in 10 hours. On Air Force One. Before his speech in Selma, Alabama, on the Edmund Pettus Bridge to commemorate the 50th anniversary of the march that took place on what became known as Bloody Sunday.

On Monday, I had received an email from the White House offering “a potential opportunity with President Obama in the very near future.” The opportunity was to be a part of a roundtable of five journalists who would have 30 minutes to talk with the president.

As the week progressed, however, the stakes grew. With the date inching closer, the details became clearer. On Friday, the final email:
Following brief remarks at the top of the roundtable, the President will take a question from each participant.
As in one question. Zero room for error. My editor’s response was as blunt as it was true: “Better make it count.”

Lying in bed, staring at the ceiling, just a sunrise away from that one question, I still wasn’t sure what I was going to ask. I had written one question down, but I wasn’t convinced it was the question. And I was running out of time.

All I could think about was why I was here. Or, more accurately, what brought me here. I knew what I’d wanted to ask for years. I just didn’t know if, when the time came, I’d actually ask it.

I've been chasing Barack Obama for more than a decade. I watched his 2004 speech at the Democratic National Convention while deep in the throes of college application essays. It was a speech that I needed to hear, a speech that felt as if it were specifically for me. Before I knew it, I was working on Capitol Hill in 2007 as a college intern for Senator Ted Kennedy, where I would occasionally catch a glimpse of the then-Senator Obama traveling on the underground monorail from the Senate to the Capitol floor. I reveled in the excitement when he announced his candidacy that February. I volunteered for that campaign in 2008 in New Hampshire, taking to the streets of New England with a megaphone following his victory, and hoping to one day be a part of his actual staff. In 2011, looking for a way out of graduate school, I applied for a job as a blogger in his reelection campaign — and I almost got that job, before then not getting that job.

My current job — the second attempt to drop out of graduate school — is a result of not getting a job with the Obama campaign. Living in New York is a result of not getting a job with the Obama administration. And my slow crawl away from politics and toward writing is a direct result of chasing — and never quite catching — the world that surrounds President Obama. The chase has felt never-ending. But in a way, I owe everything to the chase.

The chase was on my mind as I rode in a car to Joint Base Andrews on Saturday morning. It’s what I thought about on the shuttle to Air Force One with the four other journalists, Charles Blow from the New York Times, Zerlina Maxwell from Essence, White House correspondent April Ryan from the American Urban Radio Networks, and DeWayne Wickham, a USA Today columnist and dean of Morgan State University’s School of Global Journalism & Communication. And that chase is what I thought of when we arrived at Andrews and stood before Air Force One. (...)

Air Force One is a plane on PEDs. It rumbles with such force that we were told attempting to record the roundtable on our personal devices would be a challenge, and that the stenographer would have a transcript of proceedings ready for us later that day. In terms of size, it appeared to have swallowed two double-aisled commercial airliners. But it’s still a plane. It has wheels, it has wings, it takes off, and it goes into the air.

There were stairs everywhere, and so many rooms. And many of these rooms had doors. The floor plan felt like a labyrinth of narrow walkways, leading to beige area after beige area. Both times I left my part of the cabin by myself, I got lost. And even though I was never lost for more than 10 seconds, I immediately felt that let-go-of-your-mom’s-hand-at–Six Flags lost, scared that I was either going to get in trouble or never find my way back.

Every now and then, during a break in conversation, I’d retreat to my notebook and stare at my question. I’d written a second one focused on Selma, but it wasn’t right. It was a cop-out question. A question anyone could have asked. So I knew what I had to do. I needed to change a word here, move a sentence there, make it more concise, but I knew it was absolutely the type of question I was asked here to put forward.

by Rembert Browne, Grantland |  Read more:
Image: Rembert Browne

The NFL Trade Wheel

[ed. If any of this makes sense to you, back away from the TV, take a deep breath, have a sobering look at yourself and ask... how do I find my way back to the light.]

The first swap of the three involved the most high-profile player of the bunch. With virtually no warning that they were even shopping their star tight end, the Saints sent Jimmy Graham and their fourth-round pick in this year’s draft to the Seahawks for center Max Unger and Seattle’s first-round pick, the 31st overall selection in this year’s draft.

Using the Draft Pick Value Calculator generated by Chase Stuart at Football Perspective, we can estimate the difference in value between the two draft picks. We also have to guess where New Orleans’s draft pick will land, since compensatory picks have yet to be handed out, but it should come within one or two slots of the 110th overall pick. Using those figures, the balance of what the Seahawks sent amounts to the equivalent of the 65th pick in the draft — the first pick of the third round. That certainly sounds a lot less dramatic than dealing a first-round pick for a fourth-rounder.
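[ed. For the curious, the arithmetic behind that estimate is easy to sketch. The short Python script below is a toy illustration, not Chase Stuart's actual Football Perspective chart; the decay curve is an invented stand-in, so its output won't land exactly on the article's 65th-pick figure. The method is the point: value what each side sent, take the difference, and find the single pick closest to that difference.]

import math

def pick_value(pick):
    # Invented stand-in for a draft value chart: a pick's worth decays
    # smoothly with draft position. The real Football Perspective chart
    # is empirical and shaped differently, especially in the late rounds.
    return 30.0 * math.exp(-0.025 * (pick - 1))

# What each side gave up, in chart points.
seattle_sent = pick_value(31)    # the 31st overall pick
saints_sent = pick_value(110)    # the fourth-rounder, assumed to land near 110

net = seattle_sent - saints_sent

# The single pick whose value best matches Seattle's net outlay.
equivalent = min(range(1, 257), key=lambda p: abs(pick_value(p) - net))
print("Seattle's net outlay is roughly equivalent to pick", equivalent)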

It’s not the first time the Seahawks have used their first-round pick in a move to acquire a weapon for Russell Wilson, which is one of the many reasons this deal is so fascinating. Seattle sent a first-round pick to Minnesota two years ago (along with a seventh-round pick in that draft and a third-round pick in 2014) to acquire Percy Harvin in a deal that proved to be a rare misstep for general manager John Schneider, with Harvin missing virtually all of his first season in Seattle with injuries, before being dealt away to the Jets for what ended up being a sixth-round pick. (...)

The part that doesn’t click for me is Seattle adding salary while still owing Wilson and Bobby Wagner new deals. While the Seahawks took cash off their cap in the Harvin deal, they still owe $7.2 million in dead money for Harvin in 2015. The recent contract extension for Lynch gave him $12 million guaranteed, all of which gets paid this season; he has base salaries of $9 million and $7 million in 2016 and 2017, respectively, but the Seahawks could cut him and save $4 million in 2016 or $4.5 million in 2017.

Graham is not cheap for Seattle, even with the Saints eating his $12 million signing bonus. The Seahawks will owe him $8 million in 2015, $9 million in 2016, and $10 million in 2017, assuming they keep the $5 million roster bonus Graham is due Thursday without converting it to a signing bonus. Converting the bonus would free up more cap space in 2015, but would cost the Seahawks if they decided down the line they wanted to move on, so it doesn’t seem like a logical move.

None of the 2016 or 2017 money is guaranteed or would result in dead cap if the Seahawks decide to move on, so this can be anything from a one-year rental to a three-year deal. Given that the Seahawks will likely sign Wilson and Wagner to new deals during the 2015 season, the base salaries owed Graham will be difficult to swallow in the years to come. It wouldn’t be a surprise to see the Seahawks forced to choose between Graham and Lynch in 2016, and if Graham wins, Seattle will likely offer him an extension to free up cap space in 2017.

The megadeals to come are also likely why the Seahawks moved on from Unger. A Pro Bowl–caliber center who has struggled to stay healthy in recent seasons, the 28-year-old Unger was just about due for a new deal. He was entering the third year of a four-year, $26 million extension that has relatively docile cap hits of $4.5 million in both 2015 and 2016. After that, a healthy Unger would have likely expected to see his cap hit double, pointing to the Alex Mack deal as a comparable contract. Seattle couldn’t afford to give Unger that much money, and in trading him now, it was able to get a serious asset who upgraded them at a more meaningful position.

For the Saints, this is a serious repudiation of their all-in philosophy from a year ago and the quality of the team Sean Payton and Mickey Loomis thought they had built. I wrote about their cap woes in December, and while I pointed out the accounting method that would enable them to overcome their $27 million nightmare and get underneath the hard cap, there wasn’t going to be much space to reshape their franchise.

New Orleans had already cut Curtis Lofton and Pierre Thomas this offseason, but to make serious changes to its roster in the years to come, it was going to have to carve one or two of the top salaries off the books. One of those players, apparently, was Graham. While the Saints get $19 million in cap relief over 2016 and 2017, they don’t actually save any money in 2015. With the dead money on his deal, Graham’s cap hold actually rises from $8 million to $9 million. The Saints then add the $4.5 million on Unger’s deal to their 2015 cap, and they’ll also owe an extra $750,000 or so for the salary difference between the draft picks they just traded.
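[ed. A back-of-the-envelope tally, using only the figures quoted above (the $750,000 pick-salary difference is the article's rough estimate, not an exact number), shows why New Orleans saves nothing this year:]

graham_cap_before = 8.0   # $M: Graham's 2015 cap number had the Saints kept him
graham_dead_money = 9.0   # $M: his cap hold after the trade
unger_cap_hit = 4.5       # $M: Unger's 2015 hit, now on the Saints' books
pick_salary_gap = 0.75    # $M: rough extra salary for pick 31 vs. a pick near 110

added_2015 = (graham_dead_money - graham_cap_before) + unger_cap_hit + pick_salary_gap
print("Net 2015 cap cost to New Orleans: $%.2f million" % added_2015)
# prints $6.25 million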

After battling so hard to get underneath the cap of $143 million, New Orleans is already nearly $3 million over the cap. Loomis suggested after the trade that he made the deal to improve New Orleans’s defense, which certainly makes it strange that he traded Graham for a center and not a defensive player.

by Bill Barnwell, Grantland |  Read more:
Image: Chris Graythen, Getty

Dancing Man and the Cult of Well-Intentioned Idiots


[ed. TwitIdiots?]
Read more:

The Revolution Will Probably Wear Mom Jeans

America’s present need is not heroics, but healing; not nostrums, but normalcy; not revolution, but restoration; not agitation, but adjustment; not surgery, but serenity; not the dramatic, but the dispassionate; not experiment, but equipoise; not submergence in internationality, but sustainment in triumphant nationality.

—Warren G. Harding, “A Return to Normalcy,” May 14, 1920

Not long ago, a curious fashion trend swept through New York City’s hipster preserves, from Bushwick to the Lower East Side. Once, well-heeled twentysomethings had roamed these streets in plaid button-downs and floral playsuits. Now, the reign of the aspiring lumberjacks and their mawkish mates was coming to an end. Windbreakers, baseball caps, and polar fleece appeared among the flannel. Cargo shorts and khakis were verboten no longer. Denim went from dark-rinse to light. Sandals were worn, and sometimes with socks. It was a blast of carefully modulated blandness—one that delighted some fashion types, appalled others, and ignited the critical passions of lifestyle journalists everywhere.

They called it Normcore. Across our Fashion Nation, style sections turned out lengthy pieces exploring this exotic lurch into the quotidian, and trend watchers plumbed every possible meaning in the cool kids’ new fondness for dressing like middle-aged suburbanites. Were hipsters sacrificing their coolness in a brave act of self-renunciation? Was this an object lesson in the futility of ritually chasing down, and then repudiating, the coolness of the passing moment? Or were middle-aged dorks themselves mysteriously cool all of a sudden? Was Normcore just an elaborate prank designed to prove that style writers can be fooled into believing almost anything is trendy?

By March 2014, Vogue had declared Normcore totally over, but even that lofty fiat couldn’t put a stop to it. Gap adopted the slogan “dress normal” for its fall ad campaign, and the donnish Oxford English Dictionary nominated “normcore” for 2014’s word of the year. A full twelve months after Vogue tried to extinguish it, Normcore continues to convulse opinion, a half-life long enough (in fashion-time, anyway) to place it among the decade’s most enduring trends.

More than that, elaborate prank or no, Normcore is a remarkably efficient summary of hipster posturing at its most baroque. Never has a trend so perfectly crystallized the endless, empty layers of fashion-based rebellion. And never has a trend shown itself to be so openly contemptuous of the working class. Like many a fad before it, Normcore thrives on appropriation. But where privileged hipsters once looked to underground subcultures—bikers, punks, Teddy Boys—as they pursued their downwardly mobile personal liberation, they now latch onto the faceless working majority: the Walmart shoppers, the suburban moms and dads.

Even if it began as something of a self-referential fashion joke, the media’s infatuation with all things Normcore says a lot. Not least, it highlights our abiding social need for a sanitized counterculture, for a youthful rebellion that can be readily dismissed, for the comfort of neoliberal melancholy, for what Warren G. Harding—the unheralded John the Baptist of the Normcore Gospel—famously called “a return to normalcy.”

The Revolt of the Mass Indie Überelite

The adventure began in 2013, and picked up steam early last year with Fiona Duncan’s “Normcore: Fashion for Those Who Realize They’re One in 7 Billion,” a blowout exploration of the anti-individualist Normcore creed for New York magazine. Duncan remembered feeling the first tremors of the revolution:
Sometime last summer I realized that, from behind, I could no longer tell if my fellow Soho pedestrians were art kids or middle-aged, middle-American tourists. Clad in stonewash jeans, fleece, and comfortable sneakers, both types looked like they might’ve just stepped off an R-train after shopping in Times Square. When I texted my friend Brad (an artist whose summer uniform consisted of Adidas barefoot trainers, mesh shorts and plain cotton tees) for his take on the latest urban camouflage, I got an immediate reply: “lol normcore.”
Brad, however eloquent and charming, did not coin the term himself. He got it from K-HOLE, a group of trend forecasters. To judge by K-HOLE’s name alone—a slang term for the woozy aftereffects of the animal tranquilizer and recreational drug ketamine—the group was more than happy to claim Normcore as its own licensed playground. As company principals patiently explained to the New York Times, their appropriation of the name of a toxic drug hangover was itself a sly commentary on the cultural logic of the corporate world’s frenetic cooptation of young people’s edgy habits. At a London art gallery in October 2013, in a paper titled “Youth Mode: A Report on Freedom,” team K-HOLE proposed the Twitter hashtag #Normcore as a rejoinder to such cooptation:
If the rule is Think Different, being seen as normal is the scariest thing. (It means being returned to your boring suburban roots, being turned back into a pumpkin, exposed as unexceptional.) Which paradoxically makes normalcy ripe for the Mass Indie überelites to adopt as their own, confirming their status by showing how disposable the trappings of uniqueness are.
Jargon aside, the report had a point: lately “Mass Indie überelites”—a group more commonly known as hipsters—have been finding it increasingly difficult to express their individuality, the very thing that confers hipster cred.

Part of the problem derives from the hipster’s ubiquity. For the past several years, hipsterism has been an idée fixe in the popular press—coy cultural shorthand in the overlapping worlds of fashion, music, art, and literature for a kind of rebellion that doesn’t quite come off on its own steam. Forward-thinking middle-class youngsters used to strike fear in the hearts of the squares by flouting social norms—at least nominally, until they grew up and settled into their own appointed professional, middle-class destinies. Now, however, the hipster is a benign and well-worn figure of fun: a lumpenbourgeois urbanite perpetually in search of ways to display her difference from the masses.

by Eugenia Williamson, The Baffler |  Read more:
Image: Hollie Chastain

Frank Roth
via: