Saturday, March 16, 2019

“Robotic” Pill Sails Through Human Safety Study

An average person with type 1 diabetes and no insulin pump sticks a needle into their abdomen between 700 and 1,000 times per year. A person with the hormone disorder acromegaly travels to a doctor’s office to receive a painful injection into the muscles of the butt once a month. Someone with multiple sclerosis may inject the disease-slowing interferon beta drug three times per week, varying the injection site among the arms, legs and back.

Medical inventor Mir Imran, holder of more than 400 patents, spent the last seven years working on an alternate way to deliver large drug molecules like these, and his solution—an unusual “robotic” pill—was recently tested in humans.

The RaniPill capsule works like a miniature Rube Goldberg device: Once swallowed, the capsule travels to the intestines where the shell dissolves to mix two chemicals to inflate a balloon to push out a needle to pierce the intestinal wall to deliver a drug into the bloodstream.

Simple, right?

It may not be simple, but so far, it’s working: Imran’s company, San Jose-based Rani Therapeutics, just announced the successful completion of the first human study of the pill—a 20-person trial that showed a drug-free version of the capsule (roughly the size of a fish oil pill) was well tolerated and easy to swallow, and that it passed safely through the stomach and intestines.

“There were no issues in swallowing the capsule, in passing it out, and, most importantly, no sensation when the balloon inflated and deflated,” says Imran, Rani’s chairman and CEO. (...)

Working from the outside in, the RaniPill consists of a special coating that protects the pill from the stomach’s acidic juices. Then, as the pill is pushed into the intestines and pH levels rise to about 6.5, the coating dissolves to reveal a deflated biocompatible polymer balloon.

Upon exposure to the intestinal environment, a tiny pinch point made of sugar inside the balloon dissolves, causing two chemicals trapped on either side of the pinch point to mix and produce carbon dioxide. That gas inflates the balloon, and the pressure of the inflating balloon pushes a dissolvable microneedle filled with a drug of choice into the wall of the intestines. Human intestines lack sharp pain receptors, so the micro-shot is painless.

The intestinal wall does, however, have lots and lots of blood vessels, so the drug is quickly taken up into the bloodstream, according to the company’s animal studies. The needle itself dissolves. (...)

Imran calls the device a robot, though it has no electrical parts and no metal. “Even though it has no brains and no electronics, it [works through] an interplay between material science and the chemistry of the body,” says Imran. “It performs a single mechanical function autonomously.”

by Megan Scudellari, IEEE Spectrum | Read more:
Image: Rani Therapeutics

Friday, March 15, 2019

Jeremy Corbyn, 1970s Revanchist, Is Suddenly the Face of the New New Left

The politics of Britain and the U.S. can have a strange, synchronized rhythm to them. Margaret Thatcher was a harbinger of Ronald Reagan as both countries veered suddenly rightward in the 1980s. Prime Minister John Major emerged as Thatcher’s moderate successor as George H.W. Bush became Reagan’s, cementing the conservative trans-Atlantic shift. The “New Democrats” and the Clintons were then mirrored by “New Labour” and the Blairs, adapting the policies of the center-left to the emerging consensus of market capitalism. Even Barack Obama and David Cameron were not too dissimilar — social liberals, unflappable pragmatists — until the legacies of both were swept aside by right-populist revolts. The sudden summer squall of Brexit in 2016 and the triumph of Trump a few months later revealed how similarly the Tories and the Republicans had drifted into nationalist, isolationist fantasies.

But what of the parallels on the left? What’s generating activist energy and intellectual ferment in both countries is an increasingly disinhibited and ambitious socialism. Bernie Sanders’s strength in the Democratic Party primaries two years ago was a prelude to a new wave of candidates who’ve struck unabashedly left-populist notes this year, calling for “Medicare for all” and the end of ICE, alongside a more social-justice-oriented cultural message. Some, like the charismatic Alexandria Ocasio-Cortez, have achieved national visibility as an uncomplicated socialism has found more converts, especially among the young. Moderate Democrats have not disappeared, but they are on the defensive. A fight really is brewing for the soul of the Democrats.

And so it seems worth trying to understand what has happened in the Labour Party in Britain in the past few years. In 2015, in a flash, Labour became the most radical, left-wing, populist force in modern British political history. Its message was and is a return to socialism, a political philosophy not taken seriously there since the 1970s, combined with a truly revolutionary anti-imperialist and anti-interventionist foreign policy. This lurch to the extremes soon became the butt of jokes, an easy target for the right-wing tabloid press, and was deemed by almost every pundit as certain to lead the party into a distant wilderness of eccentric irrelevance.

Except it didn’t. Today, Labour shows no sign of collapse and is nudging ahead of the Tories in the polls. In the British general election last year, it achieved the biggest gain in the popular vote of any opposition party in modern British history. From the general election of 2015 to the general election of 2017, Labour went from 30 percent of the vote to 40 percent. It garnered 3.6 million more votes as a radical socialist party than it had as a center-left party. Hobbled only by a deepening row over anti-Semitism in its ranks, Labour will be the clear favorite to form the next government if the brittle Tory government of Theresa May falls as a result of its internal divisions over Brexit.

This success — as shocking for the Labour Establishment as for the Tories — has, for the moment at least, realigned British politics. It has caused Tony Blair, the most successful Labour prime minister in history, to exclaim: “I’m not sure I fully understand politics right now.” It comes a decade after the 2008 crash, after ten years of relentless austerity for most and unimaginable wealth for a few, and after market capitalism’s continued failure to meaningfully raise the living standards of most ordinary people. When the bubble burst ten years ago, it seemed as if Brits were prepared to endure an economic hit, to sacrifice and make the most of a slow recovery, but when growth returned as unequally distributed as ever, something snapped. The hearing the hard left has gotten is yet more evidence that revolutions are born not in the nadir of economic collapse but rather when expectations of recovery are dashed.

Revolution is not that much of an exaggeration. In the wake of capitalism’s crisis, the right has reverted to reactionism — a nationalist, tribal, isolationist pulling up of the drawbridge in retreat from global modernity. Perhaps it was only a matter of time before the left reacted in turn by embracing its own vision of an egalitarian future unimpeded by compromise or caveat. This is the socialist dream being revived across the Atlantic, and not on the fringes but at the heart of one of the two great parties of government.

Democrats should pay attention. Labour’s path is the one they narrowly avoided in 2016 but are warming to this fall and in 2020. It’s an English reboot of Clinton-Sanders, with Sanders winning, on a far more radical platform. And, politically, it might just work.

At the center of this story is a 69-year-old socialist eccentric, Jeremy Corbyn, who never in his life thought he would lead any political party, let alone be credibly tipped to be the next British prime minister. The parallels with Sanders are striking: Both are untouched by the mainstream politics of the past 30 years, both haven’t changed their minds on anything in that time, both are characterologically incapable of following party discipline, both have a political home (Vermont; Islington in London) that is often lampooned as a parody of leftism, and both are political lifers well past retirement age who suddenly became cult figures for voters under 30. (...)

Born into an upper-middle-class family, Corbyn was a classic “red diaper” baby. His parents were socialists, and Yew Tree Manor, the 17th-century house in rural Shropshire he grew up in (which his parents renamed Yew Tree House), was a bohemian, left-leaning, capacious, book-filled salon. As Rosa Prince’s biography Comrade Corbyn details, he rebelled at his high school, joined the Young Socialists at the age of 16, and has told a journalist that his main interests as a teen were “peace issues. Vietnam. Environmental issues.” He graduated with such terrible grades that college was not an option, so he joined Britain’s equivalent of the Peace Corps and decamped to Jamaica for two years, then traveled all over Latin America. Appalled by the rank inequality he saw around him, he radicalized still further, and when he came back to Britain, he moved to multiracial North London, with a heavy immigrant population (today, less than half of those living in his district identify as “white British”). It was as close to the developing world as Britain got. And he felt at home.

Even then, he was an outlier on the left. Sympathetic to the goals of the Irish Republican Army, hostile to the monarchy, supportive of Third World revolutionary movements, campaigning for unilateral nuclear disarmament, the young Corbyn was also opposed to Britain’s membership in NATO and what was to become the European Union, because he despised the American alliance and the EU’s capitalist ambitions. He was ascetic: averse to drink and drugs, a vegetarian, interested in hardly anything but attending meetings, and meetings, and meetings. His first wife left him in part because he was never home, always building the movement; his second because he refused to agree to send their son to a selective high school, rather than to a local one open to all abilities. He was the perpetual organizer, the kind who made sure everyone had a cup of tea before a meeting and for whom no fringe activist group was too small to tend to. Decades later, Corbyn has barely shifted on any of these beliefs — and has only recently agreed to sing the national anthem (he long refused to because it invokes the queen). As Labour leader, though, Corbyn has compromised some: The party’s official position in the Brexit referendum was for remaining, and the party manifesto in 2017 supported NATO. He first became a member of Parliament for Islington North in 1983, at the height of Thatcherism, and has held his seat ever since, his majority increasing in all but two of his seven campaigns. (In 2017, he won with a crushing 73 percent of the vote.) His fierce local support tells you something else about Corbyn. Despite his extreme views, he is, by all accounts, a model in attending to the concerns of his local voters and has made remarkably few political enemies of a personal nature. He’s soft-spoken, sweet, invariably cordial, and even his foes concede his deep personal integrity. 

When I was in London recently, I spoke with people across the political spectrum, from Blairites to Tories to Corbynistas, and I couldn’t find anyone who disliked him personally. Politically, sure — with venom. But as a human being? No. That’s rare for someone who’s been in Parliament for 35 years. (...)

When I visited Britain this past spring, I was struck by how deep and bitter the divide within Labour remains. “Yeah, it was real intense. Yeah, really intense, really intense. You know, we all lost friendships,” one Corbynista told me of the leadership campaigns and their aftermath. This was Sanders-versus-Clinton-level animosity, but in a smaller, more concentrated pool. I listened to one Labour moderate after another denounce Corbyn’s politics as “sinister” and “incompetent,” even evil. And I saw Corbyn supporters’ faces grow red and their lips curl whenever I mentioned the dissenters. Each of them insisted I tell no one I’d interviewed them. If Labour’s divisions these past few years are any guide, the Democrats’ internal fight could get brutal by 2020.

The central question, of course, is one Corbyn’s opponents have had a hard time answering. Why was the far left able, in its darkest hour, to take over the Labour Party and then come remarkably close in the general election? Some still argue that it’s a fluke. What won Corbyn the leadership was simply his authenticity, they say, compared with the packaged pols who ran against him. And what gave him his general-election surge, they explain, was one of the worst Tory campaigns in memory, a stiff and incompetent performance by the prime minister, Theresa May, who refused even to debate Corbyn one-on-one. Others claim that Corbyn did well precisely because no one thought he could win and so it was a consequence-free vote. Many pro-EU Tory voters may also have used the occasion to vent against their party leadership and vote Labour as a protest. And perhaps all of these factors played a part.

But what’s also unmissable is how deep a chord Corbyn struck. Like Trump, he was a murder weapon against the elite. More specifically, he was the vessel through which the losers of the neoliberal post-Thatcher consensus expressed their long-suppressed rage. And the anger is not hard to understand. There’s no question that, since Thatcher, Britain has regained its economic edge. Its economy for quite a while outperformed those of its European partners, unemployment was relatively low, and London transformed from a dreary city into a global capital. But at the same time, most public- and private-sector wages were stagnating badly and economic inequality soared. From 2010 onward, public spending was slashed under a rigorous austerity program. Hikes in college tuition forced a new generation into deeper debt as interest rates on student loans rose to 6 percent and higher. The new jobs that were created were increasingly low-paid and precarious. Imagine the U.S. economy of the past two decades but with serious cuts to entitlements and public spending instead of the 2009 Recovery Act and tax relief.

by Andrew Sullivan, New York Magazine |  Read more:
Image: Rick Findler/PA Images via Getty Images
[ed. Fascinating (to me anyway... not knowing much about British politics).]

Pink Restaurants Used to Be Edgy

Now They’re (Mostly) Derivative Instabait.

Restaurant design trends come and go: Dark walls, bare bricks, and Edison bulbs give way to white-washed spaces accented with natural wood and succulents. The latest restaurant-interior fad, however, is not a checklist of design hallmarks, but a single color: pink.

Pink is everywhere in dining today: On restaurant walls (see June’s All Day in Austin, Gabrielle in Charleston, Cha Cha Matcha in New York City), in logos and branding (Momofuku Milk Bar’s neon-inspired logo, Tartine Manufactory’s espresso bean bags), and even in the food and drinks themselves (hello, radicchio del Veneto and hibiscus-spiked cocktails). You’ll find pink to-go bags at the fast-casual chain Dig Inn, pink kitchen cabinet doors for your Ikea kitchen from Los Angeles-based Semihandmade, and pink tableware from trendy direct-to-consumer brand Year & Day. The color now seems to be visual shorthand for healthy-leaning, fashion-forward dining destinations.

The pink restaurant trend is, of course, a subtrend of the overall rise of pink — and yes, by “pink” I mean “millennial pink,” which I prefer to think of as the “new” pink. The new pink spans a broad spectrum, from a dusty, grayish blush to salmon, often with a bit of dirtiness to its tone; while its hue varies, it is universal in what it is not: bubblegum pink, hot pink, fuchsia. The new pink has taken over fashion, packaging design, and residential and commercial interiors. It’s a rare tsunami of a single color dominating across categories. Leatrice Eiseman, a color consultant and executive director of the Pantone Color Institute, attributes this cross-category color trending to our increasingly connected digital age. “In the 20th century, it took seven years for a color to migrate from fashion into the home,” says Eiseman. “Today it’s almost instantaneous.”

2014 was a breakout year for the new pink. Disrupter beauty brand Glossier launched with its signature pink packaging. Wes Anderson’s The Grand Budapest Hotel opened in theaters with vivid doses of pink throughout the film, including the namesake hotel’s exterior and the perfectly pink boxes that fill Mendl’s bakery. It was also the year that architect and designer India Mahdavi and artist David Shrigley opened their redesign of the Gallery at Sketch in London.

The Gallery at Sketch is the restaurant that spawned dozens of rosy imitators. Speaking about the design to Lauren Collins in the New Yorker last year, Mahdavi said, “Today we’re subjected to spending a lot of time dealing with these cold digital interfaces. I think we’re seeking visual comfort.” In an email, Mahdavi further explained this idea of pink as visual comfort: “It reminds me of my childhood growing up in Cambridge, Massachusetts, in the mid-’60s — from strawberry milkshake to the color of the typical objects of that period.” Perhaps the Gallery at Sketch’s instantaneous popularity was due not only to its cinematic look, but in part to that feeling of comfort it offered.

Following pink’s breakout year, the color enjoyed a slow and steady rise until our current moment of peak pink. Throughout the 2010s, rosé (and its 2016 Instagram-darling cousin frosé) has also experienced increasing popularity, with seemingly no end in sight: 2017 saw sales up 53 percent in the U.S., according to Nielsen. As rosé gained more and more cultural brain space, so did pink. In 2016, Pantone named Rose Quartz 13-1520 one of two colors of the year (perhaps not coincidentally, Rose Quartz 13-1520 is the same Pantone color Mahdavi referenced for Sketch).

Later that same year, writer Veronique Hyland coined the term “millennial pink” in a piece for the Cut, writing: “But ask yourself: Do I like this because I like this or because I’m buying back my own re-packaged childhood in the form of blush-toned lip gloss and stickers?” If the trend had gone away, I would have been inclined to answer that it was the latter, but the new pink remains popular, suggesting its pull runs deeper than marketers’ influence or personal nostalgia.

In the era of Trump and #MeToo, the new pink’s appeal may also lie in what it is not: the bright, garish pink of Barbie and Victoria’s Secret that the modern feminist has spent her life eschewing. “Today’s pinks are not connected with cutesy baby-doll concepts,” says Eiseman. “There is a bit of power in it.”

By email, Mahdavi echoed this idea of power in her design for the Gallery at Sketch, writing: “Pink is treated in a very radical and masculine way.” That strength and the surprise of pink’s power is what appeals to brands that aim to market their feminist credentials, like the all-women’s co-working space the Wing, which decked out its flagship location in pastel pink and has used the shade in every subsequent location, or menstrual panty company Thinx, which chose a muted pink for its launch advertisements.

Restaurant interiors overall have taken a turn to softer, lighter colors and playful design elements, perhaps as a reaction to the dark, heavy, almost industrial designs that had been the norm. (Just take a look at Eater’s picks for the most beautiful restaurants to open last year: You’ll see botanical-patterned wallpaper, pastel upholstery, and whimsical color galore.) At the beginning of the 21st century, design-forward restaurants were predominantly “masculine” and moody, furnished with reclaimed lumber, featuring exposed brick, and lit by bare bulbs. Will Cooper, chief creative officer at ASH NYC, says his team notices this contrast every time the firm’s recently opened Candy Bar, a pastel-pink jewel box of a bar at the Siren Hotel in Detroit, appears in roundups of the best bars. “We’re always the pastel pink outlier,” he laughs, noting that his team settled on pink after imagining the glamorous people who might have visited the hotel at its opening in 1926, looking to Los Angeles’s Perino’s, an old Hollywood hot spot, for inspiration.

This visual transformation reflects a transformation of the way we eat today: trendy restaurants have moved away from the bacon-fueled richness that characterized early-aughts dining, opting instead for breezier, vegetable-centric fare that can comprise an all-day menu. Plus, the pink decor trend has roots in this new wellness-adjacent way of eating. Dimes, the influential hipster-health-food restaurant on New York’s Lower East Side, opened in 2014 with one pink-topped table that became so desirable as an Instagram backdrop that the restaurant had to do away with the table altogether. Bread & Circus in Sydney, Australia, opened in 2011, is one of the progenitors of the all-day-cafe concept. Outfitted with pink tile, cabinets, and dishes (the same ones you’ll find at its sister restaurant, the all-pink Carthage Must Be Destroyed in Bushwick, Brooklyn), it may also be the first to pair pink decor and healthy cuisine. In turn, decorating your restaurant in rosy hues may create a health halo for your brand.

The famous table at Dimes might be the biggest clue to why so many designers have been, ahem, “inspired” by Mahdavi’s design: Pink gets an awful lot of likes (according to the New Yorker profile of Mahdavi, Sketch is reportedly the most Instagrammed restaurant in London). Perhaps restaurateurs see diners flocking to restaurants like Sketch and think pink will lure in customers. Some are unsubtle in their Instagram baiting, such as Pietro, a very pink Italian restaurant that opened in Manhattan in 2016 that has emblazoned its motto, “Pink as fuck,” on menus, takeaway cups, and T-shirts you can buy as souvenirs. (Pietro’s designer, Jeanette Dalrot, told the New York Times that the Memphis Group, another decor trend du jour, was her inspiration for Pietro). In going long on pink, restaurateurs are also appealing to Instagram’s core demographic: 68 percent of the platform’s users are women.

by Laura Fenton, Eater |  Read more:
Image: Ed Reeve courtesy of India Mahdavi

In the Kingdom of Mitch

What comes after McConnellism?

The worst political leaders have a way of unifying their opposition. George W. Bush, with utter lack of self-awareness, campaigned on the delusion he could be “a uniter, not a divider.” By the end of his presidency, the nation was united in the judgment that his two terms had been a divisive and bloody mess of war and financial calamity. When Trump’s corruption and incompetence eventually drag the economy down, he’ll face the same reckoning.

But there’s another leader who has great unifying potential. At times he seems capable of the impossible: bringing America’s centrists, liberals, and leftists into a sincere if tenuous alliance. Yes, we stand together in our mutual contempt for the loathsome, unctuous, chinless invertebrate known as Senate Majority Leader Mitch McConnell. We despised him when he made it the defining cause of his career to fend off any kind of campaign finance reform and to protect the power of the wealthy to buy whatever politicians and policies they saw fit. We despised him when he made up a new rule that prevented a twice-elected president from choosing a Supreme Court justice in his final year. We despised him as recently as last week, when he stated that a package of anti-corruption and election-reform legislation that passed the House will not get a hearing in the Senate “because I get to decide what we vote on.”

The cynicism that oozes from McConnell was never more apparent than in the early weeks of this year when, after repeatedly warning Trump against declaring a national emergency to fund a border wall, he then announced he would support the rogue president’s craven power grab. The emergency declaration, he said, “is the predictable and understandable consequence of Democrats’ decision to put partisan obstruction ahead of national interest.” That’s a sentence-wide glimpse into what makes McConnell so detestable. When called to defend his views in public, he reveals himself not as the stately Bluegrass compromiser found in his own imagination, but as the unscrupulous lackey of a brazen and unstable president.

A while ago, when I confessed my splenetic feelings about McConnell to a friend who is a veteran Washington journalist, he reacted with a verbal shrug: “He’s just a pure partisan.” Or is he something worse? After McConnell caved to Trump on the national emergency, former Democratic speechwriter Michael A. Cohen assailed the Senate leader as a “Republican nihilist” in the New York Review of Books. “He is a remorselessly political creature, devoid of principle, who, more than any figure in modern political history has damaged the fabric of American democracy,” Cohen wrote. “That will be his epitaph.”

Cohen cited a previous NYRB consideration, by historian Christopher R. Browning, that likened McConnell to Paul von Hindenburg, the German president who aided Hitler’s rise to power. “If the U.S. has someone whom historians will look back on as the gravedigger of American democracy, it is Mitch McConnell,” Browning wrote. “He stoked the hyperpolarization of American politics to make the Obama presidency as dysfunctional and paralyzed as he possibly could.”

Centrists such as Norman Ornstein of the American Enterprise Institute have seen McConnell as a key figure in accommodating the extremism of the Republican Party and its destruction of long-prevailing rules and norms. For his obstructionist role in the Obama era, Ornstein wrote last year, McConnell “will go down in history as a villain.” More pointedly, perhaps, writers on the left have also indicted McConnell. Our own Baffler contributor Maximillian Alvarez wrote in this space almost two years ago of his “unhealthy obsession” with McConnell’s odiousness. “Everyone knows McConnell is a slimy hypocrite,” Alvarez wrote. “He is the soulless corpse that’s left when every fantasy about how politics is supposed to be is stripped away.”

Everyone also knows that eventually his time will pass (the man turned 77 last month) and so we speculate about what his lasting legacy will be. Political scientist David Faris, in last year’s It’s Time to Fight Dirty, called McConnell “Kentucky’s dollar-store Machiavelli” and speculated that “future historians . . . will almost certainly write about Mitch McConnell the way today’s scholars write about Joseph McCarthy or Andrew Johnson—as dangerous scoundrels whose machinations imperiled both the American democratic experiment as well as vital civil rights for millions of people.”

by Dave Denison, The Baffler | Read more:
Image: uncredited
[ed. Yup. Worst of the worst. Trump is the symptom, McConnell is the disease.]

The Best of a Bad Situation

This is what extinction feels like from the inside.

In our age of Republican minority despotism, attempts to grapple with anthropogenic climate destruction have been warped to encourage several varieties of despair, rendered acute by the ticking-time-bomb nature of the problem. The losses suffered by Earth and its populations — plant and animal — are neither reversible nor remediable. There is no future filled with reparations. There is no long moral arc. Ten or fifteen years ago it was possible to think of the polar bear and the white rhinoceros as martyrs, dying off to shame us into better harmony with the natural world. Not ruined archaic torsos but videos of extinct creatures would say, “You must change your life.” The same hope held with respect to coral reefs, forests, and certain small Pacific Islands. A dark glimmer of progressive thinking (the “bargaining phase,” as it were) was discernible in the Kyoto Protocol and at the Paris conference, where the prime minister of Tuvalu’s call to impose a strict not-to-be-exceeded target of a 1.5-degree-Celsius rise in global temperature — the minimum required to save his people from a homeless future in a world hostile to refugees and immigrants — was dismissed in favor of pragmatic mitigating maneuvers intended to induce the cooperation of holdout nations such as the United States, Russia, and Saudi Arabia.

At least now we can see things clearly — if only we could focus on the problem. Whatever they may say or tweet, the Trump Administration is not in denial about climate change. In fact, it has the perverse distinction of being the first US administration to address it head-on. In 2000, we had a presidential candidate who understood the perils facing us, even if he underplayed them to try to get elected. (By a margin of one United States Supreme Court justice, he was not elected.) Instead, the Bush Administration pretended climate change did not exist, though back then it was called global warming; “climate change” was a Bush/Rove term of obfuscation that eventually carried the day, even among scientists. President Obama spoke softly about the seriousness of human-driven climate change in public while his administration chipped away at automobile emissions and provided token green-energy incentives. These may have been the correct policies for a major, developed nation . . . in the early 1990s. But like much else after the financial crisis in 2008, the opportunity for a visionary shift in national focus — one that would have required investment at least equal to that being poured into the unwinnable war on terror — was bartered away to chase after an illusory political consensus with the terminally uncompromising opposition.

By contrast, from its first days the Trump presidency brought a series of cabinet appointments and executive orders clustered around the single purpose of hastening ecological collapse: Bring back coal! Shackle and corrupt the EPA! Remove climate change information from government websites! Withdraw from the Paris Agreement! A candidate whose platform called for pushing carbon dioxide levels past the frontier of scientists’ most dire predictions could not have expressed that desire more swiftly or succinctly. It was almost as if that were the whole point. As indeed it was.

There are two clearheaded ways to deal with what’s happening to the Earth. One is to Manhattan-Project the implementation of clean energy sources and immediately stop burning fossil fuels. We also need to ditch the patriarchal models of wealth and status reproduction that have been constitutive of nearly all expansionist, war-making, and resource-depleting societies of the past ten thousand years. While we do that, we can try to ameliorate the many catastrophes that have already been set in motion.

The other way, the path we’re on currently, is to concede that billions of people will see their economic and cultural lives ruined before dying off at a scale to make the casualties of World War II appear insignificant — and “gameplan” not to be among them. That’s what “winning” in the climate-changed future amounts to, and that’s the world the Republican Party has committed itself — and the rest of us — to endure: a social-Darwinist survival of the “fittest,” “wealthiest,” or most prepared, at least in the sense of stockpiling the most guns and canned food. It’s been painfully apparent since the term ecological refugee was popularized by a UN report in the mid-1980s that unthinkable numbers of people would be forced into migration in coming decades by climate change. Immigration, national borders, and food, water, and energy distribution will be the central issues facing all governments. From there it’s a short step, if it’s even a step at all, to a vehement resurgence of open racism and bigotry among those with the good fortune to inhabit the least immediately vulnerable areas, be they the highlands of Burma, the fertile Pannonian plain of Hungary, or the plunder-enriched sprawl of the United States.

The looming prospect of a panoply of belligerent, Blut und Boden regimes has always been one of the scariest potential political outcomes of widespread ecological collapse. Through a series of accidents and “influences,” we got our version early in the United States. We can and should get rid of it, but the paranoid energies that enabled its triumph are durable and already have pervaded much of the world. Trumpism is our first national response to climate change, and it’s a brutal, fearful, vengeful, and gloating response — one that predicts and invites warfare on a global scale. For all the terrible statistical projections, alarming models, and buried reports, what’s most immediately terrifying to the human imagination about climate change is the revelation of how large numbers of our species behave under conditions of perceived threat, scarcity, and danger. (...)

Truly, we have fucked it up in so many ways! Yet while climate change increasingly feels like an inescapable doom upon humanity, our only means of recourse remains political. Even under the heavy weather of present and near-future conditions, there’s an imperative to imagine that we aren’t facing the death of everyone, or the end of existence. No matter what the worst-case models using the most advanced forecasting of feedback loops may predict, we have to act as if we can assume some degree of human continuity. What happens in the next decades is instead, as the climate reporter Kate Aronoff has said, about who gets to live in the 21st century. And the question of who gets to live, and how, has always been the realm of politics.

The most radical and hopeful response to climate change shouldn’t be, What do we give up? It should remain the same one that plenty of ordinary and limited humans ask themselves each day: How do we collectively improve our overall quality of life? It is a welfare question, one that has less to do with consumer choices — like changing light bulbs — than with the spending of trillions and trillions of still-available dollars on decoupling economic growth and wealth from carbon-based fuels and carbon-intensive products, including plastics.

The economist Robert Pollin makes a convincing case that only massive investment in and commitment to alternative energy sources stands any chance of lowering emissions to acceptable levels. All other solutions, from “degrowth” to population control, will fall well short of intended targets while causing greater societal pain and instability. To achieve a fairly modest 40 percent reduction in carbon emissions within twenty years, Pollin suggests in a recent New Left Review essay, we would have to invest, per year, “1–1.5 per cent of global GDP — about $1 trillion at the current global GDP of $80 trillion,” and continually increase that investment, “rising in step with global growth thereafter.” Whether we call this a Manhattan Project for renewable, sustainable energy or a Green New Deal, as Pollin and politicians like Alexandria Ocasio-Cortez have named it, the point is to steer the political discourse around climate change away from both the mindless futurism that proposes large-scale “geoengineering” projects and the fruitless cap-and-trade negotiations at the mercy of obstructionists. Only a great potlatch of what we have can save us from a bonfire of the vanities on a planetary scale.
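Pollin’s arithmetic is simple enough to sketch in a few lines of Python. The growth rate and twenty-year horizon below are illustrative assumptions of mine, not figures from his essay; only the $80 trillion starting GDP and the 1–1.5 percent investment share come from the passage above.

```python
# A rough sketch of Pollin's clean-energy investment arithmetic.
# Assumed for illustration: global GDP starts at $80 trillion and
# grows 2.5% per year; we invest 1.5% of GDP annually, "rising in
# step with global growth."

gdp = 80.0        # global GDP, in trillions of dollars
growth = 0.025    # assumed annual GDP growth rate
share = 0.015     # share of GDP invested in clean energy

total = 0.0
for year in range(1, 21):      # a twenty-year horizon
    investment = gdp * share   # rises in step with GDP
    total += investment
    gdp *= 1 + growth

print(f"cumulative 20-year investment: ${total:.1f} trillion")
```

With these assumed numbers the investment starts at $1.2 trillion a year and totals roughly $30.7 trillion over two decades, which gives a sense of the scale of spending Pollin has in mind.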

In the short term, a true Green New Deal would need to be more like a Green Shock Doctrine. As hurricanes, fires, and floods pile up, each one would provide the occasion to unhook more people from the fossil-fuel grid. At the scale Pollin envisions, it would be naive to assume that a switch from fossils to renewables could happen smoothly. There would be disruptions to almost every aspect of economic life, including food supplies, the power grid (even the internet!), and daily work rhythms and commutes. There would be black markets in banned fuels, and even some forms of violence, like the current populist French riots against Macron’s gasoline taxes. If even such small measures aimed at reducing carbon consumption result in such aggressive pushback, there is no reason to be moderate. Compared with what awaits us if we continue as we are, such shocks are as a rainstorm to a hurricane, or the 1977 blackout of New York City to the bombing of Dresden.

The economic costs of climate change can already be measured by toting up the losses incurred during every single hurricane, wildfire, drought, and war of the past ten years or longer. Because these costs have not yet been borne by any of the major stakeholders in the US or — really — the global economy, they are written off as the price of doing business. No sane group of investors or empowered body of citizens, however, would make these trade-offs to ensure a few more years of short-term profits when measured against the prospects of what would be the last and most profound crash in the history of capitalism.

by The Editors, N+1 |  Read more:
Image: Amanda Means, Light Bulb 00BY1. 2007

Thursday, March 14, 2019

Los Lobos

Gentrification Is a Feature, Not a Bug, of Capitalist Urban Planning

Capitalism and state planning have a complicated relationship. Capitalist ideology insists that markets are the best mechanism for economic, social, and environmental decision-making, and that consumer choice is the fairest and most efficient arbiter of public will. Deregulation has been the byword of the business class for decades, and diminished government has been the goal of conservative politicians at all levels.

Grover Norquist of the right-wing Americans for Tax Reform famously claimed he wanted to shrink government “to the size where I can drag it into the bathroom and drown it in the bathtub.”

That’s what capitalists say; it’s not really what they do. Capitalists and political conservatives are quick to call for an expansion of the state when it comes to its carceral capacities or its military might, and those expressions of state power have been ballooning budgets at the local, state, and federal levels. Big businesses love the kinds of complex regulations that keep smaller firms from competing with them; they can hire armies of lawyers to whack through the weeds, while their competitors get mired in the muck. They herald expansions of state power that increase inequalities and suppress insurgencies as government doing its job.

On the level of city planning and land use policy, the rhetoric and the reality are similarly mismatched. Capitalists have serious and specific demands of the state, without which they are unlikely to function in the long term, or even on a day-to-day basis. They want the state to make big, fixed-capital investments in infrastructures that enable their own profit-making. They also want government to ensure some degree of support for people’s social reproduction, in order to assure they have a living, breathing workforce to exploit in the first place. Without these investments — planned, paid for and coordinated by the state — they have little basis on which to operate.

The Contradictions of Capitalist Planning

Look a little closer, however, and some important cracks arise. In his classic 1986 book Planning the Capitalist City, Richard Foglesong analyzes the relationship between capitalism and city planning as it evolved in the United States from the colonial period through the 1920s. He frames the book around two primary contradictions: one he calls “the property contradiction,” and the other “the capitalist-democracy contradiction.”

The property contradiction arises because capitalists demand certain planning interventions from the state to enable their mode of accumulation, but then deny the utility of planning as some sort of socialist sickness. Crucially, beyond certain fundamentals, urban capitalists do not want the same things from city planners. Their demands crudely break down along industry lines. Manufacturing capitalists might bristle at environmental regulations that curb their abilities to exploit land, water, and air without legal consequences. They could, however, be broadly supportive of planning interventions meant to cool rising land and housing prices, as they view land as a cost factor of production and housing prices as a cause around which their workers could rally and demand higher wages.

Real estate capitalists, on the other hand, might welcome environmental regulations that limit pollution if they see smog and grime as factors that might bring down the value of their buildings. They would not, however, cheer the state for imposing rent controls or building high-quality public housing, as those measures might threaten their very business model. Planners, then, must manage a double bind: meeting the competing demands of various types of capitalists, without doing so much planning that the capitalists freak out.

In trying to thread that needle, urban planners face the capitalist-democracy contradiction. Actual capitalists — those who own the means of production, not just those who think like them — are always the numerical minority. In a republican government and a capitalist economy, planners must incorporate the working class into their process or risk a legitimacy crisis. At the same time, however, they are entrusted to appease the capitalists for whom the system is designed to work. To navigate this dilemma, cities have devised elaborate land use review systems (in which public comment is encouraged but non-binding) and public city planning commissions (which are generally staffed by real estate experts and business elites).

According to this model, urban planners’ main job is to contain these two contradictions; neither can be resolved, but both can be managed. It’s a complicated bind. They are supposed to make certain land use interventions, but are prevented from making more sweeping changes. Their process must be open to the public, while simultaneously guaranteeing that ultimate power resides in the hands of propertied elites. It can be a pretty shitty job. (...)

With real estate concentrating and manufacturing dispersing, the relationship between urban capital and urban planning has shifted in important ways. If manufacturers no longer make up a powerful capitalist constituency for lower central city land and housing costs, planners managing “the property contradiction” are really only hearing from real estate capitalists and those aligned with their growth agenda, who are calling for policies that push land and property values ever-upward. Even when attempting to solve urban quandaries that have little to do with real estate directly — education, transportation, parks, etc. — real estate capital demands planning interventions that enhance speculation. (...)

Whatever problems planners attack, the solutions they propose are likely to include luxury development as a key component — even when that problem is a lack of affordable housing. Planners in the real estate state are tasked with stoking property values: either because they are low and investors want them higher, or because they are already high and their deflation could bring down an entire budgetary house of cards. Working to curb speculation and develop public and decommodified housing seem like absurd propositions to a planning regime whose first assumption is that future public gains come first through real estate growth.

In this system, gentrification is a feature not a bug. It is surely an economic and social force, but it is also the product of the state — a planned process of channeled reinvestment and targeted displacement. Urban planners, however, are not just corporate tools or government stooges. For the most part they join the profession to have a positive impact on cities. Many come from radical backgrounds and see planning as a means to impose control on capital’s chaos. But under the strictures of the real estate state, producing space for purposes other than profit is an enormous challenge.

by Samuel Stein, Jacobin |  Read more:
Image: A view of the Hudson Yards development zone. Stephanie Keith/Getty
[ed. See also: New York's Hudson Yards is an ultra-capitalist Forbidden City (The Guardian).]

Wednesday, March 13, 2019

Ketamine: Now by Prescription

Last week the FDA approved esketamine for treatment-resistant depression.

Let’s review how the pharmaceutical industry works: a company discovers and patents a potentially exciting new drug. They spend tens of millions of dollars proving safety and efficacy to the FDA. The FDA rewards them with a 10ish year monopoly on the drug, during which they can charge whatever ridiculous price they want. This isn’t a great system, but at least we get new medicines sometimes.

Occasionally people discover that an existing chemical treats an illness, without the chemical having been discovered and patented by a pharmaceutical company. In this case, whoever spends tens of millions of dollars proving it works to the FDA may not get a monopoly on the drug and the right to sell it for ridiculous prices. So nobody spends tens of millions of dollars proving it works to the FDA, and so it risks never getting approved.

The usual solution is for some pharma company to make some tiny irrelevant change to the existing chemical, and patent this new chemical as an “exciting discovery” they just made. Everyone goes along with the ruse, the company spends tens of millions of dollars pushing it through FDA trials, it gets approved, and they charge ridiculous prices for ten years. I wouldn’t quite call this “the system works”, but again, at least we get new medicines.

Twenty years ago, people noticed that ketamine treated depression. Alas, ketamine already existed – it’s an anaesthetic and a popular recreational drug – so pharma companies couldn’t patent it and fund FDA trials, so it couldn’t get approved by the FDA for depression. A few renegade doctors started setting up ketamine clinics, where they used the existing approval of ketamine for anaesthesia as an excuse to give it to depressed people. But because this indication was not FDA-approved, insurance companies didn’t have to cover it. This created a really embarrassing situation for the medical system: everyone secretly knows ketamine is one of the most effective antidepressants, but officially it’s not an antidepressant at all, and mainstream providers won’t give it to you.

The pharmaceutical industry has lobbyists in Heaven. Does this surprise you? Of course they do. A Power bribed here, a Principality flattered there, and eventually their petitions reach the ears of God Himself. This is the only possible explanation for stereochemistry, a quirk of nature where many organic chemicals come in “left-handed” and “right-handed” versions. The details don’t matter, beyond that if you have a chemical that you can’t patent, you can take the left-handed (or right-handed) version, and legally pretend that now it is a different chemical which you can patent. And so we got “esketamine”.

Am I saying that esketamine is just a sinister ploy by pharma to patent and make money off ketamine? Yup. In fact “esketamine” is just a cutesy way of writing the chemical name s-ketamine, which literally stands for “sinister ketamine” (sinister is the Latin word for “left-handed”; the modern use derives from the old superstition that left-handers were evil). The sinister ploy to patent sinister ketamine worked, and the latest news says it will cost between $590 and $885 per dose.

(regular old ketamine still costs about $10 per dose, less if you buy it from a heavily-tattooed man on your local street corner)

I’ve said it before: I don’t blame the pharma companies for this. Big Government, in its infinite wisdom, has decided that drugs should have to undergo tens of millions of dollars worth of FDA trials before they get approved. No government agencies or altruistic billionaires have stepped up to fund these trials themselves, so they won’t happen unless some pharma company does it. And pharma companies aren’t going to do it unless they can make their money back. And it’s not like they’re overcharging; their return on investment in R&D may already be less than zero. This is a crappy system – but again, it’s one that occasionally gets us new medicines. So it’s hard to complain.

But in this case, there are two additional issues that make it even worse than the usual serving of crappiness.

First, esketamine might not work.

Johnson & Johnson, the pharma company sponsoring its FDA application, did four official efficacy studies. You can find the summary starting on page 17 of this document. Two of the trials were technically negative, although analysts have noticed nontechnical ways they look encouraging. Two of the trials were technically positive, but one of them was a withdrawal trial that was not really designed to prove efficacy.

The FDA usually demands two positive studies before they approve a drug, and doesn’t usually count withdrawal trials. This time around, in a minor deviation from their usual rules, they decided to count the positive withdrawal trial as one of the two required positives, and approve esketamine. I suspect this was a political move based on how embarrassing it was to have everyone know ketamine was a good antidepressant, but not have it officially FDA-approved.

But if ketamine is such a good antidepressant, how come it couldn’t pass the normal bar for approval? Like, people keep saying that ketamine is a real antidepressant, that works perfectly, and changes everything, unlike those bad old SSRIs which are basically just placebo. But esketamine’s results are at least as bad as any SSRI’s. If you look at Table 9 in the FDA report, esketamine did notably worse than most of the other antidepressants the FDA has approved recently – including vortioxetine, an SSRI-like medication.

One possibility is that esketamine was studied for treatment-resistant depression, so it was only given to the toughest cases. But Table 9 shows olanzapine + fluoxetine doing significantly better than esketamine even for treatment-resistant depression.

Another possibility is that clinical trials are just really tough on antidepressants for some reason. I’ve mentioned this before in the context of SSRIs. Patients love them. Doctors love them. Clinical trials say they barely have any effect. Well, now patients love ketamine. Doctors love ketamine. And now there’s a clinical trial showing barely any effect. This isn’t really a solution to esketamine’s misery, but at least it has company.

Another possibility is that everyone made a huge mistake in using left-handed ketamine, and it’s right-handed ketamine that holds the magic. Most previous research was done on a racemic mixture (an equal mix of left-handed and right-handed molecules), and at least one study suggests it was the right-handed ketamine that was driving the results. Pharma decided to pursue left-handed ketamine because it was known to have a stronger effect on NMDA receptors, but – surprise! – ketamine probably doesn’t work through NMDA after all. So there’s a chance that this is just the wrong kind of ketamine – though usually I expect big pharma to be smarter than that, and I would be surprised if this turned out to be it. I don’t know if anybody has a right-handed ketamine patent yet.

And another possibility is that it’s the wrong route of administration. Almost all previous studies on ketamine have examined it given IV. The FDA approved esketamine as a nasal spray – which is a lot more convenient for patients, but again, not a lot of studies showing it works. At least some studies seem to show that it doesn’t. Again, usually I expect big pharma not to screw up the delivery method, but who knows?

Second in our litany of disappointments, esketamine is going to be maximally inconvenient to get.

The big problem with regular ketamine, other than not being FDA-approved, was that you had to get it IV. That meant going to a ketamine clinic that had nurses and anesthesiologists for IV access, then sitting there for a couple of hours hallucinating while they infused it into you. This was a huge drawback compared to eg Prozac, where you can just bring home a pill bottle and take one pill per day in the comfort of your own bathroom. It’s also expensive – clinics, nurses, and anesthesiologists don’t come cheap.

The great appeal of a ketamine nasal spray was that it was going to prevent all that. Sure, it might not work. Sure, it would be overpriced. But at least it would be convenient!

The FDA, in its approval for esketamine, specified that it could only be delivered at specialty clinics by doctors who are specially trained in ketamine administration, that patients will have to sit at the clinic for at least two hours, and realistically there will have to be a bunch of nurses on site. My boss has already said our (nice, well-funded) clinic isn’t going to be able to jump through the necessary hoops; most other outpatient psychiatric clinics will probably say the same.

This removes most of the advantages of having it be intranasal, so why are they doing this? They give two reasons. First, they want to make sure no patient can ever bring ketamine home, because they might get addicted to it. Okay, I agree addiction is bad. But patients bring prescriptions of OxyContin and Xanax home every day. Come on, FDA. We already have a system for drugs you’re worried someone will get addicted to, it’s called the Controlled Substances Act. Ketamine is less addictive than lots of chemicals that are less stringently regulated than it is. This just seems stupid and mean-spirited.

The other reason the drugs have to be given in a specially monitored clinic is because ketamine can have side effects, including hallucinations and dissociative sensations. I agree these are bad, and I urge patients only to take hallucinogens/dissociatives in an appropriate setting, such as a rave. Like, yeah, ketamine can be seriously creepy, but now patients are going to have to drive to some overpriced ketamine clinic a couple of times a week and sit there for two hours per dose just because you think they’re too frail to handle a dissociative drug at home?

I wanted to finally be able to prescribe ketamine to my patients who needed it. Instead, I’m going to have to recommend they find a ketamine clinic near them (some of my patients live hours from civilization), drive to it several times a week (some of my patients don’t have cars) and pay through the nose, all so that some guy with a postgraduate degree in Watching People Dissociate can do crossword puzzles while they sit and feel kind of weird in a waiting room. And then those same patients will go home and use Ecstasy. Thanks a lot, FDA.

And the cherry on the crap sundae is that this sets a precedent. If the FDA approves psilocybin for depression (and it’s currently in Phase 2 trials, so watch this space!) you can bet you’re going to have to go to a special psilocybin clinic if you want to get it. Psychedelic medicine is potentially the future of psychiatry, and there’s every indication that it will be as inconvenient and red-tape-filled a future as possible. If you thought it was tough getting your Adderall prescription refilled every month, just wait.

by Scott Alexander, Slate Star Codex |  Read more:
Image: Janssen Pharmaceuticals, Inc

Tuesday, March 12, 2019

Steak, Booze and a Sense of Dull Dread: What Really Happens at the NFL Combine

An NFL general manager stands in his suite at Lucas Oil Stadium watching the combine workouts. I'm not using his name; even though he's merely admitting what everyone privately acknowledges, he worries about saying it aloud because the combine is such a growth industry for the NFL. After years of coming to Indianapolis, he now understands that his presence here -- everyone's presence -- is simply to play a small part in a televised show, even if real futures are at stake. The players are running on the field down below, and they are running on the screens playing all around him, broadcast by the NFL Network. From his suite, this GM can barely read the names and numbers on their jerseys, so he watches on TV. Like most guys, he has an iPad where the stats and scores and results automatically update in his draft software. Except the results are always posted faster on the live television broadcast than in his own system. That's what cues his sense of dull dread: If I can just watch this on television, and if I don't even really care about the results anyway, then why exactly am I here?

Day One, Part I

Wednesday night, my first at the combine, first stop, first drink: a Guinness at the J.W. Marriott hotel bar, the front porch of the NFL combine. I nursed the beer and watched the football world stalk the room, looking for someone who might have information or want information. An agent named Kyle Strongin pulled up a chair. A long time ago, he worked at Ole Miss, which is in the town where I live, so we swapped Coach O stories and caught up on life. This year, he had three clients at the combine: Wisconsin running back Alec Ingold, South Carolina lineman Zack Bailey and Clemson cult hero receiver Hunter Renfrow.

He liked his guys, and he pulled out his phone to show me a picture that kind of sums up the singular question hovering over the combine: What can a team tell about a player by looking at him run, lift weights and flex? Kyle's photo showed the now-viral image of Ole Miss' D.K. Metcalf standing with his shirt off, his chest swollen and rippled. D.K. sent me two crying-laughing emojis when I texted him after it first hit Twitter, when his 1.9 percent body fat melted the internet. In Kyle's photoshopped version, next to him was Renfrow, short and skinny, looking exactly like the kind of player a teammate might mistake for a manager, or maybe a waterboy -- which actually happened his freshman year at Clemson.

Then Kyle's photo listed both their stats against Alabama.

Renfrow put up better numbers.

All Renfrow has ever done is catch big passes in big games and help his team win. The most recent Super Bowl MVP, Julian Edelman, is a player like him. And still, Renfrow's agent spent the week of the combine working to convince people to trust themselves and not a series of athletic tests that don't actually reveal much about a football player's future. That's the funny thing. The combine is a place where you can watch the battle between facts and narrative play out: Even though the smart football minds said they didn't learn much from the results, the drills being broadcast created an image that a player would have to struggle to shed. Hunter was in town fighting group-think about his size and speed. One scouting guy told Renfrow's agent, "I wouldn't draft him but he'd start for us."

So Renfrow needed to do well enough to let his career define him instead of these times and reps. Strongin told me that Hunter would run his 40-yard dash on Saturday. If he could run in the low 4.5s, then a team would draft him in the third or fourth round as a starting slot receiver.

If he ran much slower than that, he might not get drafted at all.

Day One, Part II

This year's combine was my first, which made me not quite prepared for the daily marathon: from morning coffee at the J.W. Starbucks, where the new Browns head coach would ask for his coffee cool enough to chug -- "kids' temperature," one barista said to another; to the convention center where nearly a thousand reporters look for state secrets about hamstrings and muscle cramps; to one of several wood-paneled, masculine steakhouses like St. Elmo, with its horseradish-spiked shrimp cocktail; to a restaurant bar named Prime 47, where nearly every night ends up in a haze of passed business cards, whispered gossip and behavior some coaches would rather not hit the internet. A lot of secrets get told, news broken. Alcohol numbs everyone's deeply hardwired urge to lie.

The NFL is famously secretive and paranoid, so these bars in Indianapolis are among the few places in the world where you can actually ask a straight question and get a true answer. The curtain gets yanked back and, like in the movie, the guy pulling the levers always seems smaller up close. There's John Elway eating at P.F. Chang's. There's Dan Marino drinking chardonnay. There's Sean Payton dining with two reporters in a dark steakhouse. There's every agent and scout and general manager moving in a carefully defined orbit around downtown Indianapolis. Prime awaits at the end of the night. It's a verb. Let's Prime.

Women react strongly to the predatory energy at Prime -- "Soooo many men," a female reporter said, standing next to me in a corner -- and most of the women I work with have stories, some of which make you roll your eyes and some of which make you ball up your fists. Around 2 a.m., I sat at the bar and watched someone grab the waitress' ass. When I pulled the waitress aside to ask if she was OK, she smiled thinly and said, "Welcome to the combine." (...)

For reporters and coaches and scouts, the combine is part work and part play, like a legal convention in Las Vegas or something. Yes, there's combine stuff to do, but that all feels secondary on the ground to drinking expensive wine and eating big steaks at places like St. Elmo's -- the emotional center of Indy during the combine, with its great light and high ceilings.

Normally their most popular steak is the filet. The week of the combine it was the dry-aged Tomahawk ribeye. Big cabernets flew out of the cellar, Jerry Jones buying his large formats of Silver Oak -- jeroboams and methuselahs, son -- while smaller fish pop for 750s of Caymus. A St. Elmo's staff member said the combine crowds don't buy the really great wine, just wine that normal people will recognize as expensive. The strut is more important than the taste. Drinks flowed. Shrimp cocktails arrived, and huge steaks, too: bone-in, medium rare. A reporter sent a round of tequila shots to a table of Patriots PR people. They'd had quite the week, after the Orchids of Asia. Outside, some NFL guys walked down the street joking about needing to find a massage parlor to get a "Krafty." (...)

The scouts know this week doesn't matter, but the league knows that fans will watch on television and that talk radio and popular culture will turn this into an essential event on the annual sporting calendar. That's the tension that everyone can feel even if they can't articulate it. The whole thing has the whiff of reality television, with a twist: As these kinds of drills become less and less relevant to the best minds in the game, they become more and more important in the culture. Imagine if getting kicked off "The Bachelor" meant you had to stay single forever.

by Wright Thompson, ESPN |  Read more:
Image: Joe Robbins/Getty Images

To Brits, Trump Makes Dubya Look Smart

Sometimes even those of us who take great pride in our writings come across pieces that inspire us to be more elegant, more precise, just more literate. I have a fondness for the King’s English and once had an editor, a woman in her late 70s and very, very English, ask me why I wrote as though I were writing in the 19th century. I, of course, not missing a beat, replied that I thought that was the ultimate compliment. She vehemently disagreed. Oh well, to each their own.

But just a while ago I came across the accompanying article that I found to be so profoundly poignant yet humorous that it led me to one of those moments where my only reaction was the proverbial “I wish I had said that.” So I would like to share it with those who may not have been introduced to such witty repartee as they perused their daily dose of Facebook.

So with no further suspense I would like to offer the following, which appeared on February 11, 2019, on the internet site Quora.com, for your reading pleasure.

Someone on Quora asked “Why do some British people not like Donald Trump?” Nate White, an articulate and witty writer from England, wrote the following response:
A few things spring to mind. 
Trump lacks certain qualities which the British traditionally esteem. 
For instance, he has no class, no charm, no coolness, no credibility, no compassion, no wit, no warmth, no wisdom, no subtlety, no sensitivity, no self-awareness, no humility, no honour and no grace – all qualities, funnily enough, with which his predecessor Mr. Obama was generously blessed.
So for us, the stark contrast does rather throw Trump’s limitations into embarrassingly sharp relief. 
Plus, we like a laugh. And while Trump may be laughable, he has never once said anything wry, witty or even faintly amusing – not once, ever. 
I don’t say that rhetorically, I mean it quite literally: not once, not ever. And that fact is particularly disturbing to the British sensibility – for us, to lack humour is almost inhuman. 
But with Trump, it’s a fact. He doesn’t even seem to understand what a joke is – his idea of a joke is a crass comment, an illiterate insult, a casual act of cruelty. 
Trump is a troll. And like all trolls, he is never funny and he never laughs; he only crows or jeers. 
And scarily, he doesn’t just talk in crude, witless insults – he actually thinks in them. His mind is a simple bot-like algorithm of petty prejudices and knee-jerk nastiness. 
There is never any under-layer of irony, complexity, nuance or depth. It’s all surface. 
Some Americans might see this as refreshingly upfront. 
Well, we don’t. We see it as having no inner world, no soul. (...)
And worse, he is that most unforgivable of all things to the British: a bully. 
That is, except when he is among bullies; then he suddenly transforms into a snivelling sidekick instead. 
There are unspoken rules to this stuff – the Queensberry rules of basic decency – and he breaks them all. He punches downwards – which a gentleman should, would, could never do – and every blow he aims is below the belt. He particularly likes to kick the vulnerable or voiceless – and he kicks them when they are down. 
So the fact that a significant minority – perhaps a third – of Americans look at what he does, listen to what he says, and then think ‘Yeah, he seems like my kind of guy’ is a matter of some confusion and no little distress to British people, given that: 
• Americans are supposed to be nicer than us, and mostly are. 
• You don’t need a particularly keen eye for detail to spot a few flaws in the man.
This last point is what especially confuses and dismays British people, and many other people too; his faults seem pretty bloody hard to miss. 
After all, it’s impossible to read a single tweet, or hear him speak a sentence or two, without staring deep into the abyss. He turns being artless into an art form; he is a Picasso of pettiness; a Shakespeare of shit. His faults are fractal: even his flaws have flaws, and so on ad infinitum.
by Lance Simmens, LA Progressive |  Read more:
Image: uncredited

Here’s to Naps and Snoozes

A few months ago, two Americans arrived for a meeting at a sprawling, corporate campus in Sichuan Province in China. (They asked not to be named because their work is confidential.) To get to the conference room, they crossed a vast span of cubicles where hundreds of young engineers were busy at their desks, a scene replicated on every floor of the 10-storey building. The meeting was to discuss a dense, text-heavy document, and it began with the client reviewing the day’s agenda: they’d talk until 11am, break for lunch, have nap time, and then start again at 2pm.

Lunch was in a cafeteria the size of a football field where women with hair nets and soup ladles regulated the movement of a column of people. The visitors lost sight of their hosts, so they got into line, bolted down their meal, and retraced their way to the building where they’d had their meeting. When the elevator door opened, the window blinds were drawn, the computer screens were off, and the whole floor lay in grey shadow. The workday could have been over but for the fact that people lay about everywhere, as switched off as the ceiling lights.

The Americans hadn’t seen anything like it since morning-after scenes at their college fraternities. They had to step over some bodies. Other people were tilted forward in their seats with their faces on their desks, like they’d been knocked out from behind, while others still had cleared their desks and lay on them face-up.

The Americans hoped that their hosts, upper-tier executives, would be awake in the meeting room, but they were just as dead to the world as everyone else. One of the Americans coughed into his fist. No one stirred. There were still 45 minutes to go till the 2pm meeting. So he took a seat and pretended to join the mass nap. He didn’t feel like sleeping and would have felt too vulnerable even if he did, but it was a tight space, the woman facing him, a lawyer, was snoring away, and he was afraid that, if she woke up, she’d think he was staring at her. ‘I figured it was safer if I just closed my eyes,’ he told me.

The ordeal ended, finally, with a gong. The lights came back on, music (a military march) played, and people just opened their eyes and resumed their working posture. Nap time was done.

That the incident seemed strange illustrates how people raised in the United States (or who identify with its values) often think about sleep: we can be dominated and bullied by early risers, and tend to look down upon other customs such as siestas.

These are some of our conventions: a person should not sleep too long – as a matter of personal virtue and social capital, the less the better. The average American sleeps for 6 hours 31 minutes during the working week, the least of any country but Japan (6 hours 22 minutes). The higher limit of what you can admit to is eight hours. Sleep is a waste of time, robbing you of the finite resource of conscious, productive time. Collective nap times or public sleeping bring to mind nurseries and nursing homes. You don't sleep with co-workers, ever, in any sense of the term. If you really have to sleep, you slink off somewhere out of view and, if anyone asks, you manufacture an alibi, or say something like: 'I just wanted to close my eyes,' as if to plead a felony charge down to a misdemeanour. Or you call it a 'power nap', as if it were really a strength-training session at the gym.

‘Every society is judgmental about its core issues of value,’ said Carol Worthman, a biological anthropologist at Emory University in Atlanta. But when it comes to sleep, the need for safety – versus value judgment – seems to have prevailed in cultures beyond our own. Indeed, in Worthman’s research around the world, sleep has emerged as both more flexible and more social than one would think from the perspective of the West. ‘Human sleep evolved in risky settings that fostered complex sleep architecture and regulation of vigilance in sleep to suit local circumstances,’ she writes in Frontiers Reviews; and those circumstances varied from place to place.

by Todd Pitock, Aeon |  Read more:
Image: Jason Lee/Reuters

Ghost, Come Back Again

A boarding school in the British Isles. Reverent children huddle in a gloomy chamber, watching as one of their fellow students assays a devilishly difficult trick. The boy’s hand trembles. And then — success! A jet of fire, a “cold and beautiful purple-blue enchantment,” fills the ancient tower with an indescribable illumination.

Is this Hogwarts? Are these boys practicing spells that might one day protect the world from evil? No, it’s Seabrook College, the Dublin boys’ school of Paul Murray’s heartfelt and profane new novel, “Skippy Dies” — and that “magnificent plume of flame” isn’t coming from a wand. Boys in close quarters will always, always find a way to make their own miracles.

The extravagantly entertaining “Skippy Dies” chronicles a single catastrophic autumn at Seabrook from a good 20 different perspectives: students, teachers, administrators, priests, girlfriends, doughnut shop managers. At the center of it all is Daniel Juster, known as Skippy, whose death — on the floor of Ed’s Doughnut House, just after writing his beloved’s name on the floor in raspberry filling — opens the novel. “Skippy Dies” then flashes back to the months preceding, months in which the gloomy, doomed 14-year-old falls in love, wins a fight, keeps a secret and attracts the attention of members of the faculty who do not have his best interests at heart.

Along the way we get to know Skippy's friends and tormentors, each drawn with great affection: Ruprecht, Skippy's doughy genius roommate, who pursues experiments in string theory despite spending much of his time head-down in the toilet; Dennis, "an arch-cynic whose very dreams are sarcastic"; Carl, Skippy's romantic rival and a budding psychopath; Lori, the possibly unworthy object of Skippy's affections, who's obsessed with a Britney-like pop tart and who keeps her diet pills hidden in her teddy bear's tummy. (...)

Our guide to Seabrook's staff room, meanwhile, is "Howard the Coward" Fallon, Seabrook '93, once a Skippyish nerd but now a history teacher at his alma mater. (The book is set in the early part of this decade, in the midst of the Celtic Tiger economic boom.) "I suppose I thought there'd be more of a narrative arc," Howard, working on an early midlife crisis, confides to a colleague, even though his life has in fact been a perfectly structured disappointment — beginning with that persistent schoolboy nickname, through his failure as a futures trader, up to his current position trying to get snoozing nitwits to care about World War I.

In a reflective moment, Howard thinks that his classes themselves resemble trench warfare, “a huge amount of labor and bloodshed for a dismally small area of terrain.” So uninterested in the past are his students that they indiscriminately refer to any time before today as “days of Yore.” But when he attempts to jump-start the boys’ enthusiasm with an impromptu excursion to a war memorial, he’s berated by Seabrook’s efficiency-obsessed acting principal: “Do you think this is some kind of a ‘Dead Poets Society’ situation we’re in here, is that it?”

Living with a nice American writer whom he can't bring himself to marry, Howard is as adrift romantically as he is professionally. He's ripe for an awakening, and it comes courtesy of Aurelie McIntyre, a fetching substitute geography teacher whose presence has turned the entire student body into dazed geography buffs. She empties Howard's mind just as effectively, for the adults of Seabrook are as in thrall to their whims and appetites as their spotty, shame-faced students are.

That’s not always a source of comedy, of course, especially to readers for whom the book’s religious-school setting will call to mind a decade of news about the sexual abuse of children by priests. “Skippy Dies” doesn’t shy away from this issue. In fact, Seabrook’s students come to suspect a priest of abuse, although it’s to Murray’s credit that the man is neither exactly as guilty as you think, nor quite as blameless as you might hope.

In fact, the ambitious length of “Skippy Dies” allows Murray to take on any number of fascinating themes. One of the great pleasures of this novel is how confidently he addresses such disparate topics as quantum physics, video games, early-20th-century mysticism, celebrity infatuation, drug dealing, Irish folklore and pornography — as well as the sad story of the all-Irish D Company of the Seventh Royal Dublin Fusiliers, sent to their doom at Gallipoli in 1915. There’s even room for an indecent close reading of Robert Frost’s “Road Not Taken” that’s so weirdly convincing I’ll never again be able to read that poem without sniggering.

Murray confidently brings these strands together, knitting them into an energetic plot that concerns Skippy's death — and his roommates' attempts to contact him afterward — but also expands into an elegy for lost youth. For Murray remembers, better than most writers, the "grim de-dreamification" of growing up. You won't be a pop singer or a ninja superspy in the future. You won't be exceptional at all, despite the promises of TV, video games and your parents. "Santa Claus," Murray notes, "was just the tip of the iceberg."

by Dan Kois, NY Times |  Read more:
Image: Rutu Modan
[ed. Highly recommended. See also: Paul Murray and ‘Skippy Dies’ (Paris Review).]