Sunday, May 14, 2017


Craig Ferguson, Yakitori alley, Tokyo

Ash Wednesday

I’m worried about my trip up to New York to attend a party.

I worry that I am not traveling with a young assistant who is far more skillful at pushing the buttons on my iPhone (or laptop, if I hadn’t drowned the keyboard in coffee and lost the damned thing even before I became that comfortable using it) than I am. I worry about the latest Theory of Everything (this decade it’s ADD), which does such a good job of holding people in their various social tracks, so that someone who is dyslexic (like me) is also said to have “a form of ADD.” My friend Bob Woof tells me a third of the people who will be at the party have it too. Our mutual friend Eric says Bob is practically a hoarder, which makes it likely he has a touch of it himself. But despite both of those worries, I’m on the bus and headed north.

Bob has invited me to one of his Prime Timers parties on Sunday evening, March 5, 2017. He’s been inviting me to these gatherings for more than a year, but this time I’ve decided to accept and write a few notes on it as well. (In a notebook. I can’t handle them any other way.)

I’m combining the trip with another visit I’ve been wanting to make for several years, to see my old fuck buddy, Maison, and his husband, Fred, who live farther upstate near Poughkeepsie: I’ll continue by Metro-North train and stay with them Monday night, March 6, and Tuesday night, March 7, before returning to New York City by train and, after a walk across town from Grand Central Station to Port Authority with my grey plastic rollaway and my grubby white Zabar’s bag, back to Philadelphia on the morning of Wednesday, March 8—on a Peter Pan bus.

But that’s getting ahead of things.

The Prime Timers is a group of older New York City–based gay men who have a sex party every month. This time it is at the DoubleTree hotel on the southeast corner of Forty-Seventh and Seventh. The party is in room 3905—two rooms actually, both given over to sex from 5:30 pretty much till midnight.

While I was not particularly nervous sexually about what would happen, there was my worsening ADD: the shattering of my self-confidence last year had left me with exactly the kind of uncertainties that Bob prided himself on being able to take care of in the elderly men who came to his parties. Would I arrive with phone and luggage intact? Would I be able to get back with everything I started out with? Would I be able to negotiate my medications, food? Sleep? With ADD wreaking havoc on logic and focus, would I be able to document the trip as I hoped?

About a year ago, Bob brought a car full of guys to have lunch with me out at a mall restaurant in Wynnewood, Pennsylvania, where Dennis, my partner of twenty-seven years, and I were living with my daughter and son-in-law. The guys Bob brought were civilized, seemingly well off, and friendly. One big fellow in jeans and a jean jacket was driving the group back to New York City from somewhere.

One man, John, in a navy pea jacket, remarked on what a nice guy I was. Bob sucked my fingertip at the restaurant table. Nobody else in the restaurant seemed particularly interested in us. Dennis didn’t come that day, I remember, for whatever reason.

I’d met Bob at an academic convention on gay comic art, at the CUNY Graduate Center, where he’d walked up to me, put his arms around me, and begun to kiss me. He was fifty-six and I was seventy-two. He told me that he was really mad over “silver daddy bears.” He was a guy with glasses and a short white beard, who traveled in jeans and plaid shirts, as I did. My beard was notably longer, and white.

Through the rest of the program, he hung out with me even though I had come with three younger friends (Mia Wolff, Ann Matsuchi, and Alex Lozupone); it was the day I met Alison Bechdel, mentioned my part in the formation of “The Bechdel Test,” and met a number of other folks. While Bob verged on the annoying, his brazenly direct sexual come-on was intriguing.

What has always interested me about gay male society is the way it seems to operate differently from what one might call normative heterosexual society.

I learned that Bob ran a group for men such as myself—the Prime Timers: gay and over fifty. (What this had to do with gay comic books, I never really understood; but, well, there was some connection. . . .) For better or worse, however, I felt I could learn something from him. He seemed naturally kind, concerned, and caring.

I’m known as a “sex radical, Afrofuturist, and grand master of science fiction,” but the fact is, I am nowhere near as sexually radical as many, and for all my interest, lots of things have passed me by. I felt there was a world of experience that had been slipping away. I wanted at least to know something about it, to write about it.

by Samuel R. Delany, Boston Review |  Read more:
Image: Samuel R. Delany

Winners and Losers of the Recent Nuclear Holocaust

The nation was recently rocked by retaliatory nuclear blasts that have turned much of America into a barren wasteland, decimating the population, triggering the rise of firestorms and supervolcanoes, and generally bringing civilization to the brink of collapse. Let’s take a look at the political fallout.

Winners
  • Congressional Republicans: Widespread destruction aside, this was a kumbaya moment for a caucus that has had its share of family spats of late. For the first time since coming together to narrowly pass the American Health Care Act in May, Speaker Paul Ryan wonkily persuaded the House GOP’s version of the Hatfields and McCoys — the principled hardliners of the Freedom Caucus on one hand, and the reasonable moderates of the Tuesday Group on the other — to set their bickering aside just long enough to squeak through a resolution in support of President Trump’s plan, tweeted out at 3:29 a.m. on Thursday, to “FRANCE IS LOOKING FOR TROUBLE. Sick country that won’t solve its own problems. Maybe nucluar?” Concerns that a more deliberative Senate would splash cold water on a rare show of Republican unity proved unfounded when Senator Susan Collins (R-ME), the human fulcrum perched stoically at the precise center of American politics, revealed in a nationally televised special that she would vote to authorize nuclear war to balance out the fact that she had recently broken ranks with her party on an agriculture appropriations bill.
  • CNN: As every news producer knows, nothing makes for better theater than war — and nothing makes for better CNN than theater. Right up until the moment when the first blast’s electromagnetic pulse wiped out all of the technology on the eastern seaboard, the cable giant was in fine form, drawing record viewership to a number of its weekday staples. The roiling debate over whether or not to abruptly drop hydrogen bombs on traditional allies proved to be compelling fodder; one particularly juicy squabble between contributors Jeffrey Lord and Lanny Davis will likely go down in history as the second-to-last thing to go viral. Time will tell whether Ari Fleischer’s observation that a nuclear conflict “could be the victory that Donald Trump needs to right the ship of this administration” holds true, but one thing’s for certain — this moment was CNN as it was meant to be: a grand arena where intellectual titans come to match wits and battle it out over issues with no clear answer.
  • Donald Trump: Sure, the verdict may not be in just yet. But when the radioactive dust settles, we could be looking at a game-changing moment for a young presidency. Trump may have ruffled some feathers with less-than-sensitive remarks to the New York Times’ Maggie Haberman that the nuclear holocaust would be “way bigger than the old Holocaust,” but let’s be clear — political correctness has never been this man’s game. For a president with his eye on 2020, an uncertain path to reelection just got a whole lot more manageable, with the threshold for victory in the Electoral College now down from 270 votes to 14. While thermonuclear annihilation may be an inelegant solution, it burnishes the public impression of Trump as a man of action — eccentric, perhaps, but someone who at the end of the day isn’t afraid to get his hands dirty or seek out unorthodox solutions. Those who are still parsing whether the first wave of mortal attacks was justified are asking all the wrong questions. The truth is, it doesn’t matter — this president will be remembered as The Great Disruptor for taking strong and decisive action again and again. Goodbye Armageddon. Hello, Arma-mentum.
Losers
  • Hillary Clinton: The former Secretary of State was spared from the vast and merciless extermination due to scheduled travel. To Wisconsin, you might ask? Of course not. Instead, the one-time Democratic nominee had jetted off to Tanzania to take part in a symposium on empowering women and girls in the world’s fastest-growing economies — an excursion that is sure to raise new questions about her ability to connect with everyday Americans. It’s the same old story: as ever, a politician notorious for being out of touch with regular people goes out of her way to prove it once again, this time by failing to relate to the now-quintessential American experience of being instantaneously vaporized into ash by a 500-kiloton wall of unsparing white light that — unlike some people we know — actually deigns to visit blue collar communities in every state.
by Dan Cluchey, McSweeney’s |  Read more:
Image: uncredited

Saturday, May 13, 2017

The American Obsession with Lawns

Warmer weather in the northern states means more time outside, and more time to garden. While urban gardeners may be planning their container gardens, in the suburbs, homeowners are thinking about their lawns. It’s the time of year when the buzz of landscaping equipment begins to fill the air, and people begin to scrutinize their curb appeal.

The goal—as confirmed by the efforts of Abraham Levitt in his sweeping exercise in conformity (although it had been established well before that)—is to attain a patch of green grass of a singular type with no weeds that is attached to your home. It should be no more than an inch and a half tall, and neatly edged. This means you must be willing to care for it. It must be watered, mowed, repaired, and cultivated. Lawns are expensive—and some regard them as boring in their uniformity—but they are a hallmark of homeownership. Why do Americans place so much importance on lawn maintenance?

In The Great Gatsby, when Nick Carraway rents his house on West Egg, he apparently spends little time on lawn care. The disparity between his patch of greenery and the immaculately manicured grounds of Jay Gatsby's mansion is clear: “We both looked at the grass—there was a sharp line where my ragged lawn ended and the darker, well-kept expanse of his began,” reports Carraway. In preparation for Gatsby’s luncheon with Daisy, Gatsby is so troubled by this difference that he sends his own gardeners to take care of the offensive strip of grass.

This concern is not limited to fiction. The state of a homeowner’s lawn is important in relation to their status within the community and to the status of the community at large. Lawns connect neighbors and neighborhoods; they’re viewed as an indicator of socio-economic character, which translates into property and resale values. Lawns are indicative of success; they are a physical manifestation of the American Dream of home ownership. To have a well-maintained lawn is a sign to others that you have the time and/or the money to support this attraction. It signifies that you care about belonging and want others to see that you are like them. A properly maintained lawn tells others you are a good neighbor. Many homeowner associations have regulations governing how often a lawn must be maintained. So important is this physical representation of a desired status that fines can be levied if the lawn is not maintained. It’s no wonder that Gatsby wanted Carraway’s lawn addressed: it would reflect on him in a variety of ways if it were not.

by Krystal D'Costa, Scientific American |  Read more:
Image: Oliur Rahman, Pexels

Our Mothers as We Never Saw Them

In one of my favorite photographs of my mother, she’s about 18 and very tan, with long, blond hair. It’s the 1970s and she’s wearing a white midriff and cutoffs. My dad is there, too, hugging her from behind, and from the looks of it, they’re somewhere rural — maybe some pastoral patch of small-town New Jersey where they met.

I haven’t seen this photo for years, and I have no idea where it is now, but I still think of it — and, specifically, my mom in it. She looks really sexy; wars have been waged over less impressive waist-to-hip ratios. And she is so young and innocent. She hasn’t yet dropped out of college, or gotten married. The young woman in this photo has no idea that life will bring her five children and five grandchildren, a conversion to Judaism, one divorce, two marriages, a move across the country.

For me, as for many daughters, the time before my mother became a mother is a string of stories, told and retold: the time she got hit by a car and had amnesia; the time she sold her childhood Barbie to buy a ticket to Woodstock; the time she worked as a waitress at Howard Johnson’s, struggling to pay her way through her first year at Rutgers. The old photos of her are even more compelling than the stories because they’re a historical record, carrying the weight of fact, even if the truth there is slippery: the trick of an image, and so much left outside the frame. These photos serve as a visual accompaniment to the myths. Because any story about your mother is part myth, isn’t it?

After finishing my most recent novel, in part about mother-daughter relationships, I put out a call on social media for photos from women of their mothers before they were mothers. A character in the book, a young artist, does something similar, so I’d thought a lot about what the process might be like. I wasn’t prepared, however, for how powerful the images I received would be.

The young women in these pictures are beautiful, fierce, sassy, goofy, cool, sweet — sometimes all at once. I asked contributors to tell me about their moms or the photo submitted, and they often wrote that something specific and special about their present-day mother — her smile, say, or her posture — was present in this earlier version. What solace to know that time, aging and motherhood cannot take away a woman’s essential identity. For daughters who closely resemble their moms, it must be an even bigger comfort; these mothers and daughters are twins, separated by a generation, and an old photo serves as a kind of mirror: How do I look? Even if there isn’t a resemblance, we can’t help but compare ourselves to our young mothers before they were mothers. (...)

The photos women sent me offer a key to how we, as daughters, want to perceive young womanhood. Pluck, sex appeal, power, kindness, persistence: We admire and celebrate these characteristics, and we long for the past versions of our moms to embody them. But if these characteristics are a prerequisite for a properly executed womanhood, does becoming a mother divest a woman of such qualities? In studying these photos, and each daughter’s interpretation of them, I’ve come to wonder what traits we allow our mothers to have, and which ones we view as temporary, expiring with age and the beginning of motherhood. Can a woman be both sexual and maternal, daring and responsible, innocent and wise? Mothers are either held up as paragons of selflessness, or they’re discounted and parodied. We often don’t see them in all their complexity.

by Edan Lepucki, NY Times |  Read more:
Image: Edan Lepucki

Raf Cruz, Working Class Zero, Collage 2015

New Gene Tests Pose a Threat to Insurers

Pat Reilly had good reason to worry about Alzheimer’s disease: Her mother had it, and she saw firsthand the havoc it could wreak on a family, much of it financial.

So Ms. Reilly, 77, a retired social worker in Ann Arbor, Mich., applied for a long-term care insurance policy. Wary of enrolling people at risk for dementia, the insurance company tested her memory three times before issuing the policy.

But Ms. Reilly knew something the insurer did not: She has inherited the ApoE4 gene, which increases the lifetime risk of developing Alzheimer’s. “I decided I’d best get long-term care insurance,” she said.

An estimated 5.5 million people in the United States have Alzheimer’s disease, and these patients constitute half of all nursing home residents. Yet very few people in the United States have been tested for the ApoE4 gene.

But last month, with the approval of the Food and Drug Administration, the gene testing company 23andMe began offering tests that reveal whether people have the variant, as well as assessing their risks for developing such conditions as Parkinson’s and celiac disease.

Other genetics companies are planning to offer similar tests, and soon millions of people will have a better idea what their medical futures might be. Recent research has found that many, like Ms. Reilly, are likely to begin preparing for the worst.

But for companies selling long-term care insurance, these tests could be a disaster, sending risky patients in search of policies even as those with fewer risks shy away, damaging an already fragile business. “There is a question about whether the industry is in a death spiral anyway,” said Robert Hunter, director of insurance at the Consumer Federation of America. “This could make it worse.”

The tests are simple: All people have to do is send away a saliva sample and pay $199. Their disease risks, if they say they want to know them, will be delivered with a report on ancestry and on how their genes influence such traits as flushing when they drink alcohol or having straight hair.

The company will not reveal how many people have received disease-risk data, but it says that in Britain and Canada, where it has offered such testing for several years, about three-quarters of its customers have asked for it. 23andMe has sold its genetic services to more than two million people worldwide since 2007.

The issue for now is with long-term care insurance, not employment and not — at least so far — health insurance.

Under the Genetic Information Nondiscrimination Act, companies cannot ask employees to take gene tests and cannot use any such results in employment decisions; health insurers are not permitted to require gene tests or to use the results in coverage decisions.

But legislation proposed in the House would exempt corporate “wellness” programs from some of these requirements. And the American Health Care Act, passed by the House, would permit states to waive some insurance safeguards regarding pre-existing conditions.

At the moment, companies selling long-term care insurance — unlike medical insurers — are permitted to ask about health status and take future health into consideration when deciding whom to insure and how much to charge.

The 23andMe test results will not appear in people’s medical records, and the company promises not to disclose identifiable findings to third parties. It is up to the customers to reveal them — and the fear for insurers is that many will not.

Two-thirds of nursing home residents are on Medicaid, and the private insurers that remain in the long-term care market are already struggling. In the early 2000s, more than 100 firms offered long-term care insurance, according to the Treasury Department. By the end of 2015, only 12 firms offered it, and new enrollees fell from 171,000 to 104,000.

The insurers charged too little for these policies, experts say; policyholders have turned out to be much sicker than anticipated. To pay for an unanticipated increase in policyholders who develop Alzheimer’s, insurers would have to raise prices, said Don Taylor, a professor of public policy at Duke University who has studied the issue.

Increasing numbers of people at low risk might decide the insurance was not worth the rising price. Even many at high risk would eventually find the policies unaffordable. It is the definition of an insurance death spiral.
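A toy simulation makes that unraveling concrete (this is my own illustrative sketch; the dollar figures are invented, not drawn from the article). Each round, the insurer reprices to cover the pool’s average expected claims; customers whose expected claims fall below the new premium walk away, which pushes the average, and the premium, higher still.

```python
# Toy adverse-selection ("death spiral") loop, with invented numbers.
risk_pool = [200, 400, 600, 800, 5000, 9000]  # expected annual claims ($) per customer
premium = sum(risk_pool) / len(risk_pool)     # break-even premium for the full pool

year = 0
while True:
    year += 1
    # Customers expecting to claim less than the premium drop their coverage.
    stayers = [claims for claims in risk_pool if claims >= premium]
    if len(stayers) == len(risk_pool):
        print(f"Year {year}: pool stabilizes with {len(stayers)} high-risk customer(s).")
        break
    risk_pool = stayers
    premium = sum(risk_pool) / len(risk_pool)
    print(f"Year {year}: {len(risk_pool)} customer(s) left, premium now ${premium:,.0f}")
```

Run as written, the pool shrinks from six customers to the single riskiest one in two rounds, and the break-even premium more than triples along the way.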

by Gina Kolata, NY Times |  Read more:

Ten Year Futures

Now that mobile is maturing and its growth is slowing, everyone in tech turns to thinking about what the Next Big Thing will be. It's easy to say that 'machine learning is the new mobile' (and everyone does), but there are other things going on too.

On one hand, we have a set of profound changes coming as a result of new primary technology. Electric and autonomous cars will change cities, virtual and mixed reality will change the entire computing experience, and machine learning is changing the kind of questions that computers can answer. But each of these is also just beginning, especially relative to their potential - they are at the bottom of the S-Curve, while smartphones are now getting towards the top. On the other hand, I think we can see a set of changes that come not so much from any new technology as from shifts in consumer behaviour and operating economics. These changes are potentially just as big, and might be starting sooner.

Electric and autonomous cars are just beginning - electric is happening now but will take time to grow, and autonomy is 5-10 years away from the first real launches. As they happen, each of these destabilises the car industry, changing what it means to make or own a car, and what it means to drive. Gasoline is half of global oil demand and car accidents kill 1.25m people a year, and each of those could go away. But as I explored here, that's just the start: if autonomy ends accidents, removes parking and transforms what congestion looks like, then we should try to imagine changes to cities on the same scale as those that came with cars themselves. How do cities change if some or all of their parking space is now available for new needs, or dumped on the market, or moved to completely different places? Where are you willing to live if 'access to public transport' is 'anywhere' and there are no traffic jams on your commute? How willing are people to go from their home in a suburb to dinner or a bar in a city centre on a dark cold wet night if they don't have to park and an on-demand ride is the cost of a coffee? And how does law enforcement change when every passing car is watching everything?

Then, virtual reality and mixed reality are also some years away from mass-market adoption. We have some VR products in market today and some very early MR, but for both, it feels as though we are in the 2005-2006 phase of multitouch smartphones - almost, but not yet. Once these really come to market, they may change the world just as much as the iPhone. Mixed reality in particular could change things a great deal, if we all have a pair of glasses that can place something in the world in front of you as though it was really there. Predicting what this could be today reminds me of trying to predict the mobile internet not in 2007 but in 1999 - "stock tips, news headlines and the weather" don’t really capture what has happened since then.

Machine learning is happening right now, and rolls through or perhaps underneath the entire tech industry as a new fundamental computer science capability - and of course enables both mixed reality and autonomous cars. Like, perhaps, relational databases or (in a smaller way) smartphone location, machine learning is a building block that will be part of everything, making many things better and enabling some new and surprising companies and products. I don't think we quite understand what it means to say that computers will be able to read images, video or speech in the way that they've been able to read text and numbers since the 1970s or earlier. But though we are creating machine learning now, again, it's still very early to see all of the implications. It's at the beginning of the S-Curve.

So, we have these hugely important new technologies coming, but not quite here yet. At the same time, though, we have a set of more immediate changes that have much more to do with consumer behaviour, company strategy and economic 'tipping points' than with primary, frontier technology of the kind that Magic Leap or Waymo are building.

by Benedict Evans |  Read more:

Can Prairie Dogs Talk?

Con Slobodchikoff and I approached the mountain meadow slowly, obliquely, softening our footfalls and conversing in whispers. It didn’t make much difference. Once we were within 50 feet of the clearing’s edge, the alarm sounded: short, shrill notes in rapid sequence, like rounds of sonic bullets.

We had just trespassed on a prairie-dog colony. A North American analogue to Africa’s meerkat, the prairie dog is trepidation incarnate. It lives in subterranean societies of neighboring burrows, surfacing to forage during the day and rarely venturing more than a few hundred feet from the center of town. The moment it detects a hawk, coyote, human or any other threat, it cries out to alert the cohort and takes appropriate evasive action. A prairie dog’s voice has about as much acoustic appeal as a chew toy. French explorers called the rodents petits chiens because they thought they sounded like incessantly yippy versions of their pets back home.

On this searing summer morning, Slobodchikoff had taken us to a tract of well-trodden wilderness on the grounds of the Museum of Northern Arizona in Flagstaff. Distressed squeaks flew from the grass, but the vegetation itself remained still; most of the prairie dogs had retreated underground. We continued along a dirt path bisecting the meadow, startling a prairie dog that was peering out of a burrow to our immediate right. It chirped at us a few times, then stared silently.

“Hello,” Slobodchikoff said, stooping a bit. A stout bald man with a scraggly white beard and wine-dark lips, Slobodchikoff speaks with a gentler and more lilting voice than you might expect. “Hi, guy. What do you think? Are we worth calling about? Hmm?”

Slobodchikoff, an emeritus professor of biology at Northern Arizona University, has been analyzing the sounds of prairie dogs for more than 30 years. Not long after he started, he learned that prairie dogs had distinct alarm calls for different predators. Around the same time, separate researchers found that a few other species had similar vocabularies of danger. What Slobodchikoff claimed to discover in the following decades, however, was extraordinary: Beyond identifying the type of predator, prairie-dog calls also specified its size, shape, color and speed; the animals could even combine the structural elements of their calls in novel ways to describe something they had never seen before. No scientist had ever put forward such a thorough guide to the native tongue of a wild species or discovered one so intricate. Prairie-dog communication is so complex, Slobodchikoff says — so expressive and rich in information — that it constitutes nothing less than language.

That would be an audacious claim to make about even the most overtly intelligent species — say, a chimpanzee or a dolphin — let alone some kind of dirt hamster with a brain that barely weighs more than a grape. The majority of linguists and animal-communication experts maintain that language is restricted to a single species: ourselves. Perhaps because it is so ostensibly entwined with thought, with consciousness and our sense of self, language is the last bastion encircling human exceptionalism. To concede that we share language with other species is to finally and fully admit that we are different from other animals only in degree, not in kind. In many people’s minds, language is the “cardinal distinction between man and animal, a sheerly dividing line as abrupt and immovable as a cliff,” as Tom Wolfe argues in his book “The Kingdom of Speech,” published last year.

Slobodchikoff thinks that dividing line is an illusion. To him, the idea that a human might have a two-way conversation with another species, even a humble prairie dog, is not a pretense; it’s an inevitability. And the notion that animals of all kinds routinely engage in sophisticated discourse with one another — that the world’s ecosystems reverberate with elaborate animal idioms just waiting to be translated — is not Doctor Dolittle-inspired nonsense; it is fact. (...)

It did not take long for Slobodchikoff to master the basic vocabulary of Flagstaff’s native prairie dogs. Prairie-dog alarm calls are the vocal equivalent of wartime telegrams: concise, abrupt, stripped to essentials. On a typical research day, Slobodchikoff and three or four graduate students or local volunteers visited one of six prairie-dog colonies they had selected for observation in and around Flagstaff. They usually arrived in the predawn hours, before the creatures emerged from their slumber, and climbed into one of the observation towers they had constructed on the colonies: stilted plywood platforms 10 feet high, covered by tarps or burlap sacks with small openings for microphones and cameras. By waiting, watching and recording, Slobodchikoff soon learned to discriminate between “Hawk!” “Human!” and so on — a talent that he says anyone can develop with practice. And when he mapped out his recordings as sonograms, he could see clear distinctions in wavelength and amplitude among the different calls.
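For readers curious what mapping a recording as a sonogram involves, a few lines of Python sketch the idea (my own minimal illustration, not Slobodchikoff’s actual pipeline; the filename is hypothetical): the audio is converted into a plot of frequency content over time, where calls that sound identical to our ears show visibly different structure.

```python
# Minimal sonogram (spectrogram) sketch; "alarm_call.wav" is a hypothetical file.
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, samples = wavfile.read("alarm_call.wav")  # sample rate (Hz) and raw audio
if samples.ndim > 1:
    samples = samples[:, 0]  # keep one channel if the recording is stereo

freqs, times, power = spectrogram(samples, fs=rate, nperseg=1024)

plt.pcolormesh(times, freqs, power, shading="gouraud")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Sonogram of one alarm call")
plt.show()
```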

He also discovered consistent variations in how prairie dogs use their alarm calls to evade predators. When a human appeared, the first prairie dog to spot the intruder gave a sequence of barks, which sent a majority of clan members scurrying underground. When a hawk swooped into view, one or a few prairie dogs each gave a single bark and any animal in the flight path raced back to the burrow. (Slobodchikoff suspects that, because of a hawk’s speed, there’s little time for a more complex call.) The presence of a coyote inspired a chorus of alarm calls throughout the colony as prairie dogs ran to the lips of their burrows and waited to see what the canine would do next. When confronted with a domestic dog, however, prairie dogs stood upright wherever they were, squeaking and watching, presumably because tame, leashed dogs were generally, though not always, harmless.

Something in Slobodchikoff’s data troubled him, however. There was too much variation in the acoustic structure of alarm calls, much more than would be expected if their only purpose was to distinguish between types of predator. Slobodchikoff arranged for various dogs — a husky, a golden retriever, a Dalmatian and a cocker spaniel — to wander through a prairie-dog colony one at a time. The recorded alarm calls were still highly variable, even though the intruders all belonged to the same predator class. “That led me to think, What if they are actually describing physical features?” Slobodchikoff remembers. What if, instead of barking out nouns, prairie dogs were forming something closer to descriptive phrases?

by Ferris Jabr, NY Times |  Read more:
Image: Ronan Donovan

Friday, May 12, 2017

Down With Chef Worship

Last month, celebrated Danish chef René Redzepi opened up a pop-up restaurant in Tulum, Mexico. This restaurant will only be open for seven weeks. Dinner costs $600 and lasts well past midnight. You will not eat at this restaurant. You will not even come close to ever being able to eat at this restaurant. And yet, critics went anyway, and—SHOCKEROO—they really enjoyed themselves. Redzepi’s fellow chefs also made the pilgrimage, because the high-end food industry is now apparently just a roving club of people waxing poetic over their ability to shamelessly indulge one another.

The amazing thing is how successful these chefs and critics have been in creating a cottage industry wherein consumers like me actively choose to live vicariously through them. I’m not immune to their charms. I love food, which means I also love me some food porn. I watch Parts Unknown. I eyebang the pretty photos in Bon Appétit. I read Pete Wells reviews of four-star joints with jaw-dropping prix fixe tabs. Hell, I was IN a goddamn food show. And I started watching Chef’s Table, the ultra-serious Netflix docu-series that chronicles a selected chef’s life as if they’re a sitting fucking President, and lovingly photographs every dish of theirs as if each one will be made into a permanent installation at the Museum of Modern Art.

Everything on Chef’s Table is filmed in geological slow motion. Certain cooking techniques, like the grinding of corn, are fetishized like nude bodies. It’s a beautifully made show, and I think I made it halfway through my second episode before realizing that I hated it. Take this profile of ramen master Ivan Orkin, for example:


“I’m a ‘go fuck yourself’ kinda guy.” That’s Orkin’s opening salvo. Which, okay. Fine. I’m glad you’re a cartoonishly irascible New Yorker. Can I just have my fucking bowl of soup, please? Not everything I eat needs a story, man.

And yet, that’s exactly what the high-end food scene does now. Food can’t just be food. There has to be a mythology behind it (one new restaurant in Seattle has its own goddamn encyclopedia). More important, there has to be a person behind it—an ICON—preferably some completely obsessive loon who simply won’t rest until he’s foraged for the exact right-sized ramp for your chilled soup course. These chefs end up revered like tech bros, filmed and profiled well past the point of absurdity. (...)

All this chef idolization opens the door for any number of frauds and charlatans to take advantage. It’s an act of faith when you go into a nice restaurant. You have to trust that the chef cares, and that he hired good people under him, and that he didn’t wipe his ass with raw chicken before plating your carnitas. And there are too many fancy restaurants and too many chefs out there now for all of them to be trustworthy. The hard logic dictates that some of them, perhaps many of them, are probably conning you. Some of them have the LOOK down well enough to get away with serving you thoroughly average food. (Take it from someone who’s been lured in by a chalkboard menu plenty of times).

You shouldn’t just buy that every one of these chefs is a genius. Don’t get me wrong, I’m glad that chefs are no longer looked down upon. I’m glad the reputation of the profession has been elevated. But this is a GROSS overcorrection. I don’t need a six-volume biography of some surly noodle guy. (...)

I’m happy for the food revolution. I’m happy that food has become such a vital part of the greater culture. I’m glad everyone has a chance to eat sushi, and bulgogi, and delicious breakfast tacos. So it’s strange to me that so much of the food scene now is centered less on what you can eat than what you cannot, and it prizes many chefs who are bizarrely determined to NOT give everyone a chance to try their best food, whether it’s opening some pop-up restaurant a thousand miles away, or changing the menu every hour because they wanna “push the envelope,” or opening an affordable restaurant that can only seat five and a half people at a time and demands testing the limits of both your patience and comfort. We live in a time of plenty and yet these guys are cranking out FOMO at an almost inhuman pace.

by Drew Magary, Deadspin |  Read more:
Image: YouTube

Have You Ever Had an Intense Experience of Mystical Communion with the Universe, Life, God, etc?

Here are two things to know about architects. First, they are fastidious and inventive with their names. Frank Lincoln Wright was never, unlike Sinatra, a Francis. He swapped in the Lloyd when he was 18—ostensibly in tribute to his mother’s surname on the occasion of her divorce, but also to avoid carrying around the name of a still more famous man, and for that nice three-beat meter, in full anticipation of seeing his full name in print. In 1917, Charles-Edouard Jeanneret-Gris—who is to modern architecture what Freud is to psychoanalysis—was given the byline Le Corbusier (after corbeau, crow) by his editor at a small journal, so that he could anonymously review his own buildings. The success of the sock puppet critic meant that after the critiques were collected into a book-length manifesto, the nom-de-plume eventually took over Jeanneret-Gris’ architect persona, as well. Ludwig Mies—the inventor of the glass-walled skyscraper—inherited an unfortunate surname that doubled as a German epithet for anything lousy or similarly defiled. He restyled himself Miës van der Rohe—vowel-bending heavy-metal umlaut and all—with the Dutch geographical tussenvoegsel “van” from his mother’s maiden name to add a simulation of the German nobiliary particle, von. Ephraim Owen Goldberg became Frank Gehry.

Second, all architects are older than you think. Or than they want you to think. Unlike the closely adjacent fields of music and mathematics, architecture has no prodigies. Design and construction take time. At 40, an architect is just starting out. Dying at 72 in architecture is like dying at 27 in rock and roll. The body of knowledge required is broad and intricate, philosophical and practical, and the training is long. The American Institute of Architects, a self-appointed enforcer of the title, requires five to seven years of school, then years more of closely monitored internships, and a further year of exams to confer the word architect on its dues-payers. This has formalized a longstanding tradition in which, architecture schools being rare (for a long time the Paris Académie des Beaux-Arts and the Massachusetts Institute of Technology were your choices), it wasn’t unusual for would-be architects to lose a decade in a kind of autodidactical twilight among artists and builders, before commencing their practice. Even now, architects start late. And they never stop, working well into their eighties and nineties, or famously in the case of Brazilian midcentury master Oscar Niemeyer, their hundreds. This is why the life story of a very different midcentury master—the Philadelphian Louis Kahn, who got started only around age 50 and died at around 70—is, among other things, a tragedy. A student once asked Kahn why it took so long to become an architect. “Why not,” he answered, “you want to die earlier?”

These matters of name and age reflect the social uncertainty and financial precarity of the profession. Prominent architects, from Palladio to Mies, were sons of stonemasons who jumped up socially thanks to gentleman patrons. The class ambiguity persists to this day; the architecture studios I teach are full of people who are the first in their families to enter any of the professions. Or are the opposite: would-be bohemian artiste children of first-generation professionals who have compromised with their elders. These exchanges of capital and class, style and status, are complicated: ever since the upstart Medici family employed Giorgio Vasari to put up pageants and palaces to substitute for pedigree, the ornamental company of architects—though themselves only tradesmen and servants—has conferred a touch of the very class to which architects also aspire. The slow and resource-rich making of buildings is impossible without the patronage of invested clients. Architecture, like certain kinds of filmmaking, is an art of spending a lot of other people’s money: a successful architect, said the teacher of the single business class my design school obliged me to take, should be the poorest person in any room. Architects, relieved just to build, work for a tiny fractional fee of projects’ construction costs. And, pleased to imagine themselves worldly, they work without managers and agents. The hours are long. The pay is bad. When Kahn died, his firm—slow-rolling chaos held together by a long-suffering Quaker deputy named David Wisdom—owed its creditors $464,423.83. In 1974 dollars.

Wendy Lesser, in her monumental new biography of Kahn, You Say to Brick, chooses to call her subject, for the most part, Lou. As in Reed, Gehrig, and Costello—a name that connotes a kind of nostalgic American working-class heroism. Lesser recounts how Kahn disarmed a colleague, who was inclined to forever call him Professor, by saying, “In the office, everyone calls me Lou.” Wendy understandably follows that lead, but for me, her Lou, Lou, Lou rings like a cowbell: like a life of the prophet Isaiah that calls its subject Izzy. Within the small world of architecture, a self-regarding world that guards its heroes to a fault—a world where Kahn is almost alone in being almost universally revered (even by those who think of his work as a terminally perfected dead end)—the only people who say “Lou” are the dwindling ranks of his former students. But even they all seem to say Lou-Kahn, one word, like lupin or Lacan. To the rest of us he has become that second single syllable, whose long open vowel—sound of submission and satisfaction—echoes in its evocation of preeminence the exotic imperial honorific, Khan.

by Thomas De Monchaux, N+1 |  Read more:
Image: Gunnar Klack

Where Have All the Insects Gone?
Image: Paul Van Hoof/Minden Pictures

Politics 101

‘He Doesn’t Give a Crap Who He Fires’

Totally unprecedented. Totally unsurprising.

When President Donald Trump on Tuesday fired FBI Director James Comey, people steeped in presidential and legal history sounded alarms. “No president has ever dismissed an FBI director under such circumstances,” bestselling author Jon Meacham said on Twitter. “It’s a constitutional crisis,” David Cole of the Georgetown University Law Center wrote. “This is the kind of thing that goes on in non-democracies,” Jeffrey Toobin said on CNN.

People who know Trump didn’t disagree, but they also responded with a combination of relative resignation and seen-it-all-before shoulder shrugs.

“Outrageous,” former Trump Organization vice president Barbara Res said when I reached her at her home shortly after the news of the firing broke.

But was she shocked? “No,” she said.

“This is an act of insanity,” a former Trump inner-circle associate told me, “but it’s how he functions.”

“Completely consistent, yes,” with his pattern of behavior, a onetime Trump political aide added.

The Trump signature here is far more than the familiar seismic John Hancock he jammed into the White House stationery with his customary dark, thick-tipped, heavy-handed pen. Trump as president has been the man and manager he’s always been. He’s practically never worked for anybody but himself, he’s always cultivated a workspace marked by competition and chaos, and the only difference between how he operated on the 26th floor of Trump Tower and how he’s steering the West Wing of the White House is the stakes.

A strategically incoherent, predictably unpredictable, private-sector lord who ran his family business by doing what he wanted when he wanted and with limited consideration for consequences stretching beyond his own immediate interests and gratification, Trump has spent the first not quite four months of his presidency running headlong into the constitutional checks and balances of American democracy. The system of safeguards against dictatorial intemperance has flummoxed him. Where there has been objective failure, Trump as usual has proclaimed historic success.

In instances, though, in which executive power is sufficient for actual action, he has been nobody but his imperious, impetuous, spiteful self. And here, according to the reporting of POLITICO and other news organizations, Trump made a fraught, monumental, republic-rattling decision the way he’s always made decisions—quickly—and for the same central reasons—vengeance and self-interest. Comey wasn’t the first person he fired—he canned National Security Adviser Michael Flynn and acting Attorney General Sally Yates—but this sacking in many ways was Trump’s quintessential act as the country’s chief executive.

Enraged by the FBI’s ongoing investigation into his and his campaign’s Russian ties, Trump had pliant Justice Department deputies outline Hillary Clinton-related reasons to fire Comey—reasons that seemed contrived given Trump’s praise for Comey on the campaign trail, the kiss he blew in Comey’s direction this past January and his statement of support just last month. On Tuesday, however, Trump wrote a short letter of his own in which the second paragraph in particular pulsed with telltale Trump, down to the self-serving and factually unsupported statement replete with clumsily inserted commas. “While I greatly appreciate you informing me, on three separate occasions, that I am not under investigation …”

And then he didn’t do the actual firing himself, which has been a feature of his management style through the years. He didn’t call Comey. He had Keith Schiller, his longtime bodyguard who now has the title of Director of Oval Office Operations, take his letter to Comey’s office. Famous for saying “You’re fired” on TV, Trump never has relished firing people in real life.

“I think the shamelessness of it is stunning, and stunningly consistent,” Trump biographer Gwenda Blair said. “He’s thinking 24-7 how to move the chess pieces around to benefit himself.”

“Listen,” said Jack O’Donnell, a former high-ranking Trump casino executive in Atlantic City, “he’s always been very protective of himself, first and foremost. In that regard, this, I believe, is consistent—because he’s certainly trying to protect himself.”

“This is who he is,” said Artie Nusbaum, one of the top bosses at the construction firm that built Trump Tower. “No morals, no nothing. He does what he does.”

People who work or have worked with Trump have been saying this forever.

by Michael Kruse, Politico | Read more:
[ed. Sorry for the lack of posts lately. The dysfunction, lying, and finger-pointing coming out of Washington is sucking the air out of everything interesting. See also: The Real Obama]

Thursday, May 11, 2017

NYU Accidentally Exposed Military Code-Breaking Computer Project to the Entire Internet

In early December 2016, Adam was doing what he’s always doing, somewhere between hobby and profession: looking for things that are on the internet that shouldn’t be. That week, he came across a server inside New York University’s famed Institute for Mathematics and Advanced Supercomputing, headed by the brilliant Chudnovsky brothers, David and Gregory. The server appeared to be an internet-connected backup drive. But instead of being filled with family photos and spreadsheets, this drive held confidential information on an advanced code-breaking machine that had never before been described in public. Dozens of documents spanning hundreds of pages detailed the project, a joint supercomputing initiative administered by NYU, the Department of Defense, and IBM. And they were available for the entire world to download.

The supercomputer described in the trove, “WindsorGreen,” was a system designed to excel at the sort of complex mathematics that underlies encryption, the technology that keeps data private, and almost certainly intended for use by the Defense Department’s signals intelligence wing, the National Security Agency. WindsorGreen was the successor to another password-cracking machine used by the NSA, “WindsorBlue,” which was also documented in the material leaked from NYU and which had been previously described in the Norwegian press thanks to a document provided by National Security Agency whistleblower Edward Snowden. Both systems were intended for use by the Pentagon and a select few other Western governments, including Canada and Norway.

Adam, an American digital security researcher, requested that his real name not be published out of fear of losing his day job. Although he deals constantly with digital carelessness, Adam was nonetheless stunned by what NYU had made available to the world. “The fact that this software, these spec sheets, and all the manuals to go with it were sitting out in the open for anyone to copy is just simply mind blowing,” he said.

He described to The Intercept how easy it would have been for someone to obtain the material, which was marked with warnings like “DISTRIBUTION LIMITED TO U.S. GOVERNMENT AGENCIES ONLY,” “REQUESTS FOR THIS DOCUMENT MUST BE REFERRED TO AND APPROVED BY THE DOD,” and “IBM Confidential.” At the time of his discovery, Adam wrote to me in an email:
All of this leaky data is courtesy of what I can only assume are misconfigurations in the IMAS (Institute for Mathematics and Advanced Supercomputing) department at NYU. Not even a single username or password separates these files from the public internet right now. It’s absolute insanity.
The files were taken down after Adam notified NYU.

Intelligence agencies like the NSA hide code-breaking advances like WindsorGreen because their disclosure might accelerate what has become a cryptographic arms race. Encrypting information on a computer used to be a dark art shared between militaries and mathematicians. But advances in cryptography, and rapidly swelling interest in privacy in the wake of Snowden, have helped make encryption tech an effortless, everyday commodity for consumers. Web connections are increasingly shielded using the HTTPS protocol, end-to-end encryption has come to popular chat platforms like WhatsApp, and secure phone calls can now be enabled simply by downloading some software to your device. The average person viewing their checking account online or chatting on iMessage might not realize the mathematical complexity that’s gone into making eavesdropping impractical.

The spread of encryption is a good thing — unless you’re the one trying to eavesdrop. Spy shops like the NSA can sometimes thwart encryption by going around it, finding flaws in the way programmers build their apps or taking advantage of improperly configured devices. When that fails, they may try to deduce encryption keys through extraordinarily complex math or repeated guessing. This is where specialized systems like WindsorGreen can give the NSA an edge, particularly when the agency’s targets aren’t aware of just how much code-breaking computing power they’re up against.
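Some rough arithmetic shows why that edge matters, and why purpose-built hardware is worth building at all (my own back-of-the-envelope sketch; the guess rates are illustrative assumptions, not figures from the leaked documents):

```python
# Expected time to brute-force a symmetric key, at assumed guessing rates.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_crack(key_bits: int, guesses_per_second: float) -> float:
    """Expected years to find a key by guessing (half the keyspace on average)."""
    return (2 ** key_bits / 2) / guesses_per_second / SECONDS_PER_YEAR

rigs = [("commodity GPU rig", 1e10), ("hypothetical cracking supercomputer", 1e15)]
for label, rate in rigs:
    for bits in (56, 80, 128):
        print(f"{label}: {bits}-bit key ~ {years_to_crack(bits, rate):.1e} years")
```

At the assumed rates, a 1970s-era 56-bit key falls in weeks on commodity hardware, while a modern 128-bit key would still take on the order of 10^15 years even on the faster machine. That is why agencies prefer to attack flawed implementations, or targets that underestimate the computing power arrayed against them, rather than the mathematics itself.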

Adam declined to comment on the specifics of any conversations he might have had with the Department of Defense or IBM. He added that NYU, at the very least, expressed its gratitude to him for notifying it of the leak by mailing him a poster. (...)

The documents, replete with intricate processor diagrams, lengthy mathematical proofs, and other exhaustive technical schematics, are dated from 2005 to 2012, when WindsorGreen appears to have been in development. Some documents are clearly marked as drafts, with notes that they were to be reviewed again in 2013. Project progress estimates suggest the computer wouldn’t have been ready for use until 2014 at the earliest. All of the documents appear to be proprietary to IBM and not classified by any government agency, although some are stamped with the aforementioned warnings restricting distribution to within the U.S. government. According to one WindsorGreen document, work on the project was restricted to American citizens, with some positions requiring a top-secret security clearance — which as Adam explains, makes the NYU hard drive an even greater blunder:
Let’s, just for hypotheticals, say that China found the same exposed NYU lab server that I did and downloaded all the stuff I downloaded. That simple act alone, to a large degree, negates a humongous competitive advantage we thought the U.S. had over other countries when it comes to supercomputing.
The only tool Adam used to find the NYU trove was Shodan.io, a website that’s roughly equivalent to Google for internet-connected, and typically unsecured, computers and appliances around the world, famous for turning up everything from baby monitors to farming equipment. Shodan has plenty of constructive technical uses but also serves as a constant reminder that we really ought to stop plugging things into the internet that have no business being there. (...)
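For the curious, the Shodan workflow is only a few lines. Here is a sketch using the official shodan Python package; the API key is a placeholder, and the query, which looks for exposed rsync backup services of the kind Adam stumbled onto, is my own hypothetical example:

```python
# Querying Shodan for exposed services; requires `pip install shodan`
# and a registered API key. The query string is a hypothetical example.
import shodan

api = shodan.Shodan("YOUR_API_KEY")
results = api.search('port:873 "rsync"')  # unauthenticated backup daemons

print(f"{results['total']} exposed hosts found")
for match in results["matches"][:5]:
    print(match["ip_str"], match.get("org", "unknown org"))
```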

Çetin Kaya Koç is the director of the Koç Lab at the University of California, Santa Barbara, which conducts cryptographic research. Koç reviewed the Windsor documents and told The Intercept that he has “not seen anything like [WindsorGreen],” and that “it is beyond what is commercially or academically available.” He added that outside of computational biology applications like complex gene sequencing (which it’s probably safe to say the NSA is not involved in), the only other purpose for such a machine would be code-breaking: “Probably no other problem deserves this much attention to design an expensive computer like this.”

by Sam Biddle, The Intercept |  Read more:
Image: Thinkstock via NSF

Wednesday, May 10, 2017

The Rise and (Maybe) Fall of Influencers

I never really expected to write these words, but: It was Kendall Jenner who did it for me.

Or, to be fair, not Kendall Jenner herself — or not entirely Kendall Jenner — but rather the 10th-anniversary Indian Vogue cover that featured Kendall Jenner, as conceived and photographed by Mario Testino. When released last week on the magazine’s social media accounts, it almost immediately became the center of a storm of social media ire, most of it along the lines of this: “Disgustingly inappropriate. Were ALL the Indian women unavailable??”

Which was then followed, as these things so often are, by a host of headlines like CNN.com’s “Vogue India Cover Lands Kendall Jenner in More Trouble.”

But here’s the thing: Was it really Ms. Jenner’s fault? Was she in any way culpable for this bad choice? After all, she was being used as work for hire: a body and a face to sell a magazine.

Or was she?

Therein lies the problem. Because the obvious assumption — the one made by all the worked-up folks in the Twittersphere — was that she had been employed not just as a clothes hanger (as models used to be known), but, at least in part, as herself: a public figure with an immediately recognizable name and face and family back story, along with approximately 80.3 million Instagram followers. An — and I cringe at the word, but it is in the Cambridge English Dictionary — Influencer. And that influence was part of what Indian Vogue was paying for by paying her to be on its cover.

It’s the same thing that the Fyre Festival, the famously failed music festival in the Bahamas, was paying for when it paid Ms. Jenner, along with Bella Hadid (12.7 million Instagram followers), Hailey Baldwin (10 million) and Emily Ratajkowski (12.8 million), to drum up excitement via promotional posts with said women cavorting in bikinis on a beach. It’s what, to a certain extent, Pepsi was paying for when it hired Ms. Jenner (sense a theme here?) and put her in an ill-conceived ad in which she uses a soda to soften up a police officer at a riot. It’s what Vogue Arabia was paying for when it put an only-semi-veiled Gigi Hadid on the cover of its first issue.

Whether it is obviously an ad, whether a Federal Trade Commission-required hashtag admission goes with it or not (and the F.T.C. is increasingly cracking down on influencer posts, recently writing to 45 celebrities to warn them about necessary disclosures), there is, as Lucie Greene, the worldwide director of the Innovation Group at J. Walter Thompson, said, “an implied individual choice.”

And that means that those involved are perceived as having a personal — not merely professional — relationship with the thing they are selling. Which in turn means they bear some responsibility for it. There is a downside to the upside of being an influencer.

Sometimes it’s literal: The Fyre Festival is now facing a class-action lawsuit in which the defendants include not only the organizers, but also a number of “Jane Does” who helped promote the festival.

Sometimes it’s reputational: After Selena Gomez, the proud possessor of the most Instagram followers badge (she had 120 million as of Wednesday), signed on to represent Coach — after having been an ambassador for Louis Vuitton, a brand with a very different aesthetic, one nonfan wrote: “Selena Gomez, the previous face of Coca Cola, Louis Vuitton, Verizon, and the current face of Pantene and Coach … a joke to the industry???”

Either way, it’s real. As with all slippery slopes, it’s easy to hop on but also easy to end up in a heap at the bottom. Which raises the possibility that we are on the verge of a new (hopefully more considered) age in the evolution of Influencer culture.

by Vanessa Friedman, NY Times |  Read more:
Image: Vogue, Mario Testino

An Attack on American Democracy

At a time like this, it is important to express things plainly. On Tuesday evening, Donald Trump acted like a despot. Without warning or provocation, he summarily fired the independent-minded director of the F.B.I., James Comey. Comey had been overseeing an investigation into whether there was any collusion between Trump’s Presidential campaign and the government of Russia. With Comey out of the way, Trump can now pick his own man (or woman) to run the Bureau, and this person will have the authority to close down that investigation.

That is what has happened. It amounts to a premeditated and terrifying attack on the American system of government. Quite possibly, it will usher in a constitutional crisis. Even if it doesn’t, it represents the most unnerving turn yet in what is a uniquely unnerving Presidency.

Things like this are not supposed to happen in a liberal democracy, especially in one that takes pride, as the United States does, in safeguards put in place against the arbitrary exercise of power. The F.B.I. is meant to be an independent agency, above and beyond partisan politics and personal grudges. (That is why its directors are appointed for ten-year terms.) The President is supposed to respect this independence, especially when it comes to matters in which he has, or could have, a personal interest.

There is little in American history that compares to, or justifies, what Trump has now done. In recent times, the only possible precedent is the Saturday Night Massacre, of October 20, 1973, when Richard Nixon fired the special prosecutor investigating Watergate, Archibald Cox. Arguably, Trump’s Tuesday Afternoon Massacre was even more disturbing. In 1973, the two top law-enforcement officials in the land—the Attorney General, Elliot Richardson, and his deputy, William Ruckelshaus—refused to carry out Nixon’s dictatorial order to terminate Cox. It was left to the wretched Robert Bork, who was then the Solicitor General, to do the deed.

In contrast, Trump’s Attorney General, Jeff Sessions, was a central figure in the ouster of Comey. In March, Sessions—a close political ally of Trump’s—was forced to recuse himself from the Russia investigation after it emerged that he had failed to disclose meetings with Sergey Kislyak, the Russian Ambassador to Washington. But this recusal didn’t prevent Sessions from pushing for Comey’s dismissal. In its public statement announcing the firing, the White House said, “President Trump acted on the clear recommendations of both Deputy Attorney General Rod Rosenstein and Attorney General Jeff Sessions.”

Of course, the ultimate responsibility lies with Trump. In a brief letter to Comey, which the White House also released, he said, “While I greatly appreciate you informing me, on three separate occasions, that I am not under investigation, I nevertheless concur with the judgment of the Department of Justice that you are not able to effectively lead the Bureau. It is essential that we find new leadership for the FBI that restores public trust and confidence in its vital law enforcement mission.”

As Trump has amply demonstrated in the past, hardly anything he says can be taken at face value, and everything in his letter should be treated skeptically, especially his claims about what Comey told him. What we know for sure is that Comey, in his March 20th testimony on Capitol Hill, confirmed that the F.B.I. was conducting a criminal investigation into “any links between individuals associated with the Trump campaign and the Russian government and whether there was any coördination between the campaign and Russia’s efforts.” Although Comey refused to go into much detail about the investigation, he confirmed that it had been going on since last July, and he gave the distinct impression that, wherever it led, it would be pursued with vigor.

We also know that Comey issued a blunt public dismissal of Trump’s claims on Twitter that Barack Obama ordered U.S. spy agencies to wiretap Trump Tower during the Presidential campaign. “I have no information that supports those tweets, and we have looked carefully inside the F.B.I.,” Comey said, during his testimony.

This, surely, is the relevant context of Comey’s dismissal. By contrast, the two other documents that the White House released on Tuesday to justify Trump’s action—a letter from Sessions to the President, and a three-page memorandum from Rosenstein to Sessions—smacked of a desperate and unconvincing effort to cook up a pretext.

In his memorandum, Rosenstein, who hitherto had a reputation as an independent official, took issue with Comey’s handling of the investigation into Hillary Clinton’s use of a private e-mail server. He focussed in particular on the July 5, 2016, press conference at which Comey announced that the Bureau had closed its investigation without recommending any charges, while at the same time criticizing how Clinton and her aides had handled classified information. This, Rosenstein said, was “a textbook example of what federal prosecutors and agents are taught not to do.” He also brought up Comey’s subsequent announcement, on October 28, 2016, eleven days before the election, that the F.B.I. was reopening the Clinton case because of the discovery of thousands of e-mails on Anthony Weiner’s laptop. Rosenstein called the announcement a departure from the agency’s tradition of avoiding public comment during an election season.

Many observers would agree with at least some of Rosenstein’s points about the Clinton investigation—but so what? Are we seriously being asked to countenance the idea that Trump fired Comey because he didn’t treat Hillary Clinton fairly? The same Trump who seized upon Comey’s press conference last July and used it to buttress his claims that Clinton should be jailed? The same Trump who, on October 31st, said, “It took guts for Director Comey to make the move that he made in light of the kind of opposition he had”?

Until the White House comes up with a less ludicrous rationalization for its actions, we can only assume that Trump fired Comey because the Russia investigation is closing in on him and his associates, and he knew that he didn’t have much sway over the F.B.I. director. That is the simplest theory that fits the facts. And it is a cause for great alarm.

Ever since Trump took office, many people have worried about his commitment to democratic norms, the Constitution, and the rule of law. From the hasty promulgation of his anti-Muslim travel ban onward, he has done little to salve these concerns. Now he has acted like one of the authoritarian leaders he so admires—a Putin, an Erdoğan, or an El-Sisi.

by John Cassidy, New Yorker | Read more:
Image: Mark Peterson/Redux
[ed. The question is, what will our Republican-led Congress do? Care to guess?]

Johnny Depp: A Star in Crisis and the Insane Story of His "Missing" Millions

Early one afternoon in October 2012, Jake Bloom and Joel Mandel left their respective Beverly Hills offices, slipped into their luxury cars and embarked on the roughly 30-minute journey to the Hollywood Hills compound of their client, Johnny Depp. Bloom was a rumpled and graying lawyer whose disheveled style camouflaged an intellect exercised on behalf of such luminaries as Martin Scorsese and Sylvester Stallone. Mandel, then in his early 50s, was a tall, rather amiable accountant who favored loose-fitting jeans and looser-fitting shirts, sartorial code designed to assure his clients he was just another boy in their band as well as a top-flight business manager steeped in the arcana of arbitrage and amortization.

Both men had been close to Depp for years. Bloom, indeed, was such a confidant to the actor that he had even joined him for an induction ceremony into the Comanche nation when he played Tonto in The Lone Ranger; as for Mandel, he had accompanied Depp to his three-island property in the Bahamas, atolls Mandel had helped his client buy for a total of $5.35 million.

These men were part of Depp's inner circle, at least as far as any lawyer or accountant could belong to the inner circle of an artist this mercurial, one with a skull-and-crossbones tattoo on his leg and "Death is certain" scrawled beneath it, whose soul mates were such creative titans as Marlon Brando, Keith Richards and Hunter S. Thompson — the journalist whose ashes Depp fired from a cannon hauled to the top of a 153-foot tower, a tribute for which the actor says he paid $5 million.

Leaving their cars that day, the advisers approached one of Depp's five houses on a dead-end stretch of North Sweetzer Avenue. A modernist affair that was simply referred to as 1480, the building had been converted into a recording studio and was an appendage to an eight-bedroom, castle-like mansion once owned by music producer Berry Gordy. One of the star's two omnipresent assistants led the men in, past a painting that British artist Banksy had created for Depp, and into a den, where the actor was leaning back in a slightly battered chair, surrounded by dozens upon dozens of classic guitars.

After the obligatory small talk, the visitors got to the point: Depp's cash flow had reached a crisis point, they declared. Even though the star had become wildly wealthy (later, Mandel would claim Depp earned more than $650 million in the 13-plus years he had been represented by The Management Group, the company Mandel had started in 1987 with his brother Robert), there just wasn't enough liquid money to cover Depp's $2 million in monthly bills.

Without a fire sale, Depp — then arguably the biggest star in Hollywood and certainly one of the best paid, thanks to the Pirates of the Caribbean franchise — would never be able to meet his obligations. Not the payments on his portfolio of real estate around the world. Not the impulse purchases such as the three Leonor Fini paintings he had bought from a Manhattan gallery (the first two for $320,000, the third as a $245,000 gift for then-girlfriend Amber Heard). Not the $3.6 million he paid annually for his 40-person staff. Not the $350,000 he laid out each month to maintain his 156-foot yacht. And not the hundreds of thousands of dollars he paid to sustain his ex-partner, Vanessa Paradis, and their children, Lily-Rose and Jack.

by Stephen Galloway, Ashley Cullins, Hollywood Reporter | Read more:
Image: uncredited via: