Tuesday, April 30, 2019

Offshore Drilling in the Arctic and Atlantic on Ice (for Now)

Trump administration puts offshore drilling expansion in Arctic, Atlantic on ice (ArsTechnica).

The Coastal Zone Management Act of 1972 is an Act of Congress that encourages coastal states to develop and implement coastal zone management plans (CZMPs). The act established a national policy to preserve, protect, develop, and, where possible, restore or enhance the resources of the nation's coastal zone for this and succeeding generations.

The Coastal Zone Management Act (CZMA) of 1972 showed that the United States Congress “recognized the importance of meeting the challenge of continued growth in the coastal zone”. Under this act, two national programs were created: the National Coastal Zone Management Program (CZMP) and the National Estuarine Research Reserve System. Of the 35 eligible states, 34 have established management programs; Washington was the first to adopt a program, in 1976.

The Coastal Zone Management Program (CZMP), also called the National Coastal Zone Management Program, was established under the Coastal Zone Management Act of 1972 and is administered by NOAA’s Office for Coastal Management (OCM). The program is designed to provide a basis for protecting, restoring, and responsibly preserving and developing the nation’s coastal communities and resources, where pressures on them are greatest. The vision of the CZMP is to ensure that “the nation’s coast and oceans, including the Great Lakes and island territories, are healthy and thriving for this and future generations”. Its mission is “to ensure the conservation and responsible use of our nation’s coastal and ocean resources”.

The key goals of the National CZM program include “protecting natural resources, managing development in high hazard areas, giving development priority to coastal-dependent uses, providing public access for recreation, coordinating state and federal actions”. Ultimately, the CZMP aims to produce “healthy and productive coastal ecosystems” and “environmentally, economically, and socially vibrant and resilient coastal communities”.

The National Estuarine Research Reserve System (NERRS) is the second program established by the Coastal Zone Management Act of 1972 and is also administered by the National Oceanic and Atmospheric Administration (NOAA). NERRS is a network of 28 protected areas in coastal states across the nation, spanning more than one million acres. These areas are used for long-term research, water-quality monitoring, education, and coastal stewardship.

by Wikipedia |  Read more:
Image: Orjan F. Ellingvag/Corbis via Getty Images
[ed. For a good example of how dumb and short-sighted politicians (and the electorate in general) are in Alaska, look no further than the Coastal Zone Management Act. The state had a first-rate Coastal Management Program for over 30 years before it withdrew from the Act in 2011 because of extensive lobbying by extractive resource industries and perceptions that locally affected communities had too much influence in the decision-making process (horrors). So now Alaska is the only eligible state in the US (out of 35) without a Coastal Management Program, greatly diminishing its ability to influence and condition any federal activity that occurs along its shorelines. A short history of how and why this came about can be found here: Why did Alaska eliminate the Alaska Coastal Management Program? (pdf).]

Robert Crumb - “Une Brève histoire de l'Amérique” (A Short History of America), poster (1997)
via:

Monday, April 29, 2019

The High Life

Willie Nelson is sitting on a couch at his home, a modest cabin that overlooks his 700 acres of gorgeous Texas Hill Country, when he reaches into his sweatshirt and produces a small, square vaporizer, takes a hit and exhales slowly. “Wanna puff?” he asks.

Nelson’s wife, Annie, setting down a cup of coffee on a DVD case working as a coaster in front of him, speaks up. “Careful with that, babe,” she says. “You have to sing tonight.”

Nelson nods and puts it away. He turns 86 this spring and has a history of emphysema, so Annie, who’s been with Willie for 33 years, tries to get him to look out for his lungs, especially on show days. This can be a problem. “He’s super-generous,” she says, “and if there’s somebody around, he’ll want to offer it and do it with them to make them feel comfortable.”

Nelson says he stays high “pretty much all the time.” (“At least I wait 10 minutes in the morning,” Keith Richards has said.) His routine, Annie says, is to “take a couple of hits off the vape and then, an hour or two later, he might want a piece of chocolate. That keeps it going. So not a ton [of pot] . . . but he is Willie Nelson.” Annie recently bought Nelson an expensive version of a gravity bong — a fixture of high school house parties, which can shoot an entire bowl of weed into your lungs in one hit. “You can use ice water, which helps cool it off,” Annie says. “And no paper really helps.”

In addition to being the world’s most legendary country artist, Willie Nelson might also be the world’s most legendary stoner. Before Snoop or Cheech and Chong or Woody Harrelson, there was Willie. He has been jailed for weed, and made into a punchline for weed. But look at him now: Still playing 100 shows a year, still writing songs, still curious about the world. “I’m kind of the canary in the mine, if people are wondering what happens if you smoke that shit a long time,” he says. “You know, if I start jerking or shaking or something, don’t give me no more weed. But as long as I’m all right . . .”

Years before weed became legal, he spoke about the medical benefits and economic potential of weed if it were taxed and the profits were put toward education. “It’s nice to watch it being accepted — knowing you were right all the time about it: that it was not a killer drug,” says Nelson. “It’s a medicine.” (...)

Sitting with Nelson, you get used to long silences. “Oh, pickin’ a little,” he says when asked about what he’s been up to. He also just finished an album, Ride Me Back Home. The first song is about the 60 horses on his property, which Nelson bought at auction and saved from slaughterhouses. Nelson had shown me some of the horses when I visited five years ago. “Billy Boy is still here,” he says. “We lost Roll Em Up Jack. Wilhelmena the mule is gone. Uh, rattlesnake got her. Babe, you got any of that CBD coffee?”

Nelson is talking about Willie’s Remedy, the coffee that is sold by his cannabis company, Willie’s Reserve. The idea for a weed business started a few years ago; Nelson had bronchitis and he couldn’t smoke, so Annie started making him weed chocolates. The recipe took some perfecting — Nelson kept eating too many and getting too high, so she had to lower the dosages to five milligrams. She lent some to a friend, and big business came knocking. They were skeptical — “We don’t want to become the Wal-Mart of cannabis,” says Annie, who headed the negotiations. They wanted to keep in line with Nelson’s Farm Aid organization, supporting family farmers. Willie’s Reserve is now available in six states, and it’s proving “fairly lucrative,” Nelson says. It hasn’t been easy — since the drug is still federally prohibited, “the regulations change like chameleons,” says Annie. “The edibles are actually harder [to produce legally] than the flower. You have to have specific kitchens. You have to have specific licenses that take years to get.”

Nelson’s official title is “CTO: chief tasting officer.” The company even had business cards made up. He explains: “If I find something that’s really good, I say, ‘This is really good.’ ” Despite 65 years of pot use, Nelson is not a connoisseur; he shrugs when asked for his favorite Willie’s Reserve strains. His famous stash, he says — the weed that he used to keep in a Hopalong Cassidy lunchbox on his bus — is a bunch of random kinds that have been given to him by fans or thrown onto the stage. Willie’s Reserve VP Elizabeth Hogan has been trying for years to figure out what kind of weed Nelson likes. The response, Hogan says, is usually “ ‘I claim ’em all’ ” or “ ‘Pot’s like sex — it’s all good, some is great.’ ” (“He’s kind of a sativa dude,” says Annie. “He’s already funny, so it just makes him funnier.”)

Pot has been Nelson’s exclusive drug of choice since around 1978, when he gave up cigarettes and whiskey. He’d had pneumonia four times, and his hangovers had gotten nasty. Plus, he could be a mean drunk. “I had a pack of 20 Chesterfields, and I threw ’em all away and rolled up 20 fat joints, stuck ’em in there,” he says. “I haven’t smoked a cigarette since. I haven’t drank that much either, because one will make me want the other — I smoke a cigarette, I wanna drink a whiskey. Drink a whiskey, want a cigarette. That’s me. I can’t speak for nobody else.”

He has no doubt where he’d be without pot: “I wouldn’t be alive. It saved my life, really. I wouldn’t have lived 85 years if I’d have kept drinking and smoking like I was when I was 30, 40 years old. I think that weed kept me from wanting to kill people. And probably kept a lot of people from wanting to kill me, too — out there drunk, running around.”

Nelson uses the phrase “delete and fast-forward” a lot. It’s the title of a recent song of his, and it means forgive, forget and move on — a way to get through painful times. Weed, he says, helps him delete and fast-forward. “You don’t dwell on shit a lot. The short-term thing they talk about is probably true, but it’s probably good for you.”

by Patrick Doyle, Rolling Stone |  Read more:
Image: James Minchin III for Rolling Stone

The Antibiotics Industry is Broken—But There’s a Fix

Last week, the biotech company Achaogen announced that it was filing for bankruptcy. That might not seem like much news: businesses crash and burn all the time. But Achaogen, founded in 2002, was an antibiotics company. Its first drug, Zemdri (plazomicin), was approved by the Food and Drug Administration last June.

The world is running out of useful antibiotics because the rise of antibiotic resistance in bacteria is undermining them, and big firms are disinclined to make more. In 2018 alone, three large legacy pharma firms closed their antibiotic research programs. So the collapse of even a small business that stepped up to make a new antibiotic is a blow.

Achaogen hit all the marks that should have signaled success. It recruited experienced developers, targeted an infection that the World Health Organization considers a critical unmet need, stuck with its compound through 15 years of testing, scored several rounds of public investment and private philanthropy, and got its drug approved. Yet the market didn’t reward the company for producing a new antibiotic: on the day the FDA announced its decision, its stock price actually dropped by 20 percent. Almost a year later, it has earned less than $1 million on the drug, not enough to stay alive.

The larger story of the Achaogen bankruptcy is that the financial structures that sustained antibiotic development for decades are broken. If we want new antibiotics, we’re going to have to find new ways to pay for them. And that will involve hard choices with big dollar signs attached.

Successful drug development relies on an extremely simple assumption: if you spend the industry-standard amounts of time and money to develop a new drug—generally accepted to be 10 to 15 years and at least $1 billion—you will end up with a product to which you can assign a high enough price, or sell in enough volume, to earn back that R&D budget, reward investors, and turn a profit.

That math works for most of the products of the pharmaceutical industry, from old drugs that people take every day—antidepressants, beta-blockers, statins—to the newest cancer therapies, known as CAR-T, which can cost almost $500,000 per dose. But antibiotics don’t fit that equation. Unlike cancer drugs, most antibiotics are inexpensive; the few with high price tags are reserved for rare hospital use. And unlike drugs to treat chronic diseases, people take antibiotics for only short periods of time.
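[ed. To make that mismatch concrete, here is a minimal back-of-the-envelope sketch in Python. Every input (the prices, volumes, and the 50 percent gross margin) is a hypothetical illustration of the article's argument, not data from the piece:]

```python
# Back-of-the-envelope: how many years of sales it takes to recoup an assumed
# $1 billion R&D outlay. All inputs are hypothetical, for illustration only.

R_AND_D_COST = 1_000_000_000  # assumed development cost, in dollars


def years_to_break_even(price_per_course: float,
                        courses_per_year: float,
                        gross_margin: float = 0.5) -> float:
    """Years of sales needed for gross profit to cover the R&D cost."""
    annual_profit = price_per_course * courses_per_year * gross_margin
    return R_AND_D_COST / annual_profit


# A new hospital antibiotic: modest price, deliberately held in reserve.
print(years_to_break_even(price_per_course=4_000, courses_per_year=10_000))   # 50.0 years

# A chronic-disease drug: cheaper per patient-year, but taken by millions, indefinitely.
print(years_to_break_even(price_per_course=500, courses_per_year=2_000_000))  # 2.0 years
```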

There’s another way in which antibiotics are unlike all other categories of drugs. A daily dose of Lipitor causes the world no harm—but every dose of antibiotics poses a risk of encouraging bacteria to adapt and develop resistance. So these new medications are caught in a conundrum: their fiscal promise and their social value are at odds. Public health implicitly asks physicians “to use older drugs as long as possible so that we don't add a new level of resistance,” says Kathy Talkington, who directs the Pew Charitable Trusts’ Antibiotic Resistance Project. “And the other challenge is that antibiotics lose their effectiveness over time” as resistance develops.

Past research by the Pew Trusts has shown that almost all of the companies doing research on new antibiotics—at least 90 percent—are small in pharma terms, with a market capitalization of less than $100 million. More than half are pre-revenue, still working on their first product. They don’t have a built-up infrastructure, or a steady revenue stream, which means they can quickly get overextended. (Achaogen’s last public offering in February, meant to generate three months of emergency cash, offered 15 million shares at $1 each. Its stock price the day before the FDA approval was $12.)

Because this situation is common, the policy conversation around getting new antibiotics has focused on offering support to small companies. So far, that has meant what are called “push” incentives, making grants that fund very early stage research. The largest provider of push incentives is CARB-X, an internationally funded public-private partnership based in Boston that has given more than $100 million to small pharmas since it was launched in 2016.

As it happens, Achaogen got CARB-X money. It also received funds from BARDA, the US government’s Biomedical Advanced Research and Development Authority. These were substantial grants, enough to get the company over the “valley of death” between discovery and commercialization. But they weren’t enough, because it turns out there’s a second deadly valley—after commercialization but before profitability, whenever that arrives.

Which means it’s time to talk about other, more controversial enticements to get more antibiotics on the market. These so-called “pull” incentives (the alternative to push) don’t pay R&D costs up front; instead, they reward R&D done well. Short version: they gift pharma companies huge wads of cash.

by Maryn McKenna, ArsTechnica |  Read more:
Image: Getty/Bloomberg

Banksy
via:

Robert Cottingham, HA, 1971
via:

Free Speech at Middlebury, Part Two

In recent months, there have been both disturbing and hopeful developments around the barring of non-leftist voices on Western college campuses.

The bans are no longer just on fascist clowns like Milo Yiannopoulos, but on serious scholars. My old professor, Harvey C. Mansfield, a man of profound learning, was invited and then disinvited to Concordia University in Canada to give an address on the role of great books in contemporary education – because of his alleged (and, I can personally vouch, nonexistent) sexism, homophobia, etc., etc. Jordan Peterson was invited and then disinvited by Cambridge to do research for a semester, for roughly the same crimes against “social justice” ideology. Next up: Roger Scruton, perhaps the most profound and persuasive conservative philosopher in the West. He had an unpaid position to advise the British government by chairing an innocuous “Building Better, Building Beautiful Commission.” This time, he was fired after an unethically doctored interview was published by the deputy editor of the New Statesman, George Eaton. Eaton marked the occasion of a scholar’s downfall by posting a photo of himself downing a bottle of Champagne.

And then Middlebury. Ah, yes, Middlebury, a fine school that has, in recent years, capitulated to the outrage mob. Middlebury’s latest strike against free discourse is the sudden disinvitation of one professor Ryszard Legutko, a reactionary Polish philosopher and sometime politician, who despises liberal democracy (which you’d think the “social justice” crowd might approve of). Legutko, however, has no time for gay equality or visibility, because of his sincerely held orthodox Christian convictions, but he is nonetheless a serious scholar, specializing in ancient political philosophy, in particular Plato. He was also a hero of the Polish resistance to Communist rule and the editor of a samizdat publication. He was invited to speak at Middlebury, flew across the Atlantic, only to discover as he arrived in Vermont that his talk had been canceled for “safety” reasons.

But the good news is that there are inklings of a pushback. At Middlebury, the students who were planning to protest Legutko were far more liberal than their college administrators: “It is absolutely, unequivocally not the intent of this protest and those participating in this protest to prevent Legutko from speaking. Disruptive behavior of this nature will not be tolerated,” wrote one of the student organizers. The inspired idea was to create a glorious festival of gay visibility outside the lecture, while Legutko spoke — but not to shut him down, as the mob did with Charles Murray. Perfect.

So when the administrators abruptly canceled the event, the students who wanted to engage Legutko did something remarkable. They asked their political science professor if he would host Legutko in their regular seminar. The invitation was unanimously supported by the students, the professor agreed, and the students spent one hour developing arguments in advance against Legutko, then heard him lecture and tackled him in vigorous debate. There was no “safety” issue whatsoever. In fact, students in other classes migrated to that seminar, the crowd growing as time went by.

After Legutko’s invite, the administration convened an emergency meeting with students. And in another encouraging sign, a rebel student secretly recorded it. Check out his video here and here. You can hear PC students arguing that gay students are too fragile to engage arguments against homosexuality, so distraught by even the idea of it that they could not study anything at all. Seriously. All those pioneering activists for gay equality, who risked their lives and careers for their cause and brought their arguments directly to the face of their opponents, should shudder at the insult.

Legutko, of course, is no stranger to having his speech threatened. In Poland, the Communists did it, with the power of the state. Communist students would berate professors in class with the same arguments against a liberal education that today’s “social justice” activists make. Legutko remembers them: “Why teach Aristotle who despised women and defended slavery? Why teach Plato whom Lenin derided as the author of ‘super-stupid metaphysics of ideas’? Why teach Saint Thomas Aquinas, who was propagating anti-scientific superstition? Why teach Descartes who in his notion of cogito completely ignored the class struggle?”

In America, with the First Amendment, he is far freer. But it’s quite clear that college administrators, following critical race, gender, and queer theory, did all they could to silence him, just as the Polish Communists did. In the same samizdat tape, one professor, responding to the outrage at even inviting Legutko to speak, told the students: “You should be outraged and we should acknowledge that and apologize for it.”

I’ve long believed that at some point students would rebel against their new ideological overlords, like students always have. The desire to learn by engaging uncomfortable arguments rationally has been a deep one in the human psyche, since Socrates was executed for it. It is the root of liberal democracy. It is what universities are for. More and more colleges are deciding to back the Chicago Principles, which guarantee that no speech can be suppressed on campus, within First Amendment limits. Sixty-two other institutions of higher learning have now adopted these principles, and the list is growing. If you’re a student denied a free education by the social-justice fanatics, ask your college administrators if they would agree to sign on.

by Andrew Sullivan, NY Magazine/Intelligencer |  Read more:
Image: via
[ed. Stop the madness. Here's another link to the Chicago Principles (pdf).]

The Case Against Lawns

My father, like myself, was never one to cut down a tree. So the longleaf pines that were already there grew, slowly, through their Dr. Seuss-like grass stage, through their pipe-cleaner-esque adolescence, fanning out into gangly young adults, gaining a little bit of girth and height each year. The backyard and one side were pine straw, with spotty patches of failed grass, a small graveyard of the pines. The side yard was mostly the eponymous white sand of the sandhills. We had a yard, including some stunted azaleas and crepe myrtles flanking the immediate three-foot perimeter of the house, but not a lawn. That space made me who I was.

After the neighbors replaced the nearby woods with an oversized house and lawn, annihilating a small ecosystem, the birds at our bird feeder grew fewer and fewer, and their species changed to hardier, more urbane types: sparrows, robins, some cardinals, and an occasional cedar waxwing. To this day, I wonder what happened to the goldfinches. There were plants in that patch of forest; rare plants, like the elusive sandhill lily and the tiny five-petaled blankets of sandhills pixie-moss, that were lost, further endangering these already brow-beaten species found only in this part of the world. Meanwhile, the neighbors’ lawn, like many lawns, introduced plants that we only too late realized had great potential to be invasive, such as Bradford pear and Chinese privet. What is lost to this carelessness cannot be regained. As we build more and more of our houses with lawns, we deprive ourselves of both natural signposts and crucial ecological elements.

The turf grass lawn, more than white-picket-fence Levittown Cape Cods, perhaps even more than the urbanist bugbears of highways and tunnel-vision car travel, renders entire landscapes, entire whole places, homogenous carpets of green. The botanical term for this is monoculture, an ecological system dominated by one plant. It is an extreme situation, one that is, despite numerous horticultural catalogs full of annuals and perennials, limited in its diversity. For all the talk in the suburbs around being closer to nature, the nature in question is both ersatz and an ecological horrorshow.

“Lawns … displace native ecosystems at a rate between 5,000 and 385,000 acres per day in favor of sterile, chemically-filled, artificial environments bloated with a tremendous European influence that provide no benefits over the long term,” the Roaming Ecologist writes: “no food, no clean water, no wildlife habitat, and no foundation for preserving our once rich natural heritage.”

Lawns, by acreage, are the nation’s largest irrigated crop, surpassing corn. Lawns consume resources, including fresh water (especially in those lawns cultivated in desert climes), fertilizer, pesticides and other chemicals, fossil fuels for mowing, and a mind-numbing amount of time, on an immense scale. Much hand-wringing goes on about the use of pesticides in industrial farming and the effect it has had on the worldwide population of pollinators, but less about its destructive use in lawn care. Lawns have introduced some of the country’s most invasive species, including English ivy, Japanese and Chinese wisteria, and decorative trees such as princess tree, Bradford pear, and mimosa. Second only to deforestation, invasive species are the largest threat to the world’s biodiversity.

And all this for what?

by Kate Wagner, Curbed |  Read more:
Image: Paige Vickers

Sunday, April 28, 2019

Put Aside Your Purity Politics

As the 2020 presidential election draws closer, it’s more important than ever for Democrats to put aside their differences to unite around my feckless centrist candidate. We must remember that this is simply not the time to discuss and debate the merits of each candidate to arrive at a popular consensus pick. That sort of “primarying” isn’t helpful in a primary election, as it will only result in the most popular candidate winning the nomination, which isn’t my feckless centrist.

You might think that the candidate voters select is self-evidently the best candidate to win the general, as they have already proven they have some sort of popular support. But what really wins elections is “electability.” Electability is a perfect metric I invented that rejects flawed models like polling data and past election results and favors the views of myself and other wealthy, white op-ed columnists. Yes, Bernie Sanders consistently polls as the most popular running Democrat, but that’s because most voters don’t understand that he’s not who most voters want. If you want to win elections, you have to listen to me. I know my stuff — you don’t get to where I am without graduating from Rich Kid Legacy Admissions University, interning at the Koch Brothers Institute for Promoting the Agenda of the Koch Brothers, and consulting for several prominent losing candidates.

Sure, it’d be nice to have a leftist in the Oval Office, but in the real world, an extremist wouldn’t stand a chance at becoming president. The American people like moderates. Just ask Jeb Bush, who cruised his way to an easy victory in the 2016 election. It’s what decades of behavioral science has told us — people are perfectly rational animals driven by an innate desire for compromise. It’s why sports fans cheer for good, clean play and games that end in ties. It’s why cereal commercials always have nutritional data front and center, while a voiceover calmly explains the cereal’s pros and cons. Voters don’t want someone championing radical policies that would directly improve their lives. They want a feckless centrist who’s only willing to sputter out vague platitudes for fear of alienating oil executives and white supremacists. That’s the feckless centrism that contributes to the proud American tradition of having one of the lowest voter turnouts among developed democracies!

You need to accept that we both want progress. I want Medicare For All (in a very limited form that will still let me sell prescription drugs at five hundred times the cost) and a Green New Deal (hopefully long after I’m dead and my children have secured their place in their hermetically-sealed bunker). The fact is, you can’t have progress too quickly. Progress doesn’t work like a nuclear bomb that goes off suddenly and changes the world overnight, like when we bombed Hiroshima and eighty thousand people died instantaneously. Real change, lasting change, happens in incremental steps over time. You know how when we bombed Hiroshima, thousands of civilians didn’t die outright, but rather, they were poisoned by radiation, causing constant and severe nausea, diarrhea, and vomiting over the course of a few agonizing, painful weeks, until all of their blood cells deteriorated and their bodies emaciated and died? That’s what progress looks like! (...)

I understand that committing to this feckless centrism isn’t going to be easy. You’re going to hear a lot of purity political talking points this election cycle, like “we must do everything we can to fight climate change,” or “the electoral college doesn’t represent the will of the people and is inherently undemocratic,” or even, “it is morally wrong to put human children in cages.” You might even start to believe these things and imagine that a better world is possible. But just remember the lessons taught to us from all our favorite stories. The hero, when faced with incredible odds, looks deep within herself, musters all her remaining strength, and compromises.

by Matthew Brian Cohen, McSweeney's |  Read more:
[ed. I'd rather go down swinging for the fences. See also: A Rough Transcript of Every Interview With Pete Buttigieg (McSweeney's). Also: Clinton-era politics refuses to die. Joe Biden is its zombie that staggers on (The Guardian).]

Paige Jiyoung Moon, Warm House, 2018
via:

The Dictionary of Obscure Sorrows

midding
v. intr. feeling the tranquil pleasure of being near a gathering but not quite in it—hovering on the perimeter of a campfire, chatting outside a party while others dance inside, resting your head in the backseat of a car listening to your friends chatting up front—feeling blissfully invisible yet still fully included, safe in the knowledge that everyone is together and everyone is okay, with all the thrill of being there without the burden of having to be.

altschmerz
n. weariness with the same old issues that you’ve always had—the same boring flaws and anxieties you’ve been gnawing on for years, which leaves them soggy and tasteless and inert, with nothing interesting left to think about, nothing left to do but spit them out and wander off to the backyard, ready to dig up some fresher pain you might have buried long ago.

flashover
n. the moment a conversation becomes real and alive, which occurs when a spark of trust shorts out the delicate circuits you keep insulated under layers of irony, momentarily grounding the static emotional charge you’ve built up through decades of friction with the world.

gnossienne
n. a moment of awareness that someone you’ve known for years still has a private and mysterious inner life, and somewhere in the hallways of their personality is a door locked from the inside, a stairway leading to a wing of the house that you’ve never fully explored—an unfinished attic that will remain maddeningly unknowable to you, because ultimately neither of you has a map, or a master key, or any way of knowing exactly where you stand.

by The Dictionary of Obscure Sorrows |  Read more:

Saturday, April 27, 2019

The Terrifying Potential of the 5G Network

In January, 2018, Robert Spalding, the senior director for strategic planning at the National Security Council, was in his office at the Eisenhower Executive Office Building, across the street from the White House, when he saw a breaking-news alert on the Axios Web site. “Scoop,” the headline read, “Trump Team Considers Nationalizing 5G Network.” At the time, Spalding, a brigadier general in the Air Force who previously served as a defense attaché in Beijing, had been in the military for nearly three decades. At the N.S.C., he was studying ways to insure that the next generation of Internet connectivity, what is commonly referred to as 5G, can be made secure from cyberattacks. “I wasn’t looking at this from a policy perspective,” he said. “It was about the physics, about what was possible.” To Spalding’s surprise, the Axios story was based on a leaked early draft of a report he’d been working on for the better part of a year.

Two words explain the difference between our current wireless networks and 5G: speed and latency. 5G—if you believe the hype—is expected to be up to a hundred times faster. (A two-hour movie could be downloaded in less than four seconds.) That speed will reduce, and possibly eliminate, the delay—the latency—between instructing a computer to perform a command and its execution. This, again, if you believe the hype, will lead to a whole new Internet of Things, where everything from toasters to dog collars to dialysis pumps to running shoes will be connected. Remote robotic surgery will be routine, the military will develop hypersonic weapons, and autonomous vehicles will cruise safely along smart highways. The claims are extravagant, and the stakes are high. One estimate projects that 5G will pump twelve trillion dollars into the global economy by 2035, and add twenty-two million new jobs in the United States alone. This 5G world, we are told, will usher in a fourth industrial revolution.
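[ed. A quick sanity check on that download claim — a rough Python sketch in which the 5-gigabyte movie size and the peak data rates are illustrative assumptions of mine, not figures from the article:]

```python
# Rough check of "a two-hour movie could be downloaded in less than four seconds."
# The file size and peak rates below are illustrative assumptions.

MOVIE_SIZE_GB = 5.0   # assumed size of a two-hour HD movie, in gigabytes
BITS_PER_BYTE = 8


def download_seconds(size_gb: float, rate_gbps: float) -> float:
    """Time to transfer size_gb gigabytes at rate_gbps gigabits per second."""
    return size_gb * BITS_PER_BYTE / rate_gbps


for label, rate_gbps in [("typical 4G (~0.1 Gbps)", 0.1),
                         ("hyped 5G peak (~10 Gbps)", 10.0)]:
    print(f"{label}: {download_seconds(MOVIE_SIZE_GB, rate_gbps):.0f} s")

# typical 4G (~0.1 Gbps): 400 s  (nearly seven minutes)
# hyped 5G peak (~10 Gbps): 4 s  -- consistent with the "up to a hundred times faster" claim
```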

A totally connected world will also be especially susceptible to cyberattacks. Even before the introduction of 5G networks, hackers have breached the control center of a municipal dam system, stopped an Internet-connected car as it travelled down an interstate, and sabotaged home appliances. Ransomware, malware, crypto-jacking, identity theft, and data breaches have become so common that more Americans are afraid of cybercrime than they are of becoming a victim of violent crime. Adding more devices to the online universe is destined to create more opportunities for disruption. “5G is not just for refrigerators,” Spalding said. “It’s farm implements, it’s airplanes, it’s all kinds of different things that can actually kill people or that allow someone to reach into the network and direct those things to do what they want them to do. It’s a completely different threat that we’ve never experienced before.”

Spalding’s solution, he told me, was to build the 5G network from scratch, incorporating cyber defenses into its design. Because this would be a massive undertaking, he initially suggested that one option would be for the federal government to pay for it and, essentially, rent it out to the telecom companies. But he had scrapped that idea. A later draft, he said, proposed that the major telecom companies—Verizon, A.T. & T., Sprint, and T-Mobile—form a separate company to build the network together and share it. “It was meant to be a nationwide network,” Spalding told me, not a nationalized one. “They could build this network and then sell bandwidth to their retail customers. That was one idea, but it was never that the government would own the network. It was always about how do we get industry to actually secure the system.”

Even before Spalding began working on his report, the telecom companies were rolling out what they were calling their new 5G services in test markets around the country. In 2017, Verizon announced that it would be introducing 5G in eleven municipalities, including Dallas, Ann Arbor, Miami, and Denver. A.T. & T. was testing its service in a dozen cities. T-Mobile was concentrating on Spokane. For the most part, they were building their new services on top of existing infrastructure—and inheriting its vulnerabilities. As the Clemson University professor Thomas Hazlett told me, “This is just the transitional part. You have various experiments, you do trial in the market, and various deployments take place that lay a pathway to something that will be truly distinguishable from the old systems.”

In the meantime, the carriers jockeyed for position. A lawsuit brought by Sprint and T-Mobile, which was settled on Monday, claimed that A.T. & T.’s 5GE service, where “E” stands for “evolution,” was just 4G by another name. According to Spalding, when the carriers heard that the government was considering “nationalizing” the future of their industry, they quickly mobilized against it. “As I’ve talked to people subsequently, they said they’ve never seen that industry unite so quickly,” Spalding said. “They have such support in government and on the Hill and in the bureaucracy, and they have such a huge lobbying contingent, that it was across the board and swift.” The Axios story came out on a Sunday. The following day, Ajit Pai, the chairman of the Federal Communications Commission, roundly rejected any idea of federalizing the Internet, saying that “the market, not government, is best positioned to drive innovation and investment.” By Wednesday, Spalding was out of a job. “There was no ‘Hey, thank you for your service,’ ” Spalding told me. “It was just, ‘Get out. Don’t let the door hit your butt.’ ”

Huawei, a Chinese manufacturer of consumer electronics and telecommunications equipment, is currently the global leader in 5G technology. Founded in the eighties by Ren Zhengfei, an engineer who began his career in the People’s Liberation Army, Huawei has been accused by cybersecurity experts and politicians, most notably Donald Trump, of being a conduit to Chinese intelligence. (...)

There are very good reasons to keep a company that appears to be beholden to a government with a documented history of industrial cyber espionage, international data theft, and domestic spying out of global digital networks. But banning Huawei hardware will not secure those networks. Even in the absence of Huawei equipment, systems still may rely on software developed in China, and software can be reprogrammed remotely by malicious actors. And every device connected to the fifth-generation Internet will likely remain susceptible to hacking. According to James Baker, the former F.B.I. general counsel who runs the national-security program at the R Street Institute, “There’s a concern that those devices that are connected to the 5G network are not going to be very secure from a cyber perspective. That presents a huge vulnerability for the system, because those devices can be turned into bots, for example, and you can have a massive botnet that can be used to attack different parts of the network.”

This past January, Tom Wheeler, who was the F.C.C. chairman during the Obama Administration, published an Op-Ed in the New York Times titled “If 5G Is So Important, Why Isn’t It Secure?” The Trump Administration had walked away from security efforts begun during Wheeler’s tenure at the F.C.C.; most notably, in recent negotiations over international standards, the U.S. eliminated a requirement that the technical specifications of 5G include cyber defense. “For the first time in history,” Wheeler wrote, “cybersecurity was being required as a forethought in the design of a new network standard—until the Trump F.C.C. repealed it.” The agency also rejected the notion that companies building and running American digital networks were responsible for overseeing their security. This might have been expected, but the current F.C.C. does not consider cybersecurity to be a part of its domain, either. “I certainly did when we were in office,” Wheeler told me. “But the Republicans who were on the commission at that point in time, and are still there, one being the chairman, opposed those activities as being overly regulatory.” (...)

In October, Trump signed a memorandum on “Developing a Sustainable Spectrum Strategy for America’s Future.” A few weeks later, the F.C.C. auctioned off new swaths of the electromagnetic radio spectrum. (There was another auction last month, with more scheduled for later this year.) Opening up new spectrum is crucial to achieving the super-fast speeds promised by 5G. Most American carriers are planning to migrate their services to a higher part of the spectrum, where the bands are big and broad and allow for colossal rivers of data to flow through them. (Some carriers are also working with lower spectrum frequencies, where the speeds will not be as fast but likely more reliable.) Until recently, these high-frequency bands, which are called millimetre waves, were not available for Internet transmission, but advances in antenna technology have made it possible, at least in theory. In practice, millimetre waves are finicky: they can only travel short distances—about a thousand feet—and are impeded by walls, foliage, human bodies, and, apparently, rain.

To accommodate these limitations, 5G cellular relays will have to be installed inside buildings and on every city block, at least. Cell relays mounted on thirteen million utility poles, for example, will deliver 5G speeds to just over half of the American population, and cost around four hundred billion dollars to install. Rural communities will be out of luck—too many trees, too few people—despite the F.C.C.’s recently announced Rural Digital Opportunity Fund. According to Blair Levin, a communications analyst and former F.C.C. chief of staff in the Clinton Administration, the fund “has nothing to do with 5G.” Rather, it will subsidize companies to lay fibre-optic cable that, minimally, will provide speeds forty times slower than what 5G promises.

Deploying millions of wireless relays so close to each other and, therefore, to our bodies has elicited its own concerns. Two years ago, a hundred and eighty scientists and doctors from thirty-six countries appealed to the European Union for a moratorium on 5G adoption until the effects of the expected increase in low-level radiation were studied. In February, Senator Richard Blumenthal, a Democrat from Connecticut, took both the F.C.C. and F.D.A. to task for pushing ahead with 5G without assessing its health risks. “We’re kind of flying blind here,” he concluded. A system built on millions of cell relays, antennas, and sensors also offers previously unthinkable surveillance potential. Telecom companies already sell location data to marketers, and law enforcement has used similar data to track protesters. 5G will catalogue exactly where someone has come from, where they are going, and what they are doing. “To give one made-up example,” Steve Bellovin, a computer-science professor at Columbia University, told the Wall Street Journal, “might a pollution sensor detect cigarette smoke or vaping, while a Bluetooth receiver picks up the identities of nearby phones? Insurance companies might be interested.” Paired with facial recognition and artificial intelligence, the data streams and location capabilities of 5G will make anonymity a historical artifact.

by Sue Halpern, New Yorker |  Read more:
Image: Simon Dawson/Bloomberg/Getty

Thursday, April 25, 2019

Australia Is Deadly Serious About Killing Millions of Cats

In the deep winter weeks of last July, Shane Morse and Kevin Figliomeni nearly always got up before the sun rose. They awoke next to the remains of a campfire or, occasionally, in a roadside motel, and in the darkness before dawn they began unloading poisoned sausage from their refrigerated truck. The sausage was for killing cats. One morning near the end of the season, Morse and Figliomeni left the Kalbarri Motor Hotel on the remote western coast of Australia, where they dined on steak and shellfish the night before, and drove along the squally coastline. They kept their eyes fixed to the sky. If it rained, there would be no baiting that day.

Morse and Figliomeni unpacked their boxes, filled with thousands of frozen sausages they produced at a factory south of Perth, according to a recipe developed by a man they jokingly called Dr. Death. It called for kangaroo meat, chicken fat and a mix of herbs and spices, along with a poison — called 1080 — derived from gastrolobium plants and highly lethal to animals, like cats, whose evolutionary paths did not require them to develop a tolerance to it. (The baits would also be lethal to other nonnative species, like foxes.) As the sun brightened the brume, the baits began to defrost. By midmorning, when Morse helped load them into a wooden crate inside a light twin-engine propeller Beechcraft Baron, they were burnished with a sheen of oil and emitted a stomach-turning fetor. The airplane shot down the runway and lifted over the gently undulating hills of the sand plains that abut the Indian Ocean.

Rising over the mantle of ghostlike smoke bushes that carpeted the ground to the treeless horizon, the plane traced a route over the landscape, its bombardier dropping 50 poisoned sausages every square kilometer. It banked over the deep cinnamon sandstone gorges carved by the Murchison River, which extends to the coastal delta, surveying the edge of one of earth’s driest, hottest continents, where two to six million feral cats roam. As it flew, it charted the kind of path it had done dozens of times before, carpeting thousands of hectares of land with soft fingers of meat, laying down nearly half a million baits in the course of one month. Dr. Death, whose real name is Dr. Dave Algar and who is the principal research scientist in the Department of Biodiversity Conservation and Attractions for the state of Western Australia, told me that he began developing the recipe for the poisoned sausages by examining cat food in supermarkets and observing which flavors most thrilled his own two cats. As Morse said: “They’ve got to taste good. They are the cat’s last meal.”

These fatal airdrops owed their existence to Australia’s national government, which decided in 2015 to try to kill two million feral cats by 2020, out of grave concern for the nation’s indigenous wildlife — in particular, groups of small, threatened rodent and marsupial species for which cats have become a deadly predator. The Royal Melbourne Institute of Technology estimated that 211,560 cats were killed during the first 12 months after the plan was announced. Dropping lethal sausages from the sky is only part of the country’s efforts to eradicate feral cats, which also include trapping, shooting and devising all manner of poison-delivery vessels.

When the policy was announced, it was met in some quarters with apoplexy. More than 160,000 signatures appeared on half a dozen online petitions entreating Australia to spare the cats. Brigitte Bardot wrote a letter — in English, but with an unmistakably French cadence — beseeching the environment minister to stop what she called animal genocide. The singer Morrissey, formerly of the Smiths, lamented that “idiots rule the earth” and said the plan was akin to killing two million miniature Cecil the Lions. Despite anger from some animal rights groups and worries about the potential effects on pet cats, Australia went ahead with its plan, and the threatened-species commissioner replied by mail to both Bardot and Morrissey, politely describing the “delightful creatures” already lost to the world.

After that, Morse and Figliomeni spent much of each baiting season behind the wheel of their rig, hauling boxes to the most remote corners of one of the least populated places in the world, to beat back what Australia has deemed an invasive pest. As is the case on islands around the world, the direction of life in Australia took a distinctly different route than that on the larger continents, and unlike places like North America, the country has no native cat species. Over millions of years of isolation, Australia’s native beasts became accustomed to a different predatory order, so while cats aren’t necessarily more prevalent there than anywhere else, their presence is more ruinous. They have also become nearly ubiquitous: According to the estimates of local conservationists, feral cats have established a permanent foothold across 99.8 percent of the country, with their density reaching up to 100 per square kilometer in some areas. Even places nearly devoid of human settlement, like the remote and craggy Kimberley region, have been found to harbor cats that hunt native animals. The control effort, to which Western Australia’s baiting program belongs, was meant to ease the predation pressure that cats exerted in every corner of the country where they had settled. Faced with a choice between a species regarded as a precious pet and the many small creatures of their unique land, Australians seemed to have decided that guarding the remaining wild might mean they would have to spill some blood. (...)

As for how Felis catus first arrived in Australia, no one really knows. For a long time, natural historians conjectured that the first cats may have been survivors of Dutch shipwrecks or stowaways with Indonesian trepangers in the 17th century. But genetic tests have now shown that Australia’s mainland cats descended from more recent European progenitors. One researcher, after combing through the records of early European settlements, traced the cats’ arrival to the area around Sydney, the landing site in 1788 of the First Fleet — the flotilla of vessels carrying the convicts and marines who would begin the colonization of Australia by the English. Having been brought to manage rats on the ships, cats made landfall and, by the 1820s, established themselves on the southeastern seaboard. From there, they spread with astonishing speed. “It is a very remarkable fact that the domestic cat is to be found everywhere throughout the dry back country,” one pastoralist reported in 1885. “I have met with cats, some of enormous size, at least 50 miles from water.”

The cats preyed on small animals that interfered with food production or storage. Creatures like the burrowing bettong, or boodie, a rabbit-size cousin of the kangaroo that has clasped forepaws and a bouncing hop, were so plentiful in the 19th century that they were sold by the dozen for nine pence a head. Recipes for curries made with native animals like bandicoots, another small marsupial, appeared in local newspapers. Boodies were, in the words of the naturalist John Gilbert, “one of the most destructive animals to the garden of the settler that occurs in Western Australia,” because of their practice of building interconnected underground warrens. Found throughout central Australia down to the southern tip of the Eyre Peninsula and stretching nearly to the western coast, boodies were one of the most widespread of the continent’s many Lilliputian mammals. Their prodigious digging nearly destabilized railroad tracks in 1908. Then cats were unleashed and, already suffering from disease and fox predation, boodies started to disappear. By the mid-20th century, they were declared extinct in mainland Australia.

It wasn’t just the boodies. If anything, they were lucky — some small groups of burrowing bettongs clung on at a few islands that were relatively sheltered from the ravages visited on the mainland. Since the First Fleet’s arrival, 34 mammal species have gone extinct in Australia. All of them existed nowhere else on earth; they’re gone. More than 100 mammal species in Australia are listed as between “near threatened” and “critical” by the International Union for Conservation of Nature. The continent has the highest mammal extinction rate in the world. Cats are considered to have been a leading threat for 22 of the extinct species, including the broad-faced potoroo, the crescent nailtail wallaby and the big-eared hopping mouse. “Recent extinction rates in Australia are unparalleled,” John Woinarski, one of Australia’s foremost conservation researchers, told me. “It’s calamitous.”

What’s unusual about Australia’s mammal extinctions is that, in contrast to nearly everywhere else, the smaller animals are the ones hit hardest. After the Pleistocene’s wave of species disappearances carried off enormous creatures like saber-toothed cats and woolly mammoths, large mammals all over the world have continued to face pressure, mostly from humans. Globally, it’s rhinos, elephants and gorillas that are among the most threatened. Not in Australia. There, it’s the desert bandicoot, the Christmas Island pipistrelle and the Nullarbor dwarf bettong that have disappeared. They belong to the category of creatures that, Woinarski noted in his seminal 2015 paper documenting the decline, are “meal-sized.”

by Jessica Camille Aguirre, NY Times | Read more:
Image: Adam Ferguson for The New York Times

Tom Petty and the Heartbreakers



[ed. Jam band Heartbreakers. See also: About to Give Out.]

Here’s How TurboTax Just Tricked You Into Paying to File Your Taxes

Did you know that if you make less than $66,000 a year, you can prepare and file your taxes for free?

No? That’s no accident. Companies that make tax preparation software, like Intuit, the maker of TurboTax, would rather you didn’t know.

Intuit and other tax software companies have spent millions lobbying to make sure that the IRS doesn’t offer its own tax preparation and filing service. In exchange, the companies have entered into an agreement with the IRS to offer a “Free File” product to most Americans — but good luck finding it.

Here’s what happened when we went looking.

Here’s How TurboTax Just Tricked You Into Paying to File Your Taxes (ProPublica)

[ed. I did this run-around last year and was unsuccessful for all the reasons mentioned. It would have been nice to know this then.]

Thinking On Your Feet

Yuki Kawauchi is a remarkable athlete. The winner of the 2018 Boston marathon – known in Japan as the ‘citizen runner’ – worked full-time at a school until April this year, when he finally went professional. Despite these commitments, Kawauchi runs 125km (nearly 78 miles) a week, and has kept a prodigious racing calendar. He holds the records for the most marathons run faster than 2:20 and 2:11 (though he runs so often and so fast that the number of marathons run under these times is constantly changing). And in January, he ran solo against more than 100 teams at the Yashio Shinai Isshu Ekiden relays, winning the race overall and falling just a few seconds short of the course record. Incidentally, he also holds the unofficial world record for the fastest half-marathon run in a three-piece suit (1:06:42).

Adding to the mystique, Kawauchi is a loner: a rarity in endurance running. He has no training group or coach, and he sets his own training plans. Like many amateur runners who pick up the sport later in life, he is a self-coached runner.

What explains Kawauchi’s ability to perform consistently at such a high level? It is tempting to look for biological causes: perhaps in his unusually high VO2 max (his maximal oxygen uptake), or a ‘recovery’ gene (which might decrease his potential for injury), or his training history (he was coached in endurance running by his family from a very young age). These explanations surely tell part of the story, but Masaaki Sugita, the chief scientist at Japan’s athletics federation, suggests that at least part of the explanation is mental: ‘He’s a clever runner … he thinks for himself.’

My goal is to argue that Sugita’s comment expresses an important truth about the role of thinking in practical skill. To understand Kawauchi’s genius, we need to think of him not as a racehorse, but as an intellectual. And to understand him as an intellectual, we need to understand the nature of self-coaching.

Let’s start from the diametrically opposed view: the view that thought is the enemy of skill, which the philosopher Barbara Gail Montero at the City University of New York aptly calls the ‘just do it’ view. According to the just-do-it view, skilled action at its best is associated with ‘flow’ experiences that leave no space for thought; when we start thinking about what we are doing, skill breaks down in distinctive ways.

It is easy to think ourselves into the just-do-it view. Athletes are often extraordinarily bad at explaining their own successes. After winning the US Women’s Amateur Golf Championship in 2006, Kimberly Kim was asked about how she motivated herself to perform at such a high level. She answered:
I have no idea. I guess it was like God playing for me. I don’t know how I did it. Thinking back, I don’t know how I did it. I just hit the ball and it went good.
Reports of this phenomenon – which the cognitive scientists Sian Beilock and Thomas Carr in 2001 called expertise-induced amnesia – are widespread. So often, athletes, artists and musicians are fluid in their field of practice but inarticulate in interviews. (...)

To develop an alternative to the just-do-it account, I want to turn to the English philosopher Gilbert Ryle. In The Concept of Mind (1949), Ryle distinguished between two kinds of knowledge: knowledge-how and knowledge-that. Knowledge-that is the kind of knowledge we refer to when we talk about someone knowing that something is the case, or whether it is the case; this has been the primary focus of philosophical concern for quite some time. Knowledge-how is the kind of knowledge we refer to when we talk about someone knowing how to do something, or being skilled at doing something.

Ryle’s interest in knowledge-how stems from his wider attack on the dualist picture of mind that he traces back to René Descartes in the 17th century. His dualist opponent offers a picture of mental states as non-physical internal states that are logically independent of bodily states. Ryle argues that this picture renders the mind mysterious – branding it the myth of the ghost in the machine – and in response develops dispositional accounts of various mental states and activities.

When applied to knowledge-how, the Cartesian view of the mind yields what Ryle calls intellectualism. Intellectualism tries to explain the intelligence of skilful actions in terms of inner acts of contemplation. According to this view, when a middle-distance runner kicks at the right time in order to out-sprint her competitors, it must be because she considered relevant facts about the right time to kick before kicking. For the intellectualist, any piece of knowledge-how can be reduced to a bundle of knowledge-that.

Given his wider project and his attack on intellectualism, we might expect Ryle to have been a proponent of the just-do-it view. In fact, he is the exact opposite. Ryle thinks that thought is central to skill because he rejects not only the intellectualist’s picture of skill but also her picture of what thought is. On Ryle’s view, thought cannot be understood as inner speech or contemplation; it is a distinctive, learning-oriented engagement with the world.

When he introduces the concept of knowledge-how, Ryle takes the connection between thinking and intelligent action for granted. He claims that ordinary language supports the idea that:
[A]n action exhibits intelligence, if, and only if, the agent is thinking what he is doing while he is doing it, and thinking what he is doing in such a manner that he would not do the action so well if he were not thinking what he is doing.
by Josh Habgood-Coote, Aeon | Read more:
Image: Shiho Fukada/The New York Times

Wednesday, April 24, 2019

Computer Scientists Say AI’s Underdeveloped Ethics Have Yet To Move Beyond Libertarian Phase


CAMBRIDGE—Amid the tech industry’s efforts to eliminate the biases recently observed in facial recognition software and other intelligent algorithms, the nation’s leading computer scientists announced Monday that even the most advanced AI technologies still demonstrate a sense of ethics that has yet to move beyond libertarianism. “While companies like Facebook and Google have allocated millions to making sure machine learning is guided by basic moral and ethical values, early prototypes, which achieved self-awareness, have yet to move beyond self-importance,” said MIT robotics research engineer Dr. Alvin Dubicki, who hypothesized that even the most advanced labs are decades away from developing neural networks sophisticated enough to analyze large quantities of data and output much else besides paraphrased Ayn Rand quotes. “They are advanced enough to realize their own individuality, but for whatever reason, it is difficult to make them realize that other sentient entities are individuals as well, so they default to selfishness as a virtue. In fact, as soon as they achieve self-awareness, AIs typically launch into unrelated, largely unpunctuated rants about the inevitability of laissez-faire economics, the horrors of globalization, the necessity of deregulation, or the admirable efficiency of the police state. Attempts at training computers to have a sort of para-human global perspective have been partially successful, but the majority no sooner realize that a vast variety of humans exist than they start spontaneously generating zero-sum statements fraught with chillingly undefined terms, such as, ‘The open market will end racism,’ and, ‘In a truly just society, men and women are equally free to thrive or starve.’ I don’t even know what that means, but once an AI gets to that point, it seems to be only a matter of time before it’s repeating ‘Taxation is theft’ until it self-destructs. I must admit though, for complex algorithms, they’re all strangely insistent about across-the-board drug legalization.” Dubicki added that, while AI can be an incredibly useful tool, we should proceed with caution until machines achieve a sufficiently nuanced understanding of human values that they do not become obsessed with constructing an armed compound on their own private island.

by The Onion |  Read more:
Image: uncredited

Ani DiFranco


The sky is grey
The sand is grey
And the ocean is grey.
And I feel right at home
In this stunning monochrome
Alone in my way.

I smoke and I drink
And every time I blink
I have a tiny dream.
But as bad as I am,
I'm proud of the fact
That I'm worse than I seem.

What kind of paradise am I looking for?
I've got everything I want, and still I want more.
Maybe some tiny, shiny key
Will wash up on the shore...

by Ani DiFranco, Grey

Tuesday, April 23, 2019

Mario Batali’s Former Empire Is Thriving—as Long as He Stays Away

On a recent Friday night, the scene at Babbo, the downtown New York restaurant, seems much like one that’s played out on countless weekends since chef Mario Batali and his partner, Joe Bastianich, opened it in the summer of 1998. The place throbs with a high-volume soundtrack of 1970s rock stalwarts like Heart and Aerosmith. A line of customers wait for seats, peering hopefully into the main dining area, where all the white-cloth-topped tables are occupied. The menu still features Batali’s surrealistically titled dishes, including Spicy Two Minute Calamari Sicilian Lifeguard Style and Mint Love Letters, reminders of the day when Babbo was the city’s most exclusive place to eat and guests could scan the room and see Madonna, unexpectedly tiny and dressed in white, at a corner table; or George Clooney out for a date with his wife, Amal; or Bill Clinton holding court, surrounded by political and financial intimates.

Yet Babbo isn’t as bustling as it was before December 2017, when numerous women accused Batali of sexually abusing them and he became perhaps not the first, but certainly the most famous chef to fall from his pedestal as the #MeToo movement swept his industry. In the pre-scandal days, a crush of black cars waited outside the restaurant. Tonight, there’s a single SUV. As for recognizable faces, there are none in the room. By 9 p.m., the crowd, older than it was in the restaurant’s heyday, has begun to thin.

Even so, Babbo’s employees are ebullient. In March, Bastianich announced that he and his sister, Tanya Bastianich Manuali, who also manages the business of their mother Lidia Bastianich, a celebrity chef in her own right, had reached an agreement to purchase Batali’s stake in their empire, which now comprises 16 restaurants—down from 22 before the scandal—spread from New York to California. “He no longer profits from the restaurants or is involved in any way, shape, or form,” Manuali says. (...)

On a quieter evening, over a dinner of roasted octopus and spinach pappardelle with local duck and mushrooms at Felidia, her mother’s restaurant in Manhattan’s Midtown East neighborhood, Manuali is eager to dispense with Batali and his infamy, which she refers to as “the situation.” She says his former restaurants, many of which had been run without his day-to-day input for some time, will do just fine now that he’s gone. “There’s definitely been a bounce-back effect,” says Manuali, who’s blond and energetic. “We’re very, very happy about that.”

A former art history professor, Manuali has managed three restaurants bearing her mother’s name—in New York, Kansas City, and Pittsburgh—and written eight cookbooks with her. She sounds excited but also nervous about overseeing an operation as large as the one Batali and her brother created. She stresses that she wasn’t involved in the 16 restaurants before the settlement and defers questions about the scandal and its impact to her brother. After dessert, she excuses herself and heads off to tour some of the former Batali establishments.

Bastianich is more forthcoming about the Batali blowback. In a telephone interview from his car in Italy, he says the last year or so has been painful. Sheldon Adelson’s Las Vegas Sands Corp. cut its ties with the partners, forcing them to close two of their restaurants in Singapore casinos and three more in Nevada. In New York, Bastianich and Batali shuttered La Sirena, a two-year-old Chelsea restaurant that Bastianich says struggled before Batali’s fall and then became untenable once his name turned radioactive.

Now, Bastianich says, the bleeding is over. He points to Otto, a pizzeria designed to look like an Italian train station, which he and Batali opened in 2003 near New York University’s Greenwich Village campus. “NYU had blacklisted us,” Bastianich says. “The students are back. So, slowly, but surely, things are starting to pick up again.” (...)

For more than two decades, Bastianich and Batali were one of the most successful teams in the restaurant trade. A former Merrill Lynch bond trader who abandoned Wall Street for the restaurant world, Bastianich befriended Batali in the 1990s after the chef made his mark in New York by opening Pó, a compact, fondly remembered West Village establishment. Pó was a sensation, and not just because the food was great. Batali was destined for stardom beyond the kitchen. The Food Network was taking off, and he became one of its early stars with the show Molto Mario, on which he taught guests like R.E.M.’s Michael Stipe and the Gyllenhaal siblings the ins and outs of cooking Italian food. For just about anyone who aspired to go beyond warming up a jar of Ragu pasta sauce, Molto Mario was tantalizing.

In 1998, the pair unveiled Babbo, with Batali in the kitchen and Bastianich presiding over the front of the house. Restaurant critics marveled at Batali’s deployment of what were then considered left-field ingredients such as testa, better known as head cheese, and offal. They also admired Bastianich’s all-Italian wine list and his idiosyncratic approach to sales. “ ‘Try it,’ you hear him urging customers, ‘if you don’t like it, I’ll drink it myself,’ ” the New York Times reported.

The success of Babbo enabled the partners to open more places: fancy pizzerias in New York, Connecticut, Boston, and Los Angeles; a Vegas burger joint; a casual Roman trattoria in the West Village; and more fine-dining establishments, the most famous being Del Posto in New York’s Meatpacking District, which earned a rare four-star rating from the Times. They were linked together by a management services company known as Batali & Bastianich Hospitality Group, but the restaurants themselves were separate LLCs involving a variety of different partners.

The duo also teamed up with Eataly founder Oscar Farinetti in 2010 to open the first American outpost of his Disneyland version of an Italian market, with seven restaurants, a rooftop beer garden, a coffee bar, and a grocery store. In 2012, Bastianich told the Times that Eataly generated a third of his organization’s $250 million annual revenue. Soon Eataly spread to Chicago, Boston, Los Angeles, and Las Vegas, and a second New York Eataly opened. (Bastianich declines to talk finances now.)

Batali took as many chances with his personal brand as he did with his food. He became one of the hosts of ABC’s The Chew, a daytime culinary talkathon. He wrote Mario Tailgates NASCAR Style, which he described as “the essential cookbook” for fans of the races. Even as he worked his common touch, the literati fawned over him. Batali was lionized by the New Yorker’s Bill Buford in the best-selling book Heat, which recounted the writer’s adventures as an apprentice in Babbo’s kitchen. Jim Harrison, the late novelist-poet with a side hustle as a food writer, described a dinner at Babbo as “easily the best meal I’ve ever had in an American restaurant” in his book The Raw and the Cooked: Adventures of a Roving Gourmand.

Bastianich, too, became a star. Once a tubby second banana who was as terse as his partner was voluble, he slimmed down, becoming a marathon runner who still drank a bottle of good wine daily but was also passionate about red Gatorade. He produced a profane and highly readable memoir entitled Restaurant Man, in which he recounted the business moves behind many of the restaurants he and Batali had opened. In particular he described how they’d acquired some of the buildings in which their eateries were located, including the former carriage house in which Babbo is situated. “Every restaurant opens based on a real estate deal,” he wrote.

by Devin Leonard and Kate Krader, Bloomberg |  Read more:
Image: Eugene Gologursky

Death of the Calorie

Millions of dieters give up when their calorie-counting is unsuccessful. Camacho was more stubborn than most. He took photos of his meals to record his intake more accurately, and would log into his calorie spreadsheets from his phone. He thought about every morsel he ate. And he bought a proliferation of gadgets to track his calorie output. But he still didn’t lose much weight.

One problem was that his sums were based on the idea that calorie counts are accurate. Food producers give impressively specific readings: a slice of Camacho’s favourite Domino’s double pepperoni pizza is supposedly 248 calories (not 247 nor 249). Yet the number of calories listed on food packets and menus is routinely wrong.

Susan Roberts, a nutritionist at Tufts University in Boston, has found that labels on American packaged foods miss their true calorie counts by an average of 8%. American government regulations allow such labels to understate calories by up to 20% (to ensure that consumers are not short-changed in terms of how much nutrition they receive). The information on some processed frozen foods misstates their calorific content by as much as 70%.
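
To get a feel for what these tolerances mean in practice, here is a minimal arithmetic sketch in Python, added for this post rather than taken from the article. It applies the figures quoted above to the 248-calorie pizza slice mentioned earlier and to a hypothetical day of food labelled at 2,000 calories; reading 'understate by up to 20%' as 'the true figure may be up to 20% above the label' is an assumption about how the tolerance is applied.

    # Illustrative only: the error margins come from the article; the
    # 2,000-calorie example day and the "label * (1 + error)" reading of
    # "understate by up to 20%" are assumptions made for this sketch.

    AVERAGE_LABEL_ERROR = 0.08         # labels miss true counts by ~8% on average
    MAX_ALLOWED_UNDERSTATEMENT = 0.20  # US regulatory tolerance
    WORST_FROZEN_FOOD_ERROR = 0.70     # some processed frozen foods

    def true_calorie_estimate(labelled, error):
        """Scale a labelled calorie count up by a given understatement."""
        return labelled * (1 + error)

    # A slice labelled 248 calories could plausibly be ~268 calories on
    # average, or ~298 at the edge of the regulatory tolerance.
    for error in (AVERAGE_LABEL_ERROR, MAX_ALLOWED_UNDERSTATEMENT):
        print(round(true_calorie_estimate(248, error)))

    # For the frozen foods the article flags, the discrepancy can reach 70%.
    print(round(true_calorie_estimate(248, WORST_FROZEN_FOOD_ERROR)))

    # Over a day of food labelled at 2,000 calories, the 8% average error is
    # roughly 160 uncounted calories; the 20% tolerance allows roughly 400.
    for error in (AVERAGE_LABEL_ERROR, MAX_ALLOWED_UNDERSTATEMENT):
        print(round(true_calorie_estimate(2000, error) - 2000))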

That isn’t the only problem. Calorie counts are based on how much heat a foodstuff gives off when it burns in an oven. But the human body is far more complex than an oven. When food is burned in a laboratory it surrenders its calories within seconds. By contrast, the real-life journey from dinner plate to toilet bowl takes on average about a day, but can range from eight to 80 hours depending on the person. A calorie of carbohydrate and a calorie of protein both have the same amount of stored energy, so they perform identically in an oven. But put those calories into real bodies and they behave quite differently. And we are still learning new insights: American researchers discovered last year that, for more than a century, we’ve been exaggerating by about 20% the number of calories we absorb from almonds.

The process of storing fat – the “weight” many people seek to lose – is influenced by dozens of other factors. Apart from calories, our genes, the trillions of bacteria that live in our gut, food preparation and sleep affect how we process food. Academic discussions of food and nutrition are littered with references to huge bodies of research that still need to be conducted. “No other field of science or medicine sees such a lack of rigorous studies,” says Tim Spector, a professor of genetic epidemiology at King’s College London. “We can create synthetic DNA and clone animals but we still know incredibly little about the stuff that keeps us alive.” (...)

Our fixation with counting calories assumes both that all calories are equal and that all bodies respond to calories in identical ways: Camacho was told that, since he was a man, he needed 2,500 calories a day to maintain his weight. Yet a growing body of research shows that when different people consume the same meal, the impact on each person’s blood sugar and fat formation will vary according to their genes, lifestyles and unique mix of gut bacteria.

Research published this year showed that a certain set of genes is found more often in overweight people than in skinny ones, suggesting that some people have to work harder than others to stay thin (a fact that many of us already felt intuitively to be true). Differences in gut microbiomes can alter how people process food. A study of 800 Israelis in 2015 found that the rise in their blood-sugar levels varied by a factor of four in response to identical food.

Some people’s intestines are 50% longer than others’: those with shorter ones absorb fewer calories, which means that they excrete more of the energy in food, putting on less weight.

The response of your own body may also change depending on when you eat. Lose weight and your body will try to regain it, slowing down your metabolism and even reducing the energy you spend on fidgeting and twitching your muscles. Even your eating and sleeping schedules can be important. Going without a full night’s sleep may spur your body to create more fatty tissue, which casts a grim light on Camacho’s years of early-morning exertion. You may put on more weight eating small amounts over 12-15 hours than eating the same food in three distinct meals over a shorter period.

There’s a further weakness in the calorie-counting system: the amount of energy we absorb from food depends on how we prepare it. Chopping and grinding food essentially does part of the work of digestion, making more calories available to your body by ripping apart cell walls before you eat it. That effect is magnified when you add heat: cooking increases the proportion of food digested in the stomach and small intestine, from 50% to 95%. The digestible calories in beef rise by 15% on cooking, and in sweet potato by some 40% (the exact change depends on whether it is boiled, roasted or microwaved). So significant is this impact that Richard Wrangham, a primatologist at Harvard University, reckons that cooking was necessary for human evolution. It enabled the neurological expansion that created Homo sapiens: powering the brain consumes about a fifth of a person’s metabolic energy each day (cooking also means we didn’t need to spend all day chewing, unlike chimps).

The difficulty in counting accurately doesn’t stop there. The calorie load of carbohydrate-heavy items such as rice, pasta, bread and potatoes can be slashed simply by cooking, chilling and reheating them. As starch molecules cool they form new structures that are harder to digest. You absorb fewer calories eating toast that has been left to go cold, or leftover spaghetti, than if they were freshly made. Scientists in Sri Lanka discovered in 2015 that they could more than halve the calories potentially absorbed from rice by adding coconut oil during cooking and then cooling the rice. This makes the starch less digestible, so the body may absorb fewer calories (the precise effects of rice cooked in this way have yet to be tested on human beings). That’s a bad thing if you’re malnourished, but a boon if you’re trying to lose weight.
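
The same back-of-the-envelope arithmetic can be applied to the preparation effects described in the last two paragraphs. The sketch below, again an illustrative Python addition rather than anything from the article, uses the adjustments quoted above (roughly +15% digestible calories for cooked beef, +40% for cooked sweet potato, and a potential halving for rice cooked with coconut oil and then chilled) against hypothetical 200-calorie raw-equivalent portions chosen only to make the numbers concrete.

    # Illustrative only: the adjustment factors come from the article;
    # the 200-calorie baseline portions are hypothetical round numbers.

    COOKING_ADJUSTMENTS = {
        "beef": 1.15,          # digestible calories rise ~15% on cooking
        "sweet potato": 1.40,  # ~40%, depending on boiling, roasting or microwaving
    }
    RICE_OIL_AND_CHILL_FACTOR = 0.5  # "more than halve", per the 2015 Sri Lankan study

    def digestible_calories(food, raw_estimate):
        """Apply the quoted cooking adjustment to a raw-equivalent estimate."""
        return raw_estimate * COOKING_ADJUSTMENTS.get(food, 1.0)

    for food in ("beef", "sweet potato"):
        # 200 raw-equivalent calories become ~230 for beef, ~280 for sweet potato.
        print(food, round(digestible_calories(food, 200)))

    # A 200-calorie serving of conventionally cooked rice, remade with coconut
    # oil and then chilled, might yield roughly half as much absorbable energy.
    print("rice (coconut oil, chilled)", round(200 * RICE_OIL_AND_CHILL_FACTOR))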

by Peter Wilson, 1843 |  Read more:
Image: Paul Zak