Friday, May 31, 2019

Peter Tosh


Repost

The Fed's Dangerous 'New Normal'

The American public doesn’t have much appetite for monetary matters, and most of that limited attention has been riveted on the political fights over President Donald Trump’s controversial nominees to the Federal Reserve Board. But there’s a far more serious piece of news on the Fed front.

The Fed’s once-revered independence and traditional controls on government spending have been dangerously eroded, with almost no public notice or debate. And unless the Fed itself or Congress does something about it, our financial system is at risk.

When did this happen? In a news conference in March, Fed Chairman Jerome Powell announced that the central bank would stop unwinding its balance sheet this September. That decision, phrased in the typically dry language of central bank news releases, didn’t make headlines. Yet it was a watershed: It was the most obvious sign yet that the Fed’s program to “normalize” monetary policy, as it had promised to do since 2009, was coming to an end. In essence, the Fed has decided to keep its emergency monetary powers and stick to its new methods of managing the supply of money in the economy indefinitely.

That “new normal,” which the Fed adopted during the financial crisis, includes novel methods for controlling interest rates. During the crisis, those methods allowed the Fed to engage in “quantitative easing,” meaning large-scale purchases of government bonds and other securities. But while they helped it fight the Great Recession, the Fed’s quantitative easing powers also fudged the traditional boundary line between fiscal policy, which Congress controls and which includes decisions about government funding, and monetary policy, which the Fed controls and which is supposed to be dedicated solely to fighting recessions and limiting inflation.

By blurring that boundary line, the Fed’s new methods threaten to undermine its critically important independence. An independent central bank ensures that neither the president nor Congress can decide to fund special projects or tweak economic growth by compelling the Fed to print more money. But the longer the Fed retains its “new normal,” the more that independence is at risk.

To understand why this new normal is so risky, you first need to understand how we got here.

Before the 2008 financial crisis, the Fed controlled inflation by creating or destroying bank reserves. When the Fed created reserves, interest rates declined, banks increased their lending and the supply of money in the economy expanded. When it supplied fewer reserves, it checked inflation.

But after the failure of Lehman Brothers in September 2008, the Fed made two policy changes. First, it started paying interest on banks’ reserves — the cash that banks must hold to meet reserve requirements and settle accounts with one another. Its goal was to get banks to stockpile the reserves its emergency lending was creating instead of lending them out, to avoid excessive money growth in the economy and prevent inflation. The decision to pay interest on reserves effectively allowed the Fed to control inflation no matter how many reserves it created — something it never could do before.

Second, in its first round of what later became known as quantitative easing, the Fed took the extraordinary move of buying large quantities of mortgage-backed securities from banks to keep their value from plummeting as the asset markets froze. In later rounds of quantitative easing, it also bought large amounts of long-term Treasury securities.

These two new policies left banks holding more than $2.5 trillion in reserves and quadrupled the overall size of the Fed’s assets. In the past, such a large increase in bank reserves would have led to high inflation. But the Fed could now avoid that result simply by making sure the rate it paid banks stayed high enough to get them to hoard any reserves the Fed created. That’s why quantitative easing hasn’t made prices skyrocket, as many Republicans feared it would when they were still a party of monetary hawks.

When quantitative easing began, then-Chairman Ben Bernanke promised that after the recession ended the Fed would revert to its “normal” self — meaning that the central bank would go back to a modest-size balance sheet and stop encouraging banks to hoard reserves. Later, the Fed released a normalization plan explaining how it would “unwind” its swollen balance sheet — letting it shrink as its bond holdings matured — and otherwise get back to business as usual.

Now that Powell has announced an end to that unwinding, is the Fed almost back to normal? Hardly. Far from resembling its precrisis self, it looks and operates much as it did in the worst days of the recession, bloated balance sheet and all. When Lehman Brothers failed, the Fed held $900 billion in assets, consisting mainly of short-term Treasury bills. Quantitative easing added another $3.6 trillion, all in long-term Treasury and mortgage-backed securities. When its unwinding ends in September, the Fed’s assets will still top $3.8 trillion.

Fed officials were once proud of the Fed’s lean, clean portfolio: “lean” because it was small relative to the size of the U.S. economy; “clean” because it was free of risky assets, including long-term ones. Having it so, a 2002 Fed study said, kept the central bank from interfering with “relative asset values and credit allocation within the private sector.” That decision kept the Fed clear of fiscal policy, leaving support for particular markets, such as housing, and responsibility for funding the government entirely to the Treasury and Congress.

But today, the Fed’s willingness to hold on to the massive reserves, coupled with its ability to gobble up debt — including government debt — without fueling inflation, makes it harder for Fed officials to resist a Republican administration’s own call for renewed quantitative easing. Instead of more “quantitative tightening,” Trump said in April, the Fed “should actually now be quantitative easing. … You would see a rocket ship.” Fed officials might scoff at Trump’s rocket ship analogy, but they can no longer claim that caving in to his demands would mean losing their ability to control inflation.

Trump is not the only politician Fed officials have to worry about: Politicians of either party are equally likely to lean on it. While Republicans are learning to love the Fed’s printing press — not one Republican senator scolded Powell for the Fed’s decision to stop raising interest rates and prematurely end its unwind — Democrats see more quantitative easing as a painless way to finance their own favorite projects. Instead of being a fluke, in other words, Trump’s call for more quantitative easing may turn out to be a taste of things to come.

by George Selgin, Politico |  Read more:
Image: Mark Wilson/Getty

The “Zero Waste” People Must Be Stopped

My feelings regarding the “zero-waste movement” are vast and varied, but come in many shades of upset. I’m upset when I see a friend wracked with guilt when she succumbs to buying a plastic water bottle on an oppressively hot afternoon. I’m upset when a bratty roommate preaches the zero-waste gospel to me when I know they drive half a mile to work every day. I’m upset that I’m expending energy being upset over a movement that is well-intentioned instead of on one that is not.

According to the Zero Waste International Alliance, the zero-waste movement is “the conservation of all resources by means of responsible production, consumption, reuse, and recovery of products, packaging, and materials without burning and with no discharges to land, water, or air that threaten the environment or human health,” or, put more simply, consuming less shit so you have less shit to throw away. The movement has become prominent in recent years as a response to increasingly alarming news of plastic-clogged waterways, growing landfills, and insufficiently low recycling rates.

It all sounds pretty good. I am definitely a proponent of resource conservation and waste reduction. I would love for consumption to be less of a strain on the environment. My exasperation rests in the gap between this definition and its real-life execution, which seems to be focused primarily on one type of waste: household plastic.

Before we go any further, let’s talk about the problem with the word “zero” in zero-waste movement. Perhaps if the name were “less-waste movement” or “not-as-much-waste movement” or something equally pragmatic, I wouldn’t be so bothered. But, and I’m sorry to have to say this, the “zero” in zero-waste is impossible first and foremost due to the second law of thermodynamics, which states that the quality of energy degrades as it is used. Your banana will either be eaten and partially excreted by you or left to rot somewhere; neither is zero waste, and entropy is undeniable. Therefore, discussions about waste should be framed around reduction, not some fundamentally unattainable “zero.” The “zero” framing sets everyone up for failure, making the movement fertile ground for some of the same damaging psychology present in movements like “clean eating.”

Next on my list of grievances is how gender and zero-waste intersect. A casual glance at the #zerowaste hashtag on Instagram or at the author bio pages of numerous zero-waste blogs will highlight how feminine this movement skews. I suspect this movement is prone to an even larger gender gap than other environmental initiatives due to its focus on how household waste can be reduced, a domain over which women still seem to reign. This is annoying, mostly because being zero-waste is time-consuming and expensive. Living a zero-waste lifestyle involves very inconvenient things like making your own toothpaste and never ordering takeout. Moreover, it’s well-documented that men tend to recycle less, litter more, and have larger carbon footprints. I would love for men and women to at least be inconvenienced equally.

More insidiously, I am concerned that zero-waste is narrowing people’s perceptions of waste to consist only of what they physically put in the garbage. Those who write about their individual reduction efforts tend to use phrases like “plastic-free” interchangeably with “zero-waste” (exemplified in a post on the popular blog Going Zero Waste titled “3 Zero-Waste Shops for All Your Plastic-Free Needs”). These phrases are not synonymous — waste comes in many forms. And even if a business claims to be zero-waste, such claims can be misleading. According to a recent article in Smithsonian Magazine, there is a growing tide of grocery stores that claim to be “zero waste” by selling only bulk items. But while this might mean there is no visible packaging waste for the shopper, there is definitely some unseen amount of packaging with which the store itself had to contend — it’s not as if a truck of loose Brazil nuts showed up for delivery.

Zero-wasters think that producing teeny-tiny amounts of garbage — specifically, one mason jar’s worth — is the pinnacle of environmentally conscious behavior. But the notion that the packaging of your purchased end product is an accurate reflection of the waste generated in the entire production process of that item is insane. You might buy minimally packaged meat, but eating meat is one of the most wasteful consumption habits in which you can partake. While some zero-wasters proudly announce that they have avoided food packaging at the airport, one flight can negate a year’s worth of otherwise environmentally conscious behavior.

Further, there is an inherent tension between packaging food to minimize food waste and the waste from the packaging itself. Compared to its sinful plastic counterpart, the virtuous glass bottle requires much more packaging to not break in transit. The plastics industry itself was born of repurposing the waste from petroleum refining; the affordability of plastic goods, from kitchenware to clothing, has democratized consumption and blurred class lines, facilitating social mobility for many and ultimately leading to a more sustainable (ahem, less wasteful) use of human capital. (...)

The whole point is that the stuff shouldn’t be made in the first place, or that the environmental externalities of those items should be forcibly internalized so that prices spike precipitously. If purchasing more items — like a plastic-free toilet brush or Goop-sanctioned “Zero Waste Starter Kit” — makes you feel like you are helping the environment, perhaps ask yourself why.

by Tara Conway, The Outline |  Read more:
Image: uncredited
[ed. Not to mention all those Amazon boxes.]

Miley Cyrus, Swae Lee, Mike WiLL

Molecules of US Freedom


US Department of Energy is now referring to fossil fuels as “freedom gas” (Ars Technica)

In a press release published on Tuesday, two Department of Energy officials used the terms "freedom gas" and "molecules of US freedom" to replace your average, everyday term "natural gas."
Image: Craig Hartley/Bloomberg via Getty Images
[ed. A for effort.]

Thursday, May 30, 2019


Miguel Covarrubias (1904-1957), Bathing in the River
via:

All-American Despair

The Centers for Disease Control recorded 47,173 suicides in 2017, and there were an estimated 1.4 million total attempts. Many of society’s plagues strike heavier at women and minorities, but suicide in America is dominated by white men, who account for 70 percent of all cases. Middle-aged men walk the point. Men in the United States average 22 suicides per 100,000 people, with those ages 45 to 64 representing the fastest-growing group, up from 20.8 per 100,000 in 1999 to 30.1 in 2017. The states with the highest rates are Montana, with 28.9 per 100,000 people; Alaska, at 27 per 100,000; and Wyoming, at 26.9 per 100,000 — all roughly double the national rate. New Mexico, Idaho and Utah round out the top six states. All but Alaska fall in the Mountain time zone.

Last summer, I began a 2,000-mile drive through the American West, a place of endless mythology and one unalterable fact: The region has become a self-immolation center for middle-aged American men. The image of the Western man and his bootstraps ethos is still there, but the cliché has a dark turn — when they can no longer help themselves, they end themselves. I found men who sought help, were placed on a 72-hour hold in a hospital ward, and say they were sent home at the end of their stay without any help, collapsing back into the fetal position — the only thing accomplished was that everyone in the small town now knew they were ill. I found men on both sides of the Trump divide: one whose anger toward his abusive parents was exacerbated by hours in his basement watching Fox News and Trump while drinking vodka; the other a Buddhist mortician whose cries for help were met by scorn in a cowboy county that went 70 percent for Trump.

“I have no one,” a man told me quietly over coffee. Outside, an unforgiving wind whipped through the tall grass. “The winters here are killing me.”

I found something else: guns, lots of them. Guns that could be procured in an hour. A house where a wife did a gun sweep and found dozens hidden. I found suicidal men who balked at installing gun locks on their pistols because they were afraid of being caught unarmed when mythical marauders invaded their homes. And I found that the men who survived suicide attempts had one thing in common: They didn’t use guns. Pills can be vomited, ropes can break, but bullets rarely miss.

For years, a comfortable excuse for the ascending suicide rate in the rural West was tied to the crushing impact of the Great Recession. But a decade on, the rate is still climbing.

“There was hope that ‘OK, as the economy recovers, boy, it’s going to be nice to see that suicide rate go down,’ ” says Dr. Jane Pearson, a suicide expert at the National Institute of Mental Health. “And there’s a lot of us really frustrated that didn’t happen. We’re asking, ‘What is going on?’ ”

The impact of hard times can linger long after the stock market recovers. A sense of community can disintegrate in lean years, a deadly factor when it comes to men separating themselves from their friends and family and stepping alone into the darkness.

“There’s been an increase in the ‘every-man-for-himself mentality,’ ” says Dr. Craig Bryan, who studies military and rural suicide at the University of Utah. “There doesn’t seem to be as strong a sense of ‘We’re all in this together.’ It’s much more ‘Hey, don’t infringe upon me. You’re on your own, and let me do my own thing.’ ”

The climactic scene in Westerns has always been the shootout. Now that’s being acted out in a deadly monologue. Activists in gun-friendly states tiptoe around the issue of eliminating guns, instead advocating locking them up to keep them out of the hands of the desperate and angry. Their efforts are noble, but futile. In Utah, 85 percent of firearm-related deaths are suicides. One of the shocking things Bryan learned was that many of these deaths were suicides of passion — impulsive, irrevocable acts.

“A third of the firearm suicides in Utah happened during an argument,” says Bryan. “Two people were having at it. Not necessarily physically violent, but they were yelling. And someone in the moment, almost always a man, basically just says, ‘I’m done,’ grabs a gun, shoots himself, and he’s dead.”

No segment of the population is more likely to be impacted by these horrifying numbers than middle-aged men in rural America. They not only own guns and lack mental-health resources — by one estimate, there are 80 or so psychiatrists licensed to practice in Wyoming — but they also have chosen a life that values independence above all else.

“It becomes a deterministic thing,” says Pearson. “You are the type of man who has chosen to isolate himself from town, health care and other people. Then you shoot yourself, and you’re hours from a trauma center.”

It’s easy to bash white middle-aged men in America. As a member of that privileged group, I’ll admit that much of the bashing has been warranted: No group in the history of the world has been given and squandered more than the white man. Yet the American white man is responsible for enough suicides annually that Madison Square Garden could not hold all the victims. And no matter how privileged, that’s somebody’s dad, someone’s friend, someone’s brother and someone’s husband. (...)

I spent the next day driving nine hours on two-lane blacktop across the vast, empty part of America from Ketchum to Casper, Toby Lingle’s hometown. In between Sean Hannity braying at me from every AM station, I listened to Marc Maron’s WTF podcasts with Anthony Bourdain and Robin Williams, two of the more notable recent examples of the middle-aged white-male suicide epidemic.

Both men had battled depression and had struggled with drug and alcohol abuse, a prevalent factor in suicides, particularly in the area I was driving through — 30 percent of male suicide victims nationwide have alcohol in their systems, but in states like Wyoming, experts say, it is closer to 40 percent. Otherwise, they couldn’t have been more different.

Bourdain sounded self-deprecatingly ebullient, a gourmand bon vivant who had kicked cocaine and heroin after decades of abuse.

“When you know how low you can go, when you’ve hurt and disappointed people and humiliated yourself for many years, you’re not going to start complaining about the wrong bottled water,” said Bourdain with a small laugh.

He sounded happy and in control, but there were other signs. After Bourdain’s suicide, a fan counted up the times that he uttered lines on his show like “I determine not to hang myself in the shower stall of my lonely hotel room.” That is precisely how Bourdain killed himself in Kaysersberg, France.

Williams offered more-obvious clues. During the Maron interview, Williams talked about an alcohol relapse while filming in Alaska. He sounded exhausted and told a seemingly funny story about the discussion he had with his “conscious brain” about suicide.

“There was only one time, even for a moment, where I thought, ‘Fuck life,’ ” Williams told Maron. He then re-created the inner dialogue:

“You know you have a pretty good life as it is right now. Have you noticed the two houses?”
“Yes.”
“Have you noticed the girlfriend?”
“Yes.”
“Have you noticed that things are pretty good . . . ?”
“Yes.”
“OK, let’s put the suicide over here under ‘discussable’. . . . First of all, you don’t have the balls to do it. I’m not going to say it out loud. I mean, ‘Have you thought about buying a gun?’ ”
“No.”
“What were you going to do, cut your wrist with a Waterpik?”


His conscious brain asked him what he was currently up to.

“You’re sitting naked in a hotel room with a bottle of Jack Daniel’s?”
“Yes.”
“Is this maybe influencing your decision?”
“Possibly.”


Williams then riffed that maybe he would talk about it on a podcast two years down the road. Maron couldn’t help but laugh.

Four years later, suffering from undiagnosed Lewy body dementia, Williams would hang himself in the same Marin County home where the interview took place.

Two of the more famous men of the past 50 years fell into the same existential pit that has gripped so many their age: a curtain slowly falling over a life that outsiders saw as filled with privilege and promise.

Even the famous leave few breadcrumbs.

by Stephen Rodrick, Rolling Stone |  Read more:
Image: Brian Stauffer for Rolling Stone

Apply Directly to the Forehead

Like many preteens prescribed heavy-duty amphetamines, I had difficulty sleeping. This manifested in a variety of ways, but the most common was sneaking out of my room to watch my favorite late-night channel, The Weather Channel, until I passed out from exhaustion on my family’s Southwestern-patterned sofa. (For an experience enhancer while reading this essay, please see this video).

The year was 2006, and I was twelve (sorry, old people). I had just started middle school, and, frankly, I was not having a good time. In addition to being put on high-strength narcotics, I was beset with a kind of permanent feverish anxiety.

So imagine, if you will, a vibrating preteen trying to command their cortisol levels, immersing themselves in the mundanity of forecasts interspersed with outdated Local on the 8s interludes courtesy of rural cable, when suddenly, interrupting a commercial for super-sharp knives that has already started, is this:


A woman of rather conservative appearance stares into the abyss like a military commander on a Maoist poster as she rubs what appears to be a glue-stick on her forehead. In the background, an Eisenmanian grid, and a voice, not quite human, but not quite automated, chants:
HeadOn: Apply directly to the forehead. HeadOn: Apply directly to the forehead. HeadOn: Apply directly to the forehead.
We cut to an image of the product in its packaging, while the voice-over tells us that
HeadOn is available without a prescription at retailers nationwide.
The whole event is over in fifteen seconds, but I remember thinking to myself, what the fuck did I just witness? (...)

Back when Head On was only airing during late night television on non-primetime networks, it gave the unsettling impression that one had just witnessed something they were not supposed to be witnessing, that aliens had descended to earth and hijacked the television networks with this fucked up commercial for a bullshit headache remedy consisting mostly of wax. Slowly but surely, it snatched up ad spots on primetime and daytime television and evolved into an annoyance; but for a brief, mystical time, the commercial would have been a proper subject for investigation by Mulder and Scully.

A combination of elements made the initial run of Head On so bizarre. First, it aired during times, and on networks, with very low viewership, borrowing a mood from the existing isolation of being up late at night watching something probably very few people are watching, like The Weather Channel or infomercials on HGTV. Once the aura of the witching hour was no longer a part of the Head On experience, it quickly became charmless.

Second, the commercial would reappear in erratic fashion. Sometimes, a different commercial would already begin to air, when, about two to five seconds later, it would be cut off by the Head On commercial, as if it had swooped in from above to smite its 1-800-number competition. After seeing it the first time, it took a while before I saw the Head On commercial again—probably several days. This left me in a state of suspense, which allowed me to question my reality—had I really seen that weird commercial or was I just imagining it in the throes of an anxiety fugue? When I saw it the second time, at least I knew it was a real commercial. But beneath the surface I still questioned whether or not I, the 2 a.m. Weather Channel watcher in rural North Carolina, was still the only person who had witnessed it.

Finally, the commercial itself is just plainly weird—absurd, even. The aesthetics of the commercial were decidedly low-fidelity; it could have been made by anyone with a green screen and text processor. The local car dealership ads had a greater production value. The content of the commercial, its short duration, the loud, repetitive urgency of the brand name, the woman staring into space rubbing a glue stick on her forehead, were so freakishly unlike anything actual humans would reasonably produce to sell something—it gave the ad the eerie feeling of pirate television. It was an anti-advertisement; instead of trying to persuade the viewer to purchase a product, like pretty much every advertisement, it just yelled the product name over the image of a hypnotized actress. Rather than building a case for why someone should buy this confounding stick (or even telling viewers what the product does), it simply lobs a brick into the unsuspecting consciousness of the consumer.

Absurdist advertising is everywhere now. The Burger King and the KFC Colonel have metamorphosed from folksy, gentlemanly mascots into big-headed horrors and misguided sex symbols, respectively. Skittles tried to sell Skittles by likening them to the plague. Geico, that ever-present representative of the weird commercial, recently started slipping its old ad campaigns from the 1990s back into the mix. Old Spice did the “I’m on a Horse” campaign. The list goes on. Combined with brands pretending to have depression on Twitter, the indication is largely that brands are too online, and what was once an amusing wink wink nudge nudge to the strange world of internet culture is beginning to give off more of a Hello Fellow Kids vibe.

When fast food chains and insurance companies get weird, the sentiment, while at first amusing (at least, ten years ago it was), now elicits a dour reaction: oh, the Brands™ are at it again. Big brands can release absurdist commercials ’til the cows come home, but they’ll never truly be weird because, honestly, people know what Skittles are. Part of what makes Head On so weird, even to this day, is that it isn’t a brand that has been around in the public consciousness forever. When the ads debuted nobody knew what the fuck it was, and most people probably still wonder if it was ever a real product. It was a real product.

by Kate Wagner, The Baffler |  Read more:
Image: YouTube

Born to Be Eaten

The caribou cow gives birth on her feet. She stands with legs wide apart, or turns on the spot, shuffling in slow circles, craning her long neck to watch as her calf emerges inch by inch from below her tail, between her hips. It’s oddly calm, this process — a strange thing to witness for us two-legged mammals, more accustomed to the stirrups and the struggle and the white-knuckled screaming of a Hollywood birth scene.

The calf, when he comes, emerges hooves first. He climbs into the world fully extended, like a diver stretching toward the water. Out come the front pair of hooves, capping spindly legs, then the long narrow head, the lean, wet-furred body, and finally, another set of bony legs and sharp little hooves. His divergence from his mother leaves behind nothing but some strings of sticky fluid and a small patch of bloody fur. He doesn’t know it, but the land he is born on is one of the most contentious stretches of wilderness in North America.

Still slick with mucus, the calf takes his first steps within minutes, stumbling awkwardly to his feet as his mother licks him clean. Within 24 hours, he is able to walk a mile or more. Soon, if he survives long enough, he will be capable of swimming white-water rivers, outrunning wolves, and trotting overland for miles upon miles every day. His life will offer myriad dangers and only the rarest respite; for the caribou, staying alive means staying on the move.

The days and weeks immediately after his birth are critical. That’s why, if at all possible, his mother will have sought out a known quantity, a place of relative safety, before he arrived. That’s why, every year, tens of thousands of heavily pregnant caribou cows return, salmon-like, to the places where they were born.

For the Porcupine caribou herd, 218,000 strong, that means a long march through snow-choked mountains to one of two calving grounds. One, lesser used, is in Canada, in the northwestern corner of the Yukon Territory between the Firth and Babbage rivers. It’s protected by the invisible boundaries of Ivvavik National Park.

The other, the most commonly used by the herd, is a small slice of land just across the border in northeastern Alaska, a flat patch tucked between the Brooks Range and the Beaufort Sea. The land is unassuming but critical: When, every so often, the herd fails to make it to the calving grounds on time — as can be the case when deep snow lingers late into the spring — their calves’ mortality rate can climb by as much as 20 percent.

This primary calving ground lies within the Arctic National Wildlife Refuge (ANWR), but unlike its counterpart across the border, it has not been formally sealed off from large-scale human activity. Instead, for 40 years, a debate has raged about its status. On one side are those who want the oil that could lie below the calving grounds extracted. On the other are those who want the area protected from industry forever.

The Porcupine caribou herd is caught between the two, its fate tied up in Washington committee rooms and the fine print of legislation. And intimately connected to the caribou is the Gwich’in nation, roughly 9,000 people scattered across Alaska and northern Canada. In fighting to protect the caribou, they are fighting for their own survival. (...)

In mid-1967 and early 1968, the Prudhoe Bay oil field was discovered along the Arctic coast just west of the Porcupine herd’s territory. In 1973, the OPEC crisis hit, ratcheting up concerns about a domestic oil supply. In 1977, the Trans-Alaska pipeline system was completed, and Alaska’s oil began to flow.

In November 1980, the United States Congress passed the Alaska National Interest Lands Conservation Act. Among other changes, it more than doubled the size of ANWR, to about 19 million acres — just slightly smaller than South Carolina. The act changed the name of the area, from the “Wildlife Range” to “Wildlife Refuge.” And within those 19 million acres, it formally designated 8 million as “wilderness” under the terms of the 1964 act.

The 1980 act also set aside 1.5 million non-wilderness acres on the refuge’s northern edge for further study — an area of ANWR’s coastal plain that encompasses the calving grounds of the Porcupine caribou herd. Section 1002 of the act, titled “Arctic National Wildlife Refuge Coastal Plain Resource Assessment,” outlined a process of inventory and assessment of resources, and analysis of potential impacts of development, before oil and gas development would be authorized.

In other words: Wait and see. Full-blown oil and gas development was not yet permitted on the coastal plain, but the possibility of future extraction remained.

After decades of campaigning by conservationists, ANWR as we know it today was born. The refuge was built on the strength of powerful words and lofty ideals. Its founders envisioned it as a place that would feed the soul of humanity worldwide. Simply by continuing to exist in its natural state, “where man himself is a visitor,” it would provide an example to the world of what once was. (...)

The Thirty Years’ War

On December 22, 2017, President Donald Trump signed the Tax Cuts and Jobs Act into law. The complex bill had only narrowly made it through the Senate, by a final vote of 51–48, and while it contained an array of changes to the tax code, it also contained something else: a provision to explore drilling possibilities on the coastal plain of the Arctic National Wildlife Refuge. That provision was widely regarded as securing the vote of Alaska Senator Lisa Murkowski, whose support of the tax bill had not been assured. Nearly 40 years after the 1002 loophole was created, and after 30 years of successful lobbying and resistance by the Gwich’in, ANWR was formally open to oil development. It had been a long fight.

It wasn’t long after the 1988 gathering that the Gwich’in had faced their first challenge. In March 1989, a bill to allow leasing across nearly a quarter of the 1002 area had passed a Senate subcommittee. But then, eight days later, the tanker Exxon Valdez tore itself open in Alaska’s Prince William Sound, spilling millions of gallons of crude oil. Suddenly, even for some of the pro-drilling politicians in Washington, the calculus changed.

The Edmonton Journal called the spill a “gift” to the caribou. Doug Urquhart, a Canadian member of the transnational Porcupine Caribou Management Board, wrote in the Yukon’s Dan Sha News that “every cloud has a silver lining.”

“The only positive aspect of the Valdez disaster,” he wrote, “is that it might save the Arctic National Refuge, the Porcupine Caribou Herd, and the thousands of Alaskan and Canadian native people who rely on the herd for economic and cultural survival.”

In the Lower 48, editorials supporting the preservation of the refuge, at least for the time being, popped up like morel mushrooms after a forest fire. “Until it is necessary to drill in the Arctic Wildlife Refuge, it is clearly necessary not to,” went the slightly ambivalent line in the Philadelphia Inquirer. The Dallas Times Herald was more forceful: “It would be wrong to destroy about 1.5 million acres of natural habitat for no valid reason.” The Boston Globe called the energy security argument “a decoy,” and argued that, “on balance, the evidence suggests that risking ecological damage to the wildlife refuge is not justified.”

Pitted against images of oil-slicked seals and birds, the Senate subcommittee bill withered. So did another, in 1991, the Johnston-Wallop National Energy Security Act. In 1995, yet another attempt made it through the House and the Senate before running headlong into President Clinton’s veto. There were more legislative skirmishes in 2002, and each time, the ban on oil exploration in the calving grounds held. The 1002 area remained in limbo.

In 2005, the issue came to a head again. In the Republican-led Congress, two different energy bills contained provisions that would have essentially opened the 1002 to drilling. Both were defeated by coalitions of Democrats and moderate Republicans. A third bill, one that targeted Pentagon spending, also included a drilling provision, but that provision was yanked after a successful filibuster.

Meanwhile, with every bill and every counter-campaign, the Arctic Refuge grew in public stature and was mythologized in the American consciousness. It had become a symbol, a talisman, to both sides. The fight over the refuge drew in not only all the major conservation groups — the Sierra Club, the Wilderness League, the World Wildlife Fund, and so on — but also an array of celebrities and mainstream brands. (During the 2005 skirmishes, Ben & Jerry’s produced a 900-pound Baked Alaska and had it carried on a litter of plywood and two-by-fours to Capitol Hill. “This is not going to last very long,” one of the company’s “flavor gurus” announced to a small crowd of protesters. “Just like the Arctic National Wildlife Refuge, if you drill up there.”)

After 2005, things quieted down some. When President Obama was elected in 2008, the Democrats took the Senate and the House as well. Notwithstanding Republicans’ call to “drill, baby, drill,” there was no legislative path forward for the pro-development side.

Early in 2015, Obama recommended that Congress designate the coastal plain, and other non-wilderness areas of ANWR, as wilderness. But that never happened, and as his term wound down, and the 2016 election loomed, people on both sides of the debate wondered if he would exercise his power to declare the area a national monument. He did not. Then came the 2016 election and the 2017 tax bill. The federal government shutdown in early 2019 delayed the commenting and review process for activity in the refuge, meaning that seismic exploration work originally planned for this winter will be held off a year. But the Bureau of Land Management still plans to hold ANWR’s first ever lease sale later this year.

by Eva Holland, Longreads |  Read more:
Image: Peter Mather

Bob Dylan’s Rolling Thunder Revue, Ft Collins CO, May 23, 1976, recorded for an NBC TV special “Hard Rain”, aired September 13.
via:
[ed. What a day. I lived about a quarter mile from CSU stadium and could just walk over.]

Wednesday, May 29, 2019


Manolo Valdés (Spanish, b. 1942), Rostro sobre fondo turquesa, 2002.
via:

Micro Feedback Loops and Learning

I recently discovered Singer’s Studio, an iPhone app for voice training that is approximately the Best Thing Ever (h/t Raemon). It’s a work of pedagogical art, and describing what it does right pulls together various nebulous thoughts I’ve had about learning and developing intuitions for a particular domain.

How the app works: it gives me various singing exercises, and then tells me in real-time, via a pretty graph, if I’m singing on pitch. That’s it.

Okay, it does have a few other features, like:
  • A total score and breakdown by % of notes correct after each exercise.
  • Letting me play myself back to see what being on vs slightly off pitch sounds like.
  • Helpful text prompts such as “keep your tongue behind your teeth.”
  • A built-in progression from easier to more complicated exercises.
  • Exercises for a specific skill, like switching from head to chest voice.
  • The ability to add challenge by turning down/off the piano playing my notes as I sing them.
  • Not a built-in part of the app, but I can glance away from the screen and sing without the visual feedback, and then immediately check how I did. (...)
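Under the hood, the loop such an app runs is presumably simple: capture a short frame of microphone audio, estimate its fundamental frequency, compare it against the target note, and repeat several times a second. The app’s internals aren’t public, so what follows is only a minimal sketch of that idea in Python, printing feedback instead of drawing the pretty graph; the autocorrelation pitch estimator, the 30-cent tolerance, and the sounddevice capture library are all my assumptions rather than anything taken from the app.

    import numpy as np
    import sounddevice as sd  # assumed mic-capture library; any equivalent works

    RATE = 44100        # samples per second
    FRAME = RATE // 10  # 0.1 s of audio per feedback update

    def estimate_pitch(frame):
        """Crude fundamental-frequency estimate via autocorrelation (assumes a voiced frame)."""
        frame = frame - frame.mean()
        corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
        lo, hi = RATE // 1000, RATE // 80  # search roughly 80 Hz to 1000 Hz
        lag = lo + np.argmax(corr[lo:hi])
        return RATE / lag

    def cents_off(measured_hz, target_hz):
        """Signed pitch error in cents (100 cents = one semitone)."""
        return 1200 * np.log2(measured_hz / target_hz)

    def feedback_loop(target_hz, seconds=5):
        """Ten times a second, report whether the singer is on pitch."""
        with sd.InputStream(samplerate=RATE, channels=1) as stream:
            for _ in range(seconds * 10):
                frame, _ = stream.read(FRAME)  # blocks until 0.1 s of audio arrives
                err = cents_off(estimate_pitch(frame[:, 0]), target_hz)
                print(f"{err:+5.0f} cents", "on pitch" if abs(err) < 30 else "off")

    feedback_loop(target_hz=440.0)  # e.g., try to hold an A4

Measuring the error in cents rather than raw hertz is the design choice that matters: a fixed hertz tolerance would be perceptually loose on low notes and punishingly strict on high ones, while a cents tolerance is equally demanding across the whole range.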
Rewardingness

It occurs to me that another feature, which I think is less key than the instantaneous pitch contour but still pretty important, is how it slightly gamifies the entire thing, and thus makes it addictive. There are % scores at the end of the exercise! And points! It logs my all-time high score for each exercise so I can try to beat it! It also logs how many minutes a day I’ve practiced. All of this makes me more likely to use it, and actually putting in the time is a key part of training any skill.

It’s also cool that I can listen to my voice and, in addition to catching mistakes (ouch!), notice when hey, wow, I actually sounded good there. This lets me gradually figure out what correlates with liking how my voice sounds, and it also gives me a warm glow of satisfaction and helps me feel like a Real Singer.

(I’m not sure if the app is cheating by doing some kind of post-processing on these recordings to make them sound pretty. Normally I hate recordings of myself speaking, let alone singing. Still, I’ll take it.)

It also seems relevant that there’s zero embarrassment factor – this isn’t a human watching me and judging me for daring to sing when I kind of suck. I’m pretty shameless, as humans go, but I’ve been slightly nervous with every voice teacher I’ve had – they’re an expert! they’re probably really unimpressed! – and tension is not good for singing well. With this, I can sing my heart out in the privacy of my apartment, and even feel safe experimenting.

How does this generalize?

This is a post about singing, but it’s also a post about learning skills in general.

Learning to sing (or to play the piano, tie one’s shoes, draw, dance, swim; all the things commonly known as procedural memory) isn’t like memorizing a list of dates for a history test. There are some steps that can usefully happen in the explicit verbal loop, like “remember to breathe from the diaphragm”, but the end goal is that basically nothing is being held in working memory, and everything happens on the level of microsecond-to-second intuitions and muscle memory. (...)

Implications

This app is a really cool category of thing, that’s only possible at all due to fairly recent technological advances, and there are probably a ton more instances that I don’t know about.

I’m curious where else this has been explored. Singing may be an easy case, because measuring a single straightforward variable, pitch, gets you so far. I can imagine an app that trains, say, krav maga fighting techniques, via video analysis and/or accelerometer data, but I’m not sure that’s possible yet given current tech.

It has me thinking about other pedagogical techniques, though. Martial arts teachers will shout real-time feedback at you (“turn your hips more! get your knee higher!”). I’ve taught swimming, and one issue is that waiting until a swimmer finishes a lap before giving any feedback introduces a huge delay, but grabbing onto them every time they do something slightly wrong is incredibly irritating and disruptive. Now I’m imagining giving them waterproof headphones and narrating the feedback in real time (“elbow higher please”, “roll your shoulder deeper into the water”, “keep your head back when you breathe”, etc.). This would be so cool.

by Swimmer963, Less Wrong |  Read more:
Image: via

Which Box Do You Check?

Ever since El Martinez started asking to be called by the gender-neutral pronouns “they/them” in the ninth grade, they have fielded skepticism in a variety of forms and from a multitude of sources about what it means to identify as nonbinary.

There are faculty advisers on El’s theater crew who balk at using “they” for one person; classmates at El’s public school on the outskirts of Boston who insist El can’t be “multiple people”; and commenters on El’s social media feeds who dismiss nonbinary gender identities like androgyne (a combination of masculine and feminine), agender (the absence of gender) and gender-fluid (moving between genders) as lacking a basis in biology.

Even for El’s supportive parents, conceiving of gender as a multidimensional sprawl has not been so easy to grasp. Nor has El’s suggestion that everyone state their pronouns gained much traction.

So last summer, when the Massachusetts State Legislature became one of the first in the nation to consider a bill to add an “X” option for nonbinary genders to the “M” and “F” on the state driver’s license, El, 17, was less surprised than some at the maneuver that effectively killed it.

Beyond the catchall “X,” Representative James J. Lyons Jr. (he/him), a Republican, had proposed that the bill should be amended to offer drivers 29 other gender options, including “pangender,” “two-spirit” and “genderqueer.” Rather than open the requisite debate on each term, leaders of the Democratic-controlled House shelved the measure.

“He articulated an anxiety that many people, even folks from the left, have: that there’s this slippery slope of identity, and ‘Where will it stop?’” said Ev Evnen (they/them), director of the Massachusetts Transgender Political Coalition, which is championing a new version of the bill.

As the first sizable group of Americans to openly identify as neither only male nor only female has emerged in recent years, their requests for recognition have been met with reservations that often cross partisan lines. For their part, some nonbinary people suggest that concerns about authenticity and grammar sidestep thornier questions about the culture’s longstanding limits on how gender is supposed to be felt and expressed.

“Nonbinary gender identity can be complicated,” said Mx. Evnen, 31, who uses a gender-neutral courtesy title. “It’s also threatening to an order a lot of people have learned how to navigate.”

And with bills to add a nonbinary marker to driver’s licenses moving through at least six legislatures this session, the expansive conception of gender that many teenagers can trace to middle-school lunch tables is being scrutinized on a new scale.

A Learning Curve

The wave of proposed gender-neutral legislation has prompted debate over whether extending legal recognition to a category of people still unknown to many Americans could undermine support for other groups vulnerable to discrimination. It has also highlighted how disorienting it can be to lose the gendered cues, like pronouns, names, appearance and mannerisms, that shape so much of social interaction.

Over the last few months, lawmakers have sought — not always successfully — to use the singular “they” when introducing nonbinary constituents who have appeared to testify. The elected officials have listened to tutorials on the difference between sexual orientation and gender identity (the former is who you go to bed with, the latter is who you go to bed as); to pleas for compassion from parents who have learned to refer to their children as “my kid” rather than “son” or “daughter”; and to why being called by binary pronouns feels, as Kayden Reff (they/them), 15, of Bethesda, Md., put it in testimony read by their mother, “as though ice is being poured down my back.”

by Amy Harmon, NY Times |  Read more:
Image: Tony Luong for The New York Times
[ed. I recently saw a notice for an upcoming "queer" party in Seattle (don't ask): “Queer” here is used as an umbrella term for anyone who identifies as not-heterosexual or not-cisgender, including but not limited to people who identify as: gay, lesbian, bisexual, genderqueer, agender, bigender, transgender, third gender, trans, genderfluid, gaymo, bicurious, asexual, homosexual, homofabulous, homoflexible, heteroflexible, intersex, questioning, pansexual, MTF, FTM, butch, femme, fag, dyke.

If you identify with one of these groups, then this party is for you!”]

Tuesday, May 28, 2019

Steely Dan


I got one and you want four
It's so hard to help you
I can't keep up with you no more
And you treat me like it's a sin
But you can't lock me in
You want me here with you right to the end
No thank you my friend
I fear the monkey in your soul

Won't you turn that bebop down
I can't hear my heart beat
Where's that fatback chord I found?
Honey don't you think it was wrong
To interrupt my song?
I'll pack my things and run so far from here
Goodbye dear
I fear the monkey in your soul

[ed. Crank it up. See also: this live Santa Monica version of Bodhisattva (starting at 2:25 to avoid the crazy inebriated introduction by Jerome Aniton: "Mr Steely Dan... whatevers." haha). It smokes. Didn't think they could out-amp the original. Also: this, this, and this.]


Julia0123 - Noir in Outer Space, collaboration with Julia Lillard

Utagawa Hiroshige 歌川広重 (1797 - 1858)
via: here and here

The Mass Media Is Poisoning Us With Hate

In “Manufacturing Consent: The Political Economy of the Mass Media,” published in 1988, Edward S. Herman and Noam Chomsky exposed the techniques that the commercial media used to promote and defend the economic, social and political agendas of the ruling elites. These techniques included portraying victims as either worthy or unworthy of sympathy. A Catholic priest such as Jerzy Popiełuszko, for example, murdered by the communist regime in Poland in 1984, was deified, but four Catholic missionaries who were raped and murdered in 1980 in El Salvador by U.S.-backed death squads were slandered as fellow travelers of the “Marxist” rebel movement. The techniques also included both narrowing the debate in a way that buttressed the elite consensus and intentionally failing to challenge the intentions of the ruling elites or the actual structures of power.

“Manufacturing Consent” was published on the eve of three revolutions that have dramatically transformed the news industry: the rise of right-wing radio and Fox-style TV news that abandon the media’s faux objectivity, the introduction of 24-hour cable news stations, and the creation of internet platforms—owned by a handful of corporations—that control the distribution of news and information and mine our personal data on behalf of advertisers, political campaigns and the government. The sins of the old media, bad though they were, are nothing compared with the sins of the new media. Mass media has degenerated into not only a purveyor of gossip, conspiracy theories and salacious entertainment but, most ominously, a purveyor of hate. Matt Taibbi, the author of “Hate Inc.: How, and Why, the Media Makes Us Hate One Another,” has dissected modern media platforms in much the same way that Herman and Chomsky did the old media.

The new media, Taibbi points out, still manufactures consent, but it does so by setting group against group, a consumer version of what George Orwell in his novel “1984” called the “Two Minutes Hate.” Our opinions and prejudices are skillfully catered to and reinforced, with the aid of a detailed digital analysis of our proclivities and habits, and then sold back to us. The result, Taibbi writes, is “packaged anger just for you.” The public is unable to speak across the manufactured divide. It is mesmerized by the fake dissent of the culture wars and competing conspiracy theories. Politics, under the assault, has atrophied into a tawdry reality show centered on political personalities. Civic discourse is defined by invective and insulting remarks on the internet. Power, meanwhile, is left unexamined and unchallenged. The result is political impotence among the populace. The moral swamp is not only a fertile place for demagogues such as Donald Trump—a creation of this media burlesque—but also a channel for misplaced rage, intolerance and animosity toward those defined as internal enemies.

The old media sold itself as objective, although as Taibbi points out, this was more a reflection of tone than of content. This vaunted objectivity and impartiality was, at its core, an element of a commercial tactic designed to reach the largest numbers of viewers or readers.

“Objectivity was when I was told I couldn’t write with voice,” Taibbi told me when I interviewed him on my television show, “On Contact.” [Part one of the interview; part two.] “I couldn’t write with a point of view. Objectivity was to write in a dull, flat, third-person perspective. Don’t express yourself. Don’t be too colorful. This actually was, if you pick up The New York Times today, that same writing style. The original idea behind it is you didn’t want to turn off people on the start because they’re trying to reach the widest possible audience. This also infected radio, television. That’s why you have this Tom Brokaw, Dan Rather-style delivery, which was monotonal, flat, unopinionated. A lot of people thought this was some kind of an ethical decision that news organizations were making. In fact, what they were trying to do is reach the greatest number of people to sell the greatest number of ads. That’s how we developed that idea.”

The old media rigidly held to the fiction that there were only two kinds of political opinions—those expressed by Democrats and those expressed by Republicans. These two positions swiftly devolved into caricatures on radio and television. The classic example was the show “Crossfire,” in which two antagonists, the stereotypical liberal and the stereotypical conservative, could never agree. The liberal, Taibbi pointed out, “was always cast as the person who couldn’t punch back. He was always in retreat. The conservative was always in attack mode. A personality like Tucker Carlson.” These staged and choreographed confrontations were, in essence, sporting events.

“If you watch [a ‘Sunday NFL Countdown’] you’ll see the sets are designed exactly the same” as that of “Crossfire.” “The anchor on one side. There’s usually four commentators—two that represent each team. They have graphics that tell you what the score is, who is ahead, who is behind. We want people to perceive politics as something they have a rooting interest in. There’s no possibility of any gray area in any of this. Your political identity can’t possibly bleed into any other political identity. You are on one team or another. That’s it. We don’t even acknowledge the existence of people who have different types of ideas. For instance, anti-abortion but also pro-union. Whatever it is. That doesn’t exist in the media.”

The fact that on most big issues the two major political parties are in agreement is ignored. The deregulation of the financial industry, the militarization of police, the explosion in the prison population, deindustrialization, austerity, the endless wars in the Middle East, the bloated military budget, the control of elections and mass media by corporations and the wholesale surveillance of the population by the government all have bipartisan support. For this reason, they are almost never discussed.

“It’s always presented as two parties that are always in disagreement about everything,” Taibbi said, “which is not true.”

“We [members of the press] are not focusing on the timeless, permanent nature of how the system works,” he said. “We don’t think about the central bank. We don’t think about the security state. We don’t think about any of that stuff. We focus on personalities. Donald Trump versus Alexandria Ocasio-Cortez. That simplifies everything and allows us to not look at the bigger picture.”

Once the old media model imploded with the arrival of 24-hour news networks, Fox-style news and the internet, the monopoly of a few dominant newspapers and networks ended. In the new setting, media organizations tailor their content to focus on specific demographics.

“MSNBC, which has gone through some interesting changes over the years, markets itself as a left-leaning network,” Taibbi said. “But it was so intensely pro-war in 2002 that it had to uninvite Jesse Ventura and Phil Donahue from the network. This latest thing was ‘Russiagate’ and the constant hyping of the narrative ‘If you watch, you might learn any minute that we, along with Robert Mueller, are going to take down the president.’ ”

The media model not only sets demographic against demographic, it mutes and destroys investigations into corporate systems of oppression and genuine dissent.

“You don’t have to make the news up for these people,” Taibbi said of the process of carving up the public. “You can just pick stories that they’re going to like. You start feeding them content that is going to ratify their belief systems. Fox did it first. They did it well. They started to make money. They were No. 1 for a long time. But this started to bleed into the rest of the business. Pretty soon, everybody was doing the same thing. It didn’t matter whether you were the food channel tailoring content for people who liked food or MSNBC who tailored content for people who leaned in a certain political direction, you were giving people stuff they wanted to hear.”

“Previously, you were looking at the illusion of debate,” Taibbi said of the old media model. “You would see people arguing on ‘Crossfire.’ On the op-ed pages, there were people who disagreed with each other. Now, most people’s news consumption experience is tailored entirely to their preferences. … If you’re only reading media that tailor to your particular belief system you’re not being exposed to other ideas. It’s going to be progressively more vituperative.”

“One of the first stories that taught the news business you can actually sell anger as a product was the [Monica] Lewinsky scandal,” Taibbi said.

MSNBC built its brand and its audience by relentlessly warning that the presidency of Bill Clinton was in mortal peril during the Lewinsky investigation. It repeated this formula by spending two years hyping the story of supposed Russian collusion with the Trump administration.

“What they were trying to do was basically create the impression that [a new] ‘Watergate was going on, you better tune in because at any moment this could all go kaput,’ ” Taibbi said of the Lewinsky scandal. “They got an enormous market share. Fox added a twist to it. Fox took the same concept and openly villainized the characters. They decided to make Bill and Hillary Clinton into caricatures and cartoon figures, aging hippies. They kept running clip after clip of Hillary Clinton talking about how she didn’t bake cookies. They knew their audience was going to react to all these images in a certain way. They sold people stories that make them angry. They told them, ‘If you keep tuning in, somehow you are a part of the process. You are a part of this ongoing prosecution of this culture enemy that we’re showing you. … We tell you about somebody you don’t like. We keep telling you about it over and over to dominate the ratings.’ ”

The result, Taibbi argues, is a marketing strategy that fosters addictive and aggressive behavior. The more the habits of readers and viewers on the internet and electronic devices are tracked, the more the addiction and aggression are fed.

“This creates more than just pockets of political rancor,” he went on. “It creates masses of media consumers who have been trained to only see in one direction, as if they have been pulled through history on a railroad track, with heads fastened in blinders, looking only one way. … Even without the vitriolic content, just the process of surfing and consuming the news has a lot of the same qualities as other addictions—like smoking cigarettes or taking drugs. People get addicted to the feel of their phones. They get addicted to the process of turning on screens. You especially get addicted to the idea that you’re going to turn on a news program or read an article and it’s going to tell you something that is going to anger you even more than you were yesterday.”

The template for news, Taibbi writes, is professional wrestling.

“Wrestling was a commercial formula that they figured out worked incredibly well,” Taibbi said of the corporate owners of news outlets. “There was a simplified morality play where there was a good guy, who was called the baby face, and a bad guy they called the heel. They relentlessly hyped the bad guy. The heel was more important in wrestling and more popular than the face. The amount of tickets they can sell is a direct correlation to how much people hate the bad guy. You have to have a hateable heel in order to make the formula work. This is how news works.”

by Chris Hedges, Truthdig |  Read more:
Image: Mr. Fish / Truthdig

Bezos Reveals His Ugly Vision For The World He’s Trying To Rule

“Guess what the best planet is in this solar system?” asked Amazon CEO Jeff Bezos at a recent media event on his Blue Origin space program.

“It’s easy to know the answer to that question,” he continued. “We’ve sent robotic probes like this one to all of the planets in our solar system. Now, some of them have been fly-bys, but we’ve examined them all. Earth is the best planet. It is not close. This one is really good.”

Bezos then went on to discuss his plan to ship humans off of the best planet in the solar system and send them to live in floating cylinders in space.

Bezos claimed that a growing human population and growing energy consumption will force us to choose between “stasis and rationing” and “dynamism and growth,” and that the latter is possible only if humans move off the planet.

“If we’re out in the solar system, we can have a trillion humans in the solar system, which means we’d have a thousand Mozarts and a thousand Einsteins,” Bezos said. “This would be an incredible civilization. What would this future look like? Where would a trillion humans live? Well it’s very interesting, someone named Gerry O’Neill, a physics professor, looked at this question very carefully and he asked a very precise question that nobody had ever asked before, and it was, ‘Is a planetary surface the best place for humans to expand into the solar system?’ And he and his students set to work on answering that question, and they came to a very surprising–for them–counterintuitive answer: No.”

Bezos went on to describe how the limited surface areas, distances, and gravitational forces of the other planets in our solar system make settling them impractical and cost-prohibitive, while building giant space cylinders closer to Earth, each capable of holding a million people, would be far more practical. The cylinders would spin, using centrifugal force to replicate Earth’s gravity.
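
For readers who want to sanity-check the physics of spin gravity, here is a minimal back-of-the-envelope sketch. The 3.2 km radius is an assumption for illustration (the article gives no dimensions); it is roughly the scale Gerard O'Neill proposed for his largest “Island Three” cylinders.

```python
import math

# Back-of-the-envelope check on spin gravity for an O'Neill-style cylinder.
# The 3.2 km radius is an assumption for illustration, roughly the scale
# O'Neill proposed for his largest "Island Three" design.
G_TARGET = 9.81   # desired apparent gravity at the rim, m/s^2
RADIUS = 3200.0   # cylinder radius in meters (assumed)

# The rim of a spinning cylinder experiences acceleration a = omega^2 * r,
# so simulating 1 g requires an angular speed of omega = sqrt(g / r).
omega = math.sqrt(G_TARGET / RADIUS)   # angular speed, rad/s
period = 2 * math.pi / omega           # seconds per full rotation

print(f"angular speed: {omega:.4f} rad/s")
print(f"rotation period: {period:.0f} s (about {period / 60:.1f} minutes)")
# At this radius the cylinder turns roughly once every ~114 seconds,
# about half a revolution per minute.
```

At that size the spin is slow, well below the few-revolutions-per-minute range usually cited as noticeable to inhabitants, which is one reason O'Neill favored very large cylinders.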

Bezos’ presentation included several illustrations of what these “O’Neill colonies” might look like.

“These are really pleasant places to live,” Bezos said. “Some of these O’Neill colonies might choose to replicate Earth cities. They might pick historical cities and mimic them in some way. There’d be whole new types of architecture. These are ideal climates. These are short-sleeve environments. This is Maui on its best day, no rain, no storms, no earthquakes.”

No rain? No weather? Just big, spinning cylinders floating monotonously in space? A trillion divided by a million is one million, which means that the best idea the richest man in the world can come up with for the future of our species is to fill our solar system with a million of these floating homogenized space malls.
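
For anyone double-checking that division, here is the one-line worked version in round powers of ten (a trillion is $10^{12}$, a million is $10^{6}$):

\[
\frac{10^{12}\ \text{people}}{10^{6}\ \text{people per cylinder}} = 10^{6}\ \text{cylinders}
\]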

“If we build this vision, these O’Neill colonies, where does it take us? What does it mean for Earth?” Bezos asked. “Earth ends up zoned residential and light industry. It’ll be a beautiful place to live, it’ll be a beautiful place to visit, it’ll be a beautiful place to go to college, and to do some light industry. But heavy industry, polluting industry, all the things that are damaging our planet, those will be done off Earth. We get to have both. We get to keep this unique gem of a planet, which is completely irreplaceable–there is no Plan B. We have to save this planet. And we shouldn’t give up a future for our grandchildren’s grandchildren of dynamism and growth. We can have both.”

Now, if you look at the behavior of Jeff Bezos, who exploits his employees and destroys his competitors, and who some experts say is trying to take over the underlying infrastructure of our entire economy, you can feel reasonably confident that this man has no intention of leaving “this unique gem of a planet”, nor of having the heirs to his empire leave either. When you see this Pentagon advisory board member and CIA contractor planning to ship humans off the Earth’s surface so the planet can thrive, you may be certain that he’s talking about other humans. The unworthy ones. The ones who weren’t sociopathic enough to climb the capitalist ladder by stepping on the backs of everyone else.

And make no mistake, when Bezos talks about saving the planet for “our grandchildren’s grandchildren”, he’s not just talking about his heirs, he’s talking about himself. Bezos has invested large amounts of wealth in biotech aimed at reversing the aging process and cracking the secret of immortality.

This is the sort of guiding wisdom that is controlling the fate of our species, everyone. The world’s most ambitious plutocrat envisions a future in which, rather than evolving beyond our destructive tendencies and learning to live in collaboration with each other and our environment, we are simply shipped off into space so that he can stretch out and enjoy our beautiful planet. That’s his best idea.

Our plutocratic overlords aren’t just sociopaths. They’re morons.

Bezos’ incredibly shallow vision for humanity reminds me of something Julian Assange said via video link at a 2017 London festival about the way Silicon Valley plutocrats are trying to become immortal by uploading their brains onto computers.

“I know from our sources deep inside those Silicon Valley institutions, they genuinely believe that they are going to produce artificial intelligences that are so powerful, relatively soon, that people will have their brains digitized, uploaded on these artificial intelligences, and live forever in a simulation, therefore will have eternal life,” Assange said. “It’s a religion for atheists. They’ll have eternal life, and given that you’re in a simulation, why not program the simulation to have endless drug and sex orgy parties all around you. It’s like the 72 virgins, but it’s like the Silicon Valley equivalent.”

I mean, damn. First of all, how stupid do you have to be to overlook the fact that science has virtually no understanding of consciousness and doesn’t even really know what it is? Even if these idiots find a way to upload their neurological patternings onto some AI’s virtual simulation, it’s not like they’d be there to experience it. It would just be a bunch of data running in a computer somewhere, mimicking the personality of a dead person and experienced by no one. People who believe that all there is to them is their dopey mental patterns have not spent any time whatsoever exploring what they are, and have no idea what it is to be human. The fact that anyone would think they could become immortal by digitizing their churning, repetitive personality patterns is crazy, and the fact that they’d want to is even crazier.

People who think this way should shut up and learn about life, not rule the world in a plutocratic system where money translates directly to political influence. People who think that humans can be happily unplugged from the ecosystemic context in which they evolved, the ecosystemic context of which they are an inseparable part, and people who think they can become immortal by uploading their wanky personalities onto a computer should shut the fuck up, spend some time alone with themselves, maybe try some psilocybin mushrooms, and learn a bit about what it means to be human. They certainly shouldn’t be calling the shots.

Earth is our home. It’s what we’re made for. The Earth went through a lot to give you life. Sparks had to catch, oceans had to freeze, billions of cells had to survive endless disease; all of these amazing things had to happen just right. You belong here. You are as much a creation of the Earth as the air you breathe. You may feel like a singular organism, but you are no more separate from the Earth than any one of the billions of organisms that make up your body is separate from you. You and the Earth are one. And because you evolved here, you are perfectly adapted to the Earth and it is perfectly adapted to you. It yearns for your breath as you yearn for its breeze on your face.

by Caitlin Johnstone, CaitlinJohnstone.com |  Read more:
Images: Jeff Bezos

Monday, May 27, 2019

In Your Hands, My Dissatisfied Countrymen: The Jaquess-Gilmore Mission

“I worked night and day for twelve years to prevent the war, but I could not. The North was mad and blind, would not let us govern ourselves, and so the war came.” —Jefferson Davis, July 1864

By the time Sherman’s armies had scorched and bow-tied their way to the sea, by the time Halleck had followed Grant’s orders to “eat out Virginia clean and clear as far as they go, so that crows flying over it for the balance of the season will have to carry their own provender with them,” and by the time Winfield Scott’s Anaconda Plan was finished squeezing every drop of life out of the Confederacy, there had to be those who wondered what possible logic would lead intelligent men like Jefferson Davis to make such a catastrophic choice.

Yet the South almost won the gamble. With secession, they had challenged the core of the American Experiment: the democratic principles of equal rights, general (male) suffrage, government by a majority, and a peaceful transition of power when that majority so indicated. They also posed an existential question for the North: Was adherence to a principle, even a cherished one like the Union, worth lives and property?

The Civil War is fascinating on so many levels, but what made it fundamentally different from any conflict that preceded it was that, for the first time, two peoples with the ability to exercise electoral oversight engaged in a protracted armed conflict. This implied something new: the simplest mechanisms of civic belief (the right to disagree publicly, to organize, to put elected leadership on notice that their jobs could be at risk) would all play an unexpectedly crucial role in how the war began and was ultimately prosecuted.

There is no question that the issue was ripe. The Jefferson Davis quote, self-serving though it may be, reflects the reality of a political war that had been going on for more than a decade. The South (or at least the Fire-Eaters, who were more influential than their numbers would imply) had talked themselves into eternal dissatisfaction. They had also convinced themselves that the North would lack backbone when pushed. Threatening to leave if Lincoln were to win the election wasn’t just bluster. The South saw him as a real threat, and secession mutterings progressed to secession organizing. Several Southern States did legislative groundwork in anticipation of a Lincoln win, and they swung into action immediately thereafter. Between December 20, 1860, and February 1, 1861, South Carolina, Mississippi, Alabama, Florida, Georgia, Louisiana, and finally Texas seceded, all before Lincoln could even take the Oath of Office.

Of course, that wasn’t the entire South—there was significant Unionist sentiment in many places, and slaveholding Border States actually remained loyal. Bear in mind as well that in no state, even in the Deep South, did slaveholders or even slaveholding families represent a majority. But, as William Freehling points out in The Reintegration of American History, a different type of political culture predominated, one that was far more hierarchical and patriarchal. It wasn’t just that the economic elites (often meaning the Planter Class) held power. It was the way they expressed that power: A plantation required a self-sacrificing leader making all the decisions and receiving in return obedience from inferiors—slaves, employees, tradesmen, wives, and people of a lower social standing. Apply that mind-set to politics, and you have the few choosing for the many.

As to the North, the picture was more complex, in part because divergent views had greater access to power. There were plenty of Southern sympathizers, not just Copperheads but also “Doughfaces” like outgoing President James Buchanan. There were bankers and business people who wanted access to Southern markets. There were also many rank-and-file Democrats who stuck, out of loyalty, with an increasingly Southern-dominated party. But, just as Southern anger burned, so did resentment in the North. The constantly escalating Southern demands, always couched in hyperbolic terms, grated. Lincoln himself had made this point powerfully at Cooper Union in early 1860—the South demanded not just Northern obeisance, but also Northern complicity in what many thought of as a profound moral wrong.

Secession made it palpable, real, and now. It forced Northerners to decide whether the whole thing was worth it. Maybe another set of concessions would work, but, if not, why not just let the slavemongers go their own way, and be done with the problem altogether? The entire country held its breath.

It was at this point that the South (or at least South Carolina) took a fateful step: They fired first, shelling Fort Sumter in Charleston Harbor on April 12, 1861. Northern public opinion moved sharply in the direction of intervention, but Lincoln’s call for 75,000 volunteers to put down the rebellion also induced four more Southern States (first Virginia, the big prize, then Arkansas, North Carolina, and Tennessee) to join the Confederacy.

Why did South Carolina do it? There was no strategic reason—Major Anderson lacked the supplies to sustain his command for any length of time. It’s reasonable to suspect that the aristocratic Southern leadership, steeped in the culture of honor and duels, and contemptuous of a presumed Northern lack of manly fiber, simply assumed it would be easy.

But if the firing on Fort Sumter misjudged the situation, a second Southern assumption was far sounder: No quick strike was going to end the rebellion. The South could fight a largely defensive war—the North would have to come to them, and they felt they had superior military leadership, easier logistics, and far better knowledge of the topography. In short, the South could win just by not losing, and the longer the war went on, the greater the risk to Lincoln that public support would erode.

In the short run, this is exactly what occurred. Poor generalship and tactics led to Northern defeats on the battlefield and at the ballot box: In 1862, Democrats gained 27 seats in the then-184-seat House of Representatives. Northern fortunes picked up militarily in 1863 and early 1864 with the victories at Gettysburg and Vicksburg, but by midsummer the armies were back to slogging it out, with heavy loss of life.

Criticism of Lincoln intensified. Seemingly everyone across the Northern political spectrum found something to dislike in his policies. Influential thought leaders like William Cullen Bryant, Horace Greeley, and Theodore Tilton concluded that Lincoln was a failure and needed to be replaced. Others started a John C. Frémont movement, which, had it persisted, could have seriously damaged Lincoln’s chances in the general election. At the Republican National Convention in Baltimore, whispers began about a Grant candidacy. Lincoln did secure re-nomination, but, right afterwards, hostilities with his own party broke out again when he pocket-vetoed the Wade-Davis Reconstruction Bill, which was far more punitive than he wanted and would have seized control of Reconstruction from the Executive Branch.

The disappointments mounted. In July 1864, Confederate General Jubal Early launched a surprise raid on Washington itself and almost broke through. By August, against a backdrop of continued military frustrations and a revitalized Democratic Party about to nominate General George McClellan, Lincoln wrote his famous “Blind Memo” to his Cabinet: “This morning, as for some days past, it seems exceedingly probable that this Administration will not be re-elected. Then it will be my duty to so co-operate with the President elect, as to save the Union between the election and the inauguration; as he will have secured his election on such ground that he cannot possibly save it afterwards.” At the very time that memo was written, a serious attempt was under way to hold a second convention, in Cincinnati, on September 28, to replace Lincoln as the nominee.

Weariness with the war wasn’t confined to the North, although Davis was more secure as a result of his six-year term. The public wanted something done. This gave rise to an unusual number of peace talks, peace feelers, and ad hoc peace conferences. Lincoln even sent the ever-complaining Horace Greeley to one. But perhaps the most interesting was the only one to engage Jefferson Davis directly: the ragtag Jaquess-Gilmore Mission.

Colonel James Jaquess was a Methodist preacher and soldier from Illinois, James Gilmore a businessman from New York with contacts in the South. Jaquess had long been obsessed with brokering peace between the two sides, and requested leave to travel South to meet with like-minded people. Finally, in June of 1864, Lincoln gave the two a pass to travel to Richmond and attempt to connect with the Confederate President. They were given no formal status or negotiating authority, but were made generally aware of Lincoln’s bottom line—a reconstituted Union, the end of hostilities, and emancipation.

After a preliminary meeting with Judah P. Benjamin, then the Confederacy’s Secretary of State, they were granted an audience with Davis himself on July 17, 1864. What followed was an extraordinary back-and-forth that may give us as clear a roadmap to Davis’s thinking as we could possibly find. Gilmore later published an account of the meeting, and, making allowances for period language and perhaps a little puffery, it is worth reading in its entirety.

What first strikes you is how absolutely clear Davis was: The war could only end with the North withdrawing. The blame was entirely on them (“At your door lies all the misery and the crime of this war—and it is a fearful, fearful account.”). The North, by insisting on Union, “would deny to us what you exact for yourselves—the right of self-government.” When Jaquess suggested that he had many Southern friends who wished reconciliation, Davis disagreed: “They are mistaken… They do not understand Southern sentiment.”

Jaquess was a determined man, and he pressed his case. Surely peace was desirable? Davis was unmoved: “I desire peace as much as you do. I deplore bloodshed as much as you do; but I feel that not one drop of the blood shed in this war is on my hands; and I look up to my God and say this. I tried all in my power to avert this war. I saw it coming, and for twelve years I worked night and day to prevent it, but I could not. The North was mad and blind; it would not let us govern ourselves; and so the war came, and now it must go on till the last man of this generation falls in his tracks, and his children seize his musket and fight his battles, unless you acknowledge our right to self-government. We are not fighting for Slavery. We are fighting for independence, and that or extermination we will have.”

Over and over Davis returns to his central themes. Independence is non-negotiable. The South hates the North and will never rejoin it, and the North has no right to demand it stay. Each State is only in the Union as a result of a consent that can be withdrawn at any time.

Jaquess then proposes something so far-fetched that it is incredible he could have possibly thought either Lincoln or Davis would ever agree to it: an armistice, followed by a national vote that would choose between two competing proposals—(1) Peace with Disunion, or (2) Peace with Union, emancipation, no confiscation, and universal amnesty.

Davis rejects it, first with the technical objection that one Southern State had no legal right to end slavery in another. But then, in just a few words, he defines why any vote would never be acceptable, no matter the terms: “We seceded to rid ourselves of the rule of the majority, and this would subject us to it again.”

With that, the substantive part of the discussion is over. Jaquess and Gilmore take their leave, and on the way out are met by Judge Robert Ould, who had helped arrange the meeting. Ould inquires about the results: “Nothing but war—war to the knife,” says Gilmore. Jaquess, who had staked so much emotionally on his ability to broker peace, is clearly disappointed at Davis’ fixation on an impossible result. Quoting Hosea 4:17, he adds, “Ephraim is joined to his idols—let him alone.”

by Michael Liss, 3QD | Read more:
Image: uncredited