Saturday, November 9, 2019

Timeout: Estonia Has a New Way to Stop Speeding Motorists

On the desk of a government building, a diorama is laid out. Little vehicles sit by the side of a road, watched over by little policemen. On two recent mornings, this scene was recreated in real life. Drivers caught speeding along the road between Tallinn and the town of Rapla were stopped and given a choice. They could pay a fine, as normal, or take a “timeout” instead, waiting for 45 minutes or an hour, depending on how fast they were going when stopped.

The aim of the experiment is to see how drivers perceive speeding, and whether lost time may be a stronger deterrent than lost money. The project is a collaboration between Estonia’s interior ministry and the police force, and is part of a programme designed to encourage innovation in public services. Government teams propose a problem they would like to solve—such as traffic accidents caused by irresponsible driving—and work under the guidance of an “innovation unit”. Teams are expected to do all fieldwork and interviews themselves.

“At first it was kind of a joke,” says Laura Aaben, an innovation adviser for the interior ministry, referring to the idea of timeouts. “But we kept coming back to it.” Elari Kasemets, Ms Aaben’s counterpart in the police, explained that, in interviews, drivers frequently said that having to spend time dealing with the police and being given a speeding ticket was more annoying than the cost of the ticket itself. “People pay the fines, like bills, and forget about it,” he said. (In Estonia, speeding fines generated by automatic cameras are not kept on record and have no cumulative effect, meaning that drivers don’t have their licences revoked if they get too many.)

Making drivers wait requires manpower. The team acknowledges that the experiment is not currently scalable, but hopes that technology could make it so in the future. Public reaction, though, was not what they expected. “It’s been very positive, surprisingly,” says Helelyn Tammsaar, who manages projects for the innovation unit.

by The Economist |  Read more:
Image: uncredited
[ed. Pretty smart. I imagine the babysitting problem could be overcome with temporary geotags or something. See also: Japanese commuters try new ways to deter gropers (The Economist).]

You Are What You (Don’t) Eat

In the summer of 2016, James and Becca Reed, a lower-income couple living in Austin, Texas, decided it was time to save their lives. The Reeds, married more than twenty-five years, had become morbidly obese, diabetic, and depressed. They were taking a combined thirty-two medications. Only in their early fifties, they had arrived at this condition via a well-trod path: They ate their way into it. They did no more than consume what the American food industry not only offers in abundance—salt, starch, and sweetness—but also encourages us to eat.

As nearly 40 percent of the adult US population can attest, it doesn’t take a lot of time, effort, or expense for the consequences of the American way of eating to add up. A steady diet of processed and fast food, oversized restaurant meals, and “favorited” takeout options can quickly make the average American a victim of the growing obesity epidemic. Considering that the Reeds live paycheck to paycheck, and given what we know about the strong link between economic disadvantage and poor eating choices, I was especially intrigued when a friend, who knew James and Becca from church, told me about this really interesting couple getting ready to reclaim their health in a dramatic way.

With disarming generosity, the Reeds opened their lives to me as they undertook their mission. For three months I followed and documented their progress, meeting with them several times a week, usually at the small gym they attended (on the gym owner’s dime) to talk as they exercised. What they did was both miraculous and subversive. The miraculous part is in the numbers. Becca’s blood sugar level dropped from an alarming 200 milligrams per deciliter (mg/dL) to a very normal 80 mg/dL; James’s cholesterol went from well over a borderline 200 mg/dL to a safe 153; both were losing 15 to 20 pounds a month. They reversed their diabetes—Becca’s score on the glycated hemoglobin test (6.5 or higher indicates the presence of diabetes) plummeted from 9.75 to 5.8—and stopped taking most of their medications. With remarkable efficiency, the Reeds did as planned. They saved their lives.

But as physically conspicuous as their transformation was (soon their clothes were hanging off their bodies), the ultimate driving force behind the Reeds’ success was subversive: They escaped a food system that had been eroding their health. On the surface, the Reeds did what healthy Americans habitually do—they walked more, went to the neighborhood pool after work, cut back on screen time, and hit the gym a few times a week. But these measures, at least when it came to emotionally sustaining their journey, struck them as too anodyne, too lacking in the sort of meaning they wanted to experience through their efforts. As they often remarked, it would have been easy to cheat on their routines unless there had been a moral dimension to their crusade. Healthful activities might have been central to their transformation, but they did not provide what the Reeds needed most: a community bound by a set of stipulations that mattered—in effect, a creed.

So when it came to confronting the food system in which the Reeds had long been entrapped, they decided it was not enough to behave like most relatively healthy Americans. Instead, they needed to adopt an entirely new identity and wrap their reinvented selves in its defining cloak. The Reeds did so by going vegan.

Food Fills the Spiritual Void

In Food Cults: How Fads, Dogma, and Doctrine Influence Diet, Kima Cargill, a psychology professor at the University of Washington, Tacoma, writes that “membership in food cults serves the same psychological functions of cult membership of any kind.” People are attracted to cohesive groups as a means of defining identity, or as Cargill puts it, “delineating in-group and out-group membership.”

As a physical matter, the Reeds did not regain their health because there is something inherently beneficial about being vegan—there’s not. It’s possible, indeed easy, to be an unhealthy vegan. Rather, the Reeds’ transition resulted, predictably, from adopting certain perfectly unremarkable practices: more exercise, portion control, and the consumption of real food, mostly vegetables, rather than processed junk. But what the vegan diet did for the Reeds was exactly what Cargill suggests it does. It allowed them to frame otherwise dull choices in an exclusive and essentialist—and often very exciting—ideology, one that gave them a sense of conviction and community. In this respect, veganism, like many rigorous diet schemes, functions like a cult, with an ethic rooted in what members won’t eat and the value imbued in that denial.

The ghost of religion hovers like a mist over America’s sprawling dietary landscape. Catholics’ abstinence from meat on Fridays, Jews’ avoidance of the flesh of cloven-hooved beasts, and the Hindus’ vegetarianism are well-known, identity-forming convergences of diet, faith, and community. But Cargill takes this religious association further, suggesting that the secularization of modern culture “has left many searching for the structure and identity that religion once provided.” Given this spiritual void, she explains, “food cults arguably replace what religion once did by prescribing organized food rules and rituals.” These are rules and rituals that—whether the diet is vegan or vegetarian, paleo or primal, Mediterranean or South Beach—nurture identities that keep us loyal, insularly focused, and passionate about what we will and, even more significantly, will not eat.

As in any religious quest, the themes of reform and conversion overlap. From Upton Sinclair’s The Jungle to Michael Pollan’s The Omnivore’s Dilemma, a century of literature has demonized verboten foods in the interest of improving personal health and, more importantly, the quality of the nation’s food supply. Whether the offending choice is industrial meat, all meat, farmed fish, processed food, food that grandma didn’t eat, or fast food, the message is one that has been internalized as a mainstream cultural critique: Our food system is in shambles and it must, as a moral imperative, be reformed. Today, it’s no surprise that a relatively new social movement—the “Food Movement”—has emerged around these impassioned exhortations and prohibitions, fueled by congregations of the faithful urging us to “vote with our forks” to fix the system. The personal diet has become not only a cult; it has become a political statement.

Vegans, slow foodies, sustainable foodies, pescatarians, vegetarians, paleos, primals, fruitarians, juicers—this ever-expanding list of dietary sects demonstrates how we can still find new ways to define ourselves in an American dietary landscape seemingly mined to the point of exhaustion. Given the pervasive corruption and seductive power of the system by which food is produced and then presented to the American consumer, as well as our sense of political impotence in the face of this system, it’s hard not to credit the decision and commitment of someone who seeks salvation in a cult-diet conversion. But for all the options to go that route and for all our feverish enthusiasm for such diet regimes, there’s a more fundamental issue we seem to be neglecting: the larger food system itself.

Big agriculture’s fundamental problems—the emphasis on factory-farmed meat and dairy, fertilizer-intensive corn and soy production, the failure to grow a diversity of nutrient-dense plants for people to eat (rather than corn and soy for animals), agricultural policies that favor large corporate farms—have become even more entrenched. Indeed, in the last half-century, industrial food has become only more aligned with the logic of industrial animal production, less diverse in nutrients and real foods, and more reliant on mechanization (and, now it seems, artificial intelligence). All this has occurred even as cult diets have flourished. The question is thus unavoidable: Could individuals voting with their forks—thereby identifying with a diet (or at least a movement)—distract from or even undermine what we really should be doing to reform our food system: reimagining it altogether?

by James McWilliams, The Hedgehog Review |  Read more:
Image: via

On Hawaii, the Fight for Taro’s Revival

Sun breaks through the clouds overlooking the Kako‘o Oiwi farm on Oahu. Its goals are to restore agricultural and ecological productivity for the benefit of the local community.

On Hawaii, the Fight for Taro’s Revival (NY Times)
Image: Scott Conarroe

Friday, November 8, 2019

Experience: My Face Became a Meme

Nine years ago, I did a reverse image search on a photograph of me and was shocked to discover it had become a meme. People online thought my smile, combined with the look in my eyes, seemed terribly sad. They were calling me “Hide the Pain Harold”.

The photo came from a shoot I’d done a year earlier, when I was still working as an electrical engineer. A professional photographer had got in touch after seeing my holiday photographs on Facebook. He said he was seeking someone like me to be in some stock images. Everyone is a little vain inside, myself included, so I was happy that he wanted me. He invited me to a photoshoot near my home in Budapest and we took shots in different locations and settings. Over the course of two years he took hundreds of pictures of me for photo libraries.

I thought the pictures would just be used by businesses and websites, but I wasn’t expecting the memes. People overlaid text on my pictures, talking about their wives leaving them, or saying their identity had been stolen and their bank account emptied. They used my image because it looked as if I was smiling through the pain.

Once the memes were out in the world, journalists began to contact me, and wanted to come to my home to interview me. My wife hated it: she thought it interfered in our private life and didn’t like the way I was portrayed. People thought I wasn’t a real person, that I was a Photoshop creation – someone even got in contact asking for proof that I existed.

I knew that it was impossible to stop people making memes, but it still annoyed me that Facebook pages, some with hundreds of thousands of followers, were using my photograph as their profile picture, and pretending to be me. Some kind of brand had been made out of me and I would have been a fool not to make use of it. So, in 2017, I created my own Facebook fan page and updated it with videos and stories from my travels.

That started everything going. People noticed that I had taken ownership of the meme and got in contact to offer me work. I was given a role in a television commercial for a Hungarian car dealer. In one of the adverts, I travelled to Germany to buy a used car and it broke down halfway home; if I had bought the same car through their company, the brand claimed, it wouldn’t have happened. The fee for that commercial changed my wife’s mind about the meme.

by András Arató, The Guardian | Read more:
Image: Bela Doka/The Guardian

BS


via: 

The Dictionary of Capitalism

As you encounter the world around you, you will hear many words that go undefined. It is the task of Current Affairs to explain these terms in intelligible language, so that readers may perceive the true nature of things.

  • American lives /əˈmɛrɪk(ə)n lʌɪvz/ n. the units by which the human toll of a war is measured.
  • body camera /ˈbɒdi ˈkam(ə)rə/ n. a magic technology capable of turning off whenever a police officer commits a crime.
  • border /ˈbɔːdə/ n. an imaginary line drawn through the world, the crossing of which is met with violence on behalf of those who may live thousands of miles away from the line.
  • charity /ˈtʃarɪti/ n. organization that attempts to partially compensate for capitalism’s failures.
  • charter school /ˈtʃɑːtəskuːl/ n. a school that insists it can provide better education than a public school because the people who run it are not accountable to anyone.
  • corporation /kɔːpəˈreɪʃ(ə)n/ n. a collectivist enterprise in which all power is vested in a small number of central planning bureaucrats at the top and the individual must surrender their autonomy to serve the interests of the planning authority.
  • death tax /dɛθ taks/ n. when, upon a person becoming deceased, the state declines to transfer that person’s accumulated freedom-unit score to a different, arbitrary person who did not earn those freedom-units.
  • disruption /dɪsˈrʌpʃn/ n. changing an industry standard by finding ways to pay people less for their labor.
  • education /ɛdjʊˈkeɪʃ(ə)n/ n. the process of being filtered for employment based on one’s pliability and deference.
  • employment /ɪmˈplɔɪm(ə)nt/ n. being granted permission to continue living in exchange for an adjustable daily quantity of toil.
  • entitlements /ɪnˈtʌɪt(ə)lmənts/ n. small increases in the freedom-units afforded to the elderly, sick, and those who have too few units to survive.
  • eviction /ɪˈvɪkʃ(ə)n/ n. forcible removal from one’s home for failure to provide the appropriate lord with sufficient tribute.
  • fighter plane /ˈfʌɪtə pleɪn/ n. a plane that fights other planes to the death.
  • fiscal responsibility /ˈfɪsk(ə)l rɪˌspɒnsɪˈbɪlɪti/ n. only spending money on things that kill people.
  • food stamps /fuːd stamp/ n. a system by which the poor must humiliate themselves on a six-monthly or yearly basis in order to buy a tiny amount of food they will be shamed for possessing. (Use of the stamp for lobster will result in congressional hearings.)
  • fossil fuel /ˈfɒs(ə)l fjuː(ə)l/ n. a suicide pill; brings gratification of immediate pleasures but causes death to one’s self and one’s offspring.
  • genocide /ˈdʒɛnəsʌɪd/ n. something white people fear will happen to them; all other uses are disputed.
  • health insurance board of directors /hɛlθ ɪnˈʃʊər(ə)ns bɔːd (ə)v dʌɪˈrɛktəs/ n. death panel.
  • homeless person /ˈhəʊmlɪs ˈpəːs(ə)n/ n. a person we are choosing not to house in any of the X million properties currently empty.
  • journalist /ˈdʒəːn(ə)lɪst/ n. a transcriptionist used by anonymous state officials to nudge public opinion in an appropriate direction.
  • landlord /ˈlan(d)lɔːd/ n. precisely what it sounds like: the feudal ruler of a plot of land, entitled to extract wealth from all inhabitants.
  • limited government /ˈlɪmɪtɪd ˈɡʌv(ə)nˌm(ə)nt/ n. government that restricts its role only to taking care of rich people.
by Current Affairs |  Read more:
Image: uncredited

Away Fans are Taking Over the NFL

The Los Angeles Chargers returned to southern California on Sunday after playing the previous two weeks on the road, but it didn’t make much difference. Home-field advantage doesn’t really apply to the Chargers, not when visiting fans routinely make the team feel like they’re behind enemy lines in their own stadium. That was the case again on Sunday, when the Chargers hosted the Green Bay Packers. The predominant color in the stands was the green of the visitors, and the cheers rang out louder for Aaron Rodgers than Philip Rivers. The home team won, convincingly at that, but most people left the stadium disappointed.

It has become one of the peculiar features of the NFL calendar since both the Chargers and Rams relocated to Los Angeles in 2017, marking a reunion between America’s second-largest market and its most popular sporting league: more often than not, the teams’ home games look and sound like home games for the opposition. Chargers players were showered with boos when they took the field against the visiting Philadelphia Eagles two years ago. The Rams got the same treatment last season at home against the Packers. Both Rivers, the Chargers quarterback, and Rams quarterback Jared Goff have regularly been forced to use a silent count to combat the noise generated by the away side’s fans, typically an unnecessary measure to take for a team playing at home.

“It’s certainly not ideal,” Rivers said with a hint of resignation after the 2017 game against the Eagles. The home-field hostility hit a fresh apex for both teams on the same Sunday last month. That afternoon, the Rams were overwhelmed on the field and in the stands, which were blanketed by the red of the visiting San Francisco 49ers. “This turned into a home game pretty quickly,” said San Francisco quarterback Jimmy Garoppolo after the game. “I’ve never seen anything like it.”

It hasn’t been quite as enjoyable for the ostensible home teams. A few hours later that day, the Chargers hosted the Pittsburgh Steelers, whose fans roared with approval when the stadium PA system blasted their team’s adopted anthem, Renegade by Styx. It was supposed to be a gag; the song eventually transitioned to Never Gonna Give You Up by Rick Astley, the punctuation to a long-running internet prank. But the joke didn’t land, and Chargers players were miffed.

“It was crazy,” Chargers running back Melvin Gordon said. “They started playing [the Steelers’] theme music. I don’t know what we were doing – that little soundtrack, what they do on their home games. I don’t know why we played that.” Chargers offensive lineman Forrest Lamp was more blunt: “We’re used to not having any fans here. It does suck, though, when they’re playing their music in the fourth quarter. We’re the ones at home. I don’t know who’s in charge of that, but they probably should be fired.”

The go-to line from Rams and Chargers brass is that it will take time to cultivate a true fan base in Los Angeles. Chargers owner Dean Spanos, who engineered the franchise’s move from San Diego after voters there rejected his bid for public funding of a new stadium, told the New York Times earlier this year that it will “take maybe a generation” for the team to find its footing in LA. On Tuesday this week, he was forced to deny rumors the team has discussed relocating to London.

by Tom Kludt, The Guardian | Read more:
Image: Jake Roth/USA Today Sports

Thursday, November 7, 2019

Julian Lage, Scott Colley, Kenny Wollesen



[ed. The most talented jazz guitarist working today (well, most talented young guitarist anyway).]

Why We Wish for Wilderness

Wilderness, the environmental historian Roderick Nash has argued, is not so much a place as an idea. Nash’s essential book Wilderness and the American Mind, which traced the long evolution of attitudes toward wild places from fear and avarice to awe and nostalgia, was submitted as a doctoral thesis at the University of Wisconsin in 1964, the same year that the Wilderness Act enshrined environmental activist Howard Zahniser’s somewhat whimsical legal definition of wilderness as “an area where the earth and its community of life are untrammeled by man, where man himself is a visitor who does not remain.”

That anthropocentric definition—wilderness is wherever we’re not—has sparked plenty of debate in conservation circles about what types of places deserve protection, from whom, and for whose benefit. “[I]f nature dies because we enter it,” another University of Wisconsin environmental historian, William Cronon, proposed in 1995, “then the only way to save nature is to kill ourselves.” But the idea of wilderness as the absence of humans remains implicit in much of how we think and talk about wild places. Nash traced this conception back to the advent of herding, agriculture, and settlement 10,000 years ago, when lines both literal and metaphorical began to be etched into the land to delineate where human dominion started and stopped. And this still has a particular American resonance thanks to the lingering influence of the historian Frederick Jackson Turner’s Frontier Thesis, first proposed in 1893, which attributed the emergence of a hardy and distinct national character in the 1800s to the century-long process of conquering the wild and untrammeled West.

You can still see this idea play out in popular survival shows like Man vs. Wild. Bear Grylls and his peers parachute into barren hinterlands that are always completely devoid of all traces of human presence, and triumph over the malign forces of nature while relying only on fortitude, MacGyver-like cunning, and occasionally, when dehydration strikes, their own bodily fluids. The same vibe of self-reliance and unflappable omnicompetence runs through Shoalts’s books—like the time, on a trip deep into British Columbia’s Great Bear Rainforest, with no gun and no bear-repellent spray, he returns to his tent to find fresh grizzly claw slashes in the bark of a nearby cedar:
The only tools at my disposal were a hatchet, my knife, and a folding saw, along with some rope and paracord. With these tools I could fashion a sleeping platform between four trees a safe distance off the ground… Working quickly to beat the sunset, I had to shinny up each tree and lash together the strong sticks I had cut to create a platform between four hemlocks. After the frame was finished, I cut sticks that I would bind to the rectangular platform, creating a solid floor to sleep on. For protection against the rain, I made a roof out of my tarp and enclosed the sides with hemlock boughs. To make the floor comfortable, I laid moss and more hemlock boughs over the platform. My shelter finished, it served as a rather cozy abode for the next five nights.
Though undeniably entertaining, this adversarial approach to a human-less nature, implicit even in the title Alone Against the North, carries a lot of baggage. Most obviously, it assumes either that the lands to be explored are uninhabited, or that their inhabitants somehow don’t count. The writer and explorer Kate Harris, writing in The Walrus, mocked Shoalts for claiming to have “‘discovered’ waterfalls in [the Moose Cree First Nation’s] traditional territory (when he accidentally canoed over them, no less).” The Globe and Mail’s review similarly called out his elision of indigenous knowledge and his “misguided reverence for the lumbering spirit of European colonialism.”

My initial reaction to these criticisms was defensive—on both Shoalts’s behalf and my own. After all, Shoalts had clearly addressed these concerns pre-emptively with two distinct arguments. The first was geographical: unlike more densely populated areas farther south, Canada’s subarctic wilderness is both imponderably vast and all but uninhabitable. The Again River is located in the Hudson Bay Lowlands, a swampy, bigger-than-Minnesota wetland most notable for its polar bears and for having the highest concentration of bloodsucking insects in the world. Aboriginal peoples certainly ventured into the area along its major rivers, but, according to Shoalts, they considered it “sterile country,” and there’s little evidence of sustained pre-contact indigenous inhabitation. Given the near-nonexistent settlement, along with the fact that Canada has something like three million lakes—no one has ever succeeded in counting them properly—and innumerable rivers, creeks, and ponds, the math argues overwhelmingly against the notion that humans have visited every single one of these waterways.

Of course, as Shoalts himself acknowledges, there’s simply no way of knowing for sure whether anyone in previous centuries, let alone previous millennia, has ever visited a given place. But his second argument is that this doesn’t matter, because exploration isn’t just about hair-raising adventures or breaking new ground, but about “the generation of new geographical information that adds to humanity’s stock of collective knowledge.” If someone once paddled the Again but didn’t file any report of it, preferably with an august geographical society brimming with cabinets of yellowing files that date back to earlier centuries, then Shoalts has not been pre-empted in performing the task as he defines it.

On that narrow question of whether Shoalts’s trip down the Again represents some sort of geographical first, I believe he’s right. But the broader issues raised by critics left me uneasy. Do our romanticized vision of exploration and perhaps the very concept of wilderness itself require a mental erasure of the native experience—one that echoes their physical removal from the places we now treasure as national parks? Contrary to what William Denevan, another University of Wisconsin scholar, called “the Pristine Myth,” it’s now widely accepted that the pre-contact population of the Americas was vastly greater than once thought, with some estimates exceeding 50 million. The only reason early European settlers found the land seemingly empty was that as much as 95 percent of the indigenous population had already been wiped out by European diseases transmitted at the initial time of contact. Moreover, the “primeval” landscapes and fauna, from the Great Plains to the Amazon rainforest, didn’t exist in some untouched, Edenic original form; they had already been widely and deliberately modified by fire, agriculture, hunting, and other human activity.

All of this undercuts the stories I like to tell myself about why I love wilderness travel—that, beyond the clean air and nice scenery, I’m experiencing the planet as it was when “we” (a pronoun I leave deliberately vague) got here. In fact, when I look back at the series of wilderness travel articles I wrote for The New York Times a decade ago, what jumps out at me is the almost monomaniacal obsession with enacting Denevan’s myth by finding unpopulated places. Camped out in the Australian outback, I boasted that it was “the farthest I’d ever been from other human beings.” Along the “pristine void” of a remote river in the Yukon, I climbed ridges and scanned the horizon: “It was intoxicating,” I wrote, “to pick a point in the distance and wonder: Has any human ever stood there?”

Rereading those and other articles, I now began to reluctantly consider the possibility that my infatuation with the wilderness was, at its core, a poorly cloaked exercise in colonial nostalgia—the urbane Northern equivalent of dressing up as Stonewall Jackson at Civil War reenactments because of an ostensible interest in antique rifles.

My first wilderness trip was a week-long canoe trip, at age fifteen, with two friends in Algonquin Park, a 3,000-square-mile canoeing paradise four hours north of my home in Toronto. I’d done some car camping and learned basic canoe strokes at summer camp, but I had no idea what an actual backcountry trip entailed. None of us did—we didn’t know about such technical innovations as sleeping pads and camp stoves, or even about basic skills like how to cook. Our packs were so heavy with sacks of onions, potatoes, carrots, and other bad choices that even the outfitter who rented us a canoe could barely lift them.

On the very first portage, I was lagging behind when a grouse charged at me. I stood up straight in surprise, and immediately toppled over backward, pinned like an overturned turtle. Since I couldn’t lift the pack on my own, I had to hike to the end of the portage to get my friends to return and hoist the pack onto my back again. The same thing happened on the next portage, except this time I was carrying the canoe, and it was a moose that spooked me. That evening, we discovered that dumping chunks of potato, onion, and carrot into water boiling on the fire, then adding rice, doesn’t produce a very good stew, even after waiting ten whole minutes. We stayed up half the night trying to burn surplus root vegetables to lighten our packs, with very limited success.

The eventual upshot of this bumbling display of ineptitude, strangely enough, was an incredible feeling of accomplishment. As a coddled and privileged kid whose path through life was meticulously well-signposted and more or less strewn with rose petals, I reveled in the opportunity to tackle problems on my own, with a legitimate chance of doing something wrong, in a context where errors could have serious consequences. In the years that followed, through trial and copious error, I got more competent at taking care of myself in the bush—and in consequence, started seeking out ever more remote settings so that the challenges, problems, and potential consequences remained real.

No one, in the course of my admirably progressive education, ever tried to tell me that Christopher Columbus was any kind of hero. Still, as I hiked across mountain passes or paddled down lonely rivers, I’d often daydream about what it would have been like to travel those routes for the first time—or at least, without the benefit of knowledge passed on from previous travelers or inhabitants. I read a lot about early explorers of what is now Canada—Étienne Brûlé, Alexander Mackenzie, John Franklin—but until I started worrying about the Shoalts critiques, I’d never actually read their journals.

Mackenzie, in particular, has a hallowed place in Canadian history, having completed the first overland journey across North America—a dozen years before Lewis and Clark. (I’m just saying.) His journals turned out to be as gripping as I’d expected. But reading with new eyes, I couldn’t help noticing how little of his travel involved actually venturing into the unknown. Instead, his progress amounted to a relay from tribe to tribe, shanghaiing locals into guiding his crew through each leg of the journey. “Thunder and rain prevailed during the night,” he writes at one point, “and, in the course of it, our guide deserted; we therefore compelled another of these people, very much against his will, to supply the place of his fugitive countryman.” This pattern recurs over and over.

That’s not to say Mackenzie’s voyages were easy. In fact, for both explorers and the settlers who followed, the fact that the so-called wilderness was already inhabited did little to tame their perceptions of the land. Brûlé, according to some versions of the story, was killed and eaten by Hurons who thought he had betrayed them—a legend that may owe as much to colonial mythologizing as to historical fact. “Regardless of what we might think about it today,” Roderick Nash wrote, “Indians made the New World a greater, not a lesser, wilderness for the pioneer pastoralists.” In my own intrepid-explorer fantasies, indigenous people mostly didn’t appear at all. On further reflection, I’m not sure which is worse: dismissing people as savages, or ignoring their existence altogether.

Yet I persist in believing there’s something special about paths untrodden. To read accounts of Antarctic exploration, or even early space travel, is to find some of the same fire that animates Shoalts. In these cases, there are no native life forms (that we know of) being brushed aside. Granted, there are heavy doses of nationalism, mercantilism, and ego at work, but there’s also something else there that’s harder to articulate—something that’s not specific to any particular landscape or historiography. The closest I’ve come to putting my finger on it was in a conversation I had a few years ago with a University of Utah professor named Daniel Dustin.

Back in 1981, Dustin and a colleague wrote an article in the Journal of Forestry called “The Right to Risk in Wilderness,” in which they proposed the creation of “no-rescue” wilderness zones in places like Gates of the Arctic National Park in Alaska. The idea has never been adopted, but over the years it has proven to be a provocative spur to discussions about what it is people are seeking in the wilderness. “Publicly you could never say this is a good idea, because it sounds so heartless and cruel,” he admitted to me. “But my point is that in our culture, we glorify a few experts who do this sort of thing, and we’ll put ’em on television, maybe create a reality TV show or whatever, but if every man wants to do it, stretch herself or himself, we somehow suggest that that’s just ridiculous.” Even three decades after the original article, Dustin didn’t have a simple answer for what you’d get from a trip to a no-rescue zone. But the ideas he spitballed—the opportunity to face the unknown, to take personal responsibility, to become self-reliant—somehow reminded me of that first day of my first canoe trip, backing away from a moose and eating undercooked rice.

by Alex Hutchinson, NYRB |  Read more:
Image: VW Pics/Universal Images Group via Getty Images

Every Bon Appétit: Gourmet Makes Video, Ranked

Few could have predicted the massive cultural impact of Claire Saffitz, then–Senior Food Editor of Bon Appétit, creating an upgraded version of a Hostess Twinkie. It’s been over two years since the appropriately titled “Pastry Chef Attempts to Make a Gourmet Twinkie” was uploaded, and 6.3 million views later, what began as an 11-minute video has become the Bon Appétit YouTube channel’s signature series, earning an impressive legion of devoted fans and turning Claire and her co-workers into internet stars. Gourmet Makes has tackled sweet, savory, and everything in-between, putting Claire’s culinary expertise (and crafting skills) to the test by asking her to recreate beloved junk food with a gourmet twist.

The results of Claire’s efforts are variable, but over the course of its 28 episodes and counting, Gourmet Makes has become as much about the Bon Appétit test-kitchen personalities as it is about perfecting the texture of Twizzlers or Doritos’ nacho-cheese flavor. While each installment still ends with a how-to guide, at this point Gourmet Makes is less instructional video and more legitimate web series, with all the drama, surprises, and rich character arcs of prestige television. With that in mind, the show feels long overdue for a ranking: not of how close Claire’s food comes to the original, but as episodes of an ensemble series starring Claire, her fellow chefs, and a surprisingly useful dehydrator.

Twinkies


It’s hard to judge “Twinkies” objectively — this is the episode that started it all, the first exposure that many of us had to Gourmet Makes and to Claire. When compared to the complexity and nuance of later episodes, it’s admittedly lacking. And yet, beyond its nostalgic appeal, “Twinkies” is an essential foundational text, laying the groundwork for everything that follows. We have Claire discovering that the endeavor is more challenging than anticipated: “This is harder than I thought it was gonna be,” she says for the first and not the last time. Her relationship with Brad, who alternates between compassionate ally and merciless bully, is already coming into focus. Here, he ends up being helpful, suggesting Claire combine a yellow cake and a chiffon cake to get that unique Twinkie texture. “Twinkies are a Frankenstein, in my opinion,” he offers. The result is not so much a Twinkie as the platonic ideal of a Twinkie, which is pretty much Gourmet Makes’ mission statement. And unlike the vast majority of Claire’s future efforts, this Twinkie is something viewers could actually make at home — no background in food science required.

by Louis Peitzman, Vulture | Read more:
Image: YouTube
[ed. End times.]

Samsara

The man standing outside my front door was carrying a clipboard and wearing a golden robe. “Not interested,” I said, preparing to slam the door in his face.

“Please,” said the acolyte. Before I could say no he’d jammed a wad of $100 bills into my hand. “If this will buy a few moments of your time.”

It did, if only because I stood too flabbergasted to move. Surely they didn’t have enough money to do this for everybody.

“There is no everybody,” said the acolyte, when I expressed my bewilderment. “You’re the last one. The last unenlightened person in the world.”

And it sort of made sense. Twenty years ago, a group of San Francisco hippie/yuppie/techie seekers had pared down the ancient techniques to their bare essentials, then optimized hard. A combination of drugs, meditation, and ecstatic dance that could catapult you to enlightenment in the space of a weekend retreat, 100% success rate. Their cult/movement/startup, the Order Of The Golden Lotus, spread like wildfire through California – a state where wildfires spread even faster than usual – and then on to the rest of the world. Soon investment bankers and soccer moms were showing up to book clubs talking about how they had grasped the peace beyond understanding and vanquished their ego-self.

I’d kind of ignored it. Actually, super ignored it. First a flat refusal to attend Golden Lotus retreats. Then slamming the door in their face whenever their golden-robed pamphleteers came to call. Then quitting my job to live off savings after my coworkers started converting and the team-building exercises turned into meditation sessions. Then unplugging my cable box after the sitcoms started incorporating Golden Lotus themes and the national news started being about how peaceful everybody was all the time. After that I might have kind of become a complete recluse, never leaving the house, ordering meals through UberEats, cut off from noticing any of the changes happening outside except through the gradual disappearance of nonvegetarian restaurants on the app.

I’m not a bigot; people can have whatever religion they choose. But Golden Lotus wasn’t for me. I don’t want to be enlightened. I like being an individual with an ego. Ayn Rand loses me when she starts talking politics, but the stuff about selfishness really speaks to me. Tend to your own garden, that kind of thing. I’m not becoming part of some universal-love-transcendent-joy hive mind, and I’m not interested in what Golden Lotus is selling.

So I just said: “Cool. Do I get a medal?”

“This is actually very serious,” said the acolyte. “Do you know about the Bodhisattva’s Vow?”

“The what now?”

“It’s from ancient China. You say it before embarking on the path of enlightenment. ‘However innumerable sentient beings are, I vow to save them all.’ The idea is that we’re all in this together. We swear that we will not fully forsake this world of suffering and partake of the ultimate mahaparanirvana – complete cosmic bliss – until everyone is as enlightened as we are.”

“Cool story.”

“That means 7.5 billion people are waiting on you.”

“What?”

“We all swore not to sit back and enjoy enlightenment until everyone was enlightened. Now everyone is enlightened except you. You’re the only thing holding us all back from ultimate cosmic bliss.”

“Man. I’m sorry.”

“You are forgiven. We would like to offer you a free three-day course with the Head Lama of Golden Lotus to correct the situation. We’ll pick you up at your home and fly you to the Big Island of Hawaii, where the Head Lama will personally…”

“…yeah, no thanks.”

“What?”

“No thanks.”

“But you have to! Nobody else can reach mahaparanirvana until you get enlightened!”

“Sure they can. Tell them I’m okay, they can head off to mahabharata without me, no need to wait up.”

“They can’t. They swore not to.”

“Well, they shouldn’t have done that.”

“It’s done! It’s irreversible! The vow has been sworn! Each of the seven point five billion acolytes of Golden Lotus has sworn it!”

“Break it.”

“We are enlightened beings! We can’t break our solemn vow!”

“Then I guess you’re going to learn an important lesson about swearing unbreakable vows you don’t want to keep.”

“Sir, this entire planet is heavy with suffering. It groans under its weight. Seven billion people, the entirety of the human race, and for the first time they have the chance to escape together! I understand you’re afraid of enlightenment, I understand that this isn’t what you would have chosen, but for the sake of the world, please, accept what must be!”

“I’m sorry,” I said. “I really am. But the fault here is totally yours. You guys swore an oath conditional on my behavior, but that doesn’t mean I have to change my behavior to prevent your oath from having bad consequences. Imagine if I let that work! You could all swear to kill yourself unless I donated money, and I’d have to donate or have billions of deaths on my hands. That kind of reasoning, you’ve got to nip it in the bud. I’m sorry about your oath and I’m sorry you’re never going to get to Paramaribo but I don’t want to be enlightened and you can’t make me.”

I slammed the door in his face.

by Scott Alexander, Slate Star Codex |  Read more:

Tuesday, November 5, 2019

Rosalyn Drexler, Hold Your Fire (Men and Machines) 1966
via: https://rosalyndrexler.org/selected-paintings/

Eduardo Paolozzi, Wittgenstein in New York (from the series As is When), 1965
via: http://www.whitechapelgallery.org/exhibitions/eduardo-paolozzi/

Manufacturing Fear and Loathing, Maximizing Corporate Profits! Why Today’s Media Makes Us Despise One Another

Matt Taibbi’s Hate Inc. is the most insightful and revelatory book about American politics to appear since the publication of Thomas Frank’s Listen, Liberal almost four full years ago, near the beginning of the last presidential election cycle.

While Frank’s topic was the abysmal failure of the Democratic Party to be democratic and Taibbi’s is the abysmal failure of our mainstream news corporations to report news, the prominent villains in both books are drawn from the same, or at least overlapping, elite social circles: from, that is, our virulently anti-populist liberal class, from our intellectually mediocre creative class, from our bubble-dwelling thinking class. In fact, I would strongly recommend that the reader spend some time with Frank’s What’s the Matter with Kansas? (2004) and Listen, Liberal! (2016) as he or she takes up Taibbi’s book. And to really do the book the justice it deserves, I would even more vehemently recommend that the reader immerse him- or herself in Taibbi’s favorite book and vade-mecum, Manufacturing Consent (which I found to be a grueling experience: a relentless cataloging of the official lies that hide the brutality of American foreign policy) and, in order to properly appreciate the brilliance of Taibbi’s chapter 7, “How the Media Stole from Pro Wrestling,” visit some locale in Flyover Country and see some pro wrestling in person (which I found to be unexpectedly uplifting — more on this soon enough).

Taibbi tells us that he had originally intended for Hate, Inc. to be an updating of Edward Herman and Noam Chomsky’s Manufacturing Consent (1988), which he first read thirty years ago, when he was nineteen. “It blew my mind,” Taibbi writes. “[It] taught me that some level of deception was baked into almost everything I’d ever been taught about modern American life…. Once the authors in the first chapter laid out their famed propaganda model [italics mine], they cut through the deceptions of the American state like a buzz saw” (p. 10). For what seemed to be vigorous democratic debate, Taibbi realized, was instead a soul-crushing simulation of debate. The choices voters were given were distinctions without valid differences, and just as hyped, just as trivial, as the choices between a Whopper and a Big Mac, between Froot Loops and Frosted Mini-Wheats, between Diet Coke and Diet Pepsi, between Marlboro Lites and Camel Filters. It was all profit-making poisonous junk.

Manufacturing Consent,” Taibbi writes, “explains that the debate you’re watching is choreographed. The range of argument has been artificially narrowed long before you get to hear it” (p. 11). And there’s an indisputable logic at work here, because the reality of hideous American war crimes is and always has been, from the point of view of the big media corporations, a “narrative-ruining” buzz-kill. “The uglier truth [brought to light in Manufacturing Consent], that we committed genocide of a fairly massive scale across Indochina — ultimately killing at least a million innocent civilians by air in three countries — is pre-excluded from the history of the period” (p. 13).

So what has changed in the last thirty years? A lot! As a starting point let’s consider the very useful metaphor found in the title of another great media book of 1988: Mark Crispin Miller’s Boxed In: The Culture of TV. To say that Americans were held captive by the boob tube affords us not only a useful historical image but also suggests the possibility of their having been able to view the television as an antagonist, and therefore of their having been able, at least some of them, to rebel against its dictates. Three decades later, on the other hand, the television has been replaced by iPhones and portable tablets, the workings of which are so precisely intertwined with even the most intimate minute-to-minute aspects of our lives that our relationship to them could hardly ever become antagonistic.

Taibbi summarizes the history of these three decades in terms of three “massive revolutions” in the media plus one actual massive political revolution, all of which, we should note, he discussed with his hero Chomsky (who is now ninety! — Edward Herman passed away in 2017) even as he wrote his book. And so: the media revolutions which Taibbi describes were, first, the coming of FoxNews along with Rush Limbaugh-style talk radio; second, the coming of CNN, i.e., the Cable News Network, along with twenty-four hour infinite-loop news cycles; third, the coming of the Internet along with the mighty social media giants Facebook and Twitter. The massive political revolution was, going all the way back to 1989, the collapse of the Berlin Wall, and then of the Soviet Union itself — and thus of the usefulness of anti-communism as a kind of coercive secular religion (pp. 14-15).

For all that, however, the most salient difference between the news media of 1989 and the news media of 2019 is the disappearance of the single type of calm and decorous and slightly boring cis-het white anchorman (who somehow successfully appealed to a nationwide audience) and his replacement by a seemingly wide variety of demographically-engineered news personæ who all rage and scream combatively in each other’s direction. “In the old days,” Taibbi writes, “the news was a mix of this toothless trivia and cheery dispatches from the frontlines of Pax Americana…. The news [was] once designed to be consumed by the whole house…. But once we started to be organized into demographic silos [italics mine], the networks found another way to seduce these audiences: they sold intramural conflict” (p. 18).

And in this new media environment of constant conflict, how, Taibbi wondered, could public consent, which would seem to be at the opposite end of the spectrum from conflict, still be manufactured?? “That wasn’t easy for me to see in my first decades in the business,” Taibbi writes. “For a long time, I thought it was a flaw in the Chomsky/Herman model” (p. 19).

But what Taibbi was at length able to understand, and what he is now able to describe for us with both wit and controlled outrage, is that our corporate media have devised — at least for the time being — highly-profitable marketing processes that manufacture fake dissent in order to smother real dissent (p. 21). And the smothering of real dissent is close enough to public consent to get the goddam job done: The Herman/Chomsky model is, after all these years, still valid.

Or pretty much so. Taibbi is more historically precise. Because of the tweaking of the Herman/Chomsky propaganda model necessitated by the disappearance of the USSR in 1991 (“The Russians escaped while we weren’t watching them, / As Russians do…,” Jackson Browne presciently prophesied on MTV way back in 1983), one might now want to speak of a Propaganda Model 2.0. For, as Taibbi notes, “…the biggest change to Chomsky’s model is the discovery of a far superior ‘common enemy’ in modern media: each other. So long as we remain a bitterly-divided two-party state, we’ll never want for TV villains” (pp. 207-208).

To rub his great insight right into our uncomprehending faces, Taibbi has almost sadistically chosen to have dark, shadowy images of a yelling Sean Hannity (in lurid FoxNews Red!) and a screaming Rachel Maddow (in glaring MSNBC Blue!) juxtaposed on the cover of his book. For Maddow, he notes, is “a depressingly exact mirror of Hannity…. The two characters do exactly the same work. They make their money using exactly the same commercial formula. And though they emphasize different political ideas, the effect they have on audiences is much the same” (pp. 259-260).

And that effect is hate. Impotent hate. For while Rachel’s fan demographic is all wrapped up in hating Far-Right Fascists Like Sean, and while Sean’s is all wrapped up in despising Libtard Lunatics Like Rachel, the bipartisan consensus in Washington for ever-increasing military budgets, for everlasting wars, for ever-expanding surveillance, for ever-growing bailouts of and tax breaks for and handouts to the most powerful corporations goes forever unchallenged.

Oh my. And it only gets worse and worse, because the media, in order to make sure that their various siloed demographics stay superglued to their Internet devices, must keep ratcheting up levels of hate: the Fascists Like Sean and the Libtards Like Rachel must be continually presented as more and more deranged, and ultimately as demonic. “There is us and them,” Taibbi writes, “and they are Hitler” (p. 64). A vile reductio ad absurdum has come into play: “If all Trump supporters are Hitler, and all liberals are also Hitler,” Taibbi writes, “…[t]he America vs. America show is now Hitler vs. Hitler! Think of the ratings!…” The reader begins to grasp Taibbi’s argument that our mainstream corporate media are as bad as — are worse than — pro wrestling. It’s an ineluctable downward spiral.

Taibbi continues: “The problem is, there’s no natural floor to this behavior. Just as cable TV will eventually become seven hundred separate twenty-four-hour porn channels, news and commentary will eventually escalate to boxing-style, expletive-laden, pre-fight tirades, and the open incitement to violence [italics mine]. If the other side is literally Hitler, … [w]hat began as America vs. America will eventually move to Traitor vs. Traitor, and the show does not work if those contestants are not eventually offended to the point of wanting to kill one another” (pp. 65-69). (...)

On the same day I read this chapter I saw that, on the bulletin board in my gym, a poster had appeared, as if by magic, promoting an upcoming Primal Conflict (!) professional wrestling event. I studied the photos of the wrestlers on the poster carefully, and, as an astute reader of Taibbi, I prided myself on being able to identify which of them seemed to be playing the roles of heels, and which of them the roles of babyfaces.

For Taibbi explains that one of the fundamental dynamics of wrestling involves the invention of crowd-pleasing narratives out of the many permutations and combinations of pitting heels against faces. Donald Trump, a natural heel, brings the goofy dynamics of pro wrestling to American politics with real-life professional expertise. (Taibbi points out that in 2007 Trump actually performed before a huge cheering crowd in a Wrestlemania event billed as the “battle of the billionaires.” Watch it on YouTube! https://youtu.be/5NsrwH9I9vE — unbelievable!!)

The mainstream corporate media, on the other hand, their eyes fixed on ever bigger and bigger profits, have drifted into the metaphorical pro wrestling ring in ignorance, and so, when they face off against Trump, they often end up in the role of inept prudish pearl-clutching faces.

Taibbi condemns the mainstream media’s failure to understand such a massively popular form of American entertainment as “malpractice” (p. 125), so I felt more than obligated to buy a ticket and see the advertised event in person. To properly educate myself, that is.

On the poster in my gym I had paid particular attention to the photo of a character named Logan Easton Laroux, who was wearing a sweater tied around his neck and was extending an index finger upwards as if he were summoning a waiter. Ha! I thought. This Laroux chap must be playing the role of an arrogant preppy heel. The crowd will delight in his humiliation! I imagined the vile homophobic and even Francophobic abuse to which he would likely be subjected.

On the night of the Primal Conflict event, I intentionally showed up a little bit late, because, to be honest, I was fearing a rough crowd. Pro wrestling in West Virginia, don’t you know. But I was politely greeted and presented with the ticket I had PayPal-ed. I looked over to the ring, and, sure enough, there was Logan Easton Laroux being body-slammed to the mat. Ha! Just the ritual humiliation I anticipated! But I had most certainly not anticipated the sudden display of Primal Conflict wit that ensued. Our plucky Laroux dramatically recovered from his fall and adroitly pinned his opponent as the crowd happily cheered for him, cheered in unison, cheered an apparently rehearsed chant again and again: ONE PER CENT! ONE PER CENT!

So no homophobic obscenities?? Au contraire! Here was a twist in the narrative far more nuanced than anything you might read in the New York Times!

Soon enough I realized that this was wholesome family entertainment. The most enthusiastic fans seemed to be the eight- and nine-year-old boys. (A couple of the boys were proudly wearing their Halloween costumes.) There was no smoking, no drinking, no foul language, no sexual innuendo of any sort, and, above all no racial insults — just the opposite: For both the wrestlers and the spectators were a mix of white and black, and the most popular wrestler was a big black guy in an afro wig who “lost” his bout to a white guy who played a cheating sleazebag heel named Quinn. Also, significantly, there was zero police presence, and zero chance of any kind of actual altercation. When the night was over the promoter stood at the exit and shook the hand of and said good-bye and come-back to each of us departing spectators — sort of like, well, a pastor after church in a small southern town as his congregation disperses.

So here I was in the very midst of — to use Hillary Clinton’s contemptuous terminology — the deplorables. But they weren’t the racist misogynistic homophobes Clinton had condemned. The vibe was that everyone liked all the wrestlers, even the ones they had booed, and that everyone pretty much liked each other. During intermission the promoter called out a birthday greeting to a spectator named John. A middle-aged black guy stood up to a round of applause. He was with his wife and kids.

Where was the hate?

by Yves Smith, Naked Capitalism |  Read more:
Image: OR Books
[ed. See also: Is Politics a War of Ideas or of Us Against Them? (NY Times)]

US National Debt Passed $23 Trillion, Jumped $1.3 Trillion in 12 Months

And these are the good times. What happens in a recession?

The US gross national debt – the sum of all Treasury securities outstanding – passed another illustrious milestone, $23.01 trillion, the US Treasury department disclosed on Friday. And it got there at lightning speed just eight months after having passed the illustrious milestone of $22 trillion on February 11. Over the past 12 months, the US national debt has jumped by $1.33 trillion – and these are the good times, and not a financial crisis when everything goes to heck:


The cute flat spots in the charts are periods when the US government bumped into the “debt ceiling.” The US is unique among countries in that Congress first tells the government how much to spend, what to spend it on, and in whose Congressional district to spend it, and then on the appointed day, Congress tells the government that it cannot borrow the money that it needs in order to spend the money that Congress told it to spend. The charade, carried out regularly for political purposes to arm-twist one or the other side, leaves these flat spots behind as permanent testimony to this idiocy.

Over the 12 months through the third quarter, the US gross national debt rose by 5.6% from the same period a year earlier. But nominal GDP over the same period rose only 3.7%, meaning the growth of the debt is outrunning the growth of the economy even in what President Trump called, two days ago, the “Greatest Economy in American History!”
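[ed. A rough back-of-the-envelope sketch, in Python, of what that growth gap implies. The $23 trillion debt figure and the two growth rates come from the article; the roughly $21.5 trillion nominal GDP starting point and the assumption that both rates hold steady are mine, purely for illustration.]

# Minimal sketch: if the debt grows faster than nominal GDP, the
# debt-to-GDP ratio climbs even in "good times".
debt = 23.0           # trillion USD, from the article (late 2019)
gdp = 21.5            # trillion USD, assumed nominal GDP, for illustration only
debt_growth = 0.056   # 5.6% per year, from the article
gdp_growth = 0.037    # 3.7% per year, from the article

for year in range(1, 6):
    debt *= 1 + debt_growth
    gdp *= 1 + gdp_growth
    print(f"year {year}: debt-to-GDP ratio = {debt / gdp:.1%}")

# With these assumed-constant rates, the ratio drifts from about 107%
# toward roughly 117% within five years, before any recession hits.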

If the growth of the federal debt outruns the economy during these fabulously good times, what will the debt do when the recession hits? When government tax receipts plunge and government expenditures for unemployment and the like soar? The federal debt will jump by $2.5 trillion or more in a 12-month period. That’s what it will do.

The growth in the US debt (the growth of Treasury securities outstanding) is the most accurate measure of the true deficit – the actual cash difference between how much the government takes in and how much it spends. The government has to borrow the difference between the two – and that’s what the increase in the debt measures.

This increase in the debt shows the negative cash flow of the government. And it’s almost always significantly larger than the “budget deficit,” which is based on government accounting.

For example, in the fiscal year 2019, ended September 30, the “budget deficit” was $984 billion, according to the Treasury Department. This is a huge number, considering that these are the good times. But the government had to borrow an additional $1.2 trillion over the same period. In other words, the actual cash deficit, as represented by the increase in the debt, was $219 billion higher than the government accounting of the deficit.
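[ed. And a tiny Python sketch of the arithmetic in that paragraph. The $984 billion "budget deficit" is from the article; the $1.203 trillion borrowing figure is my rounding of the article's roughly $1.2 trillion, chosen so the result reproduces the $219 billion gap it reports.]

# FY2019: official "budget deficit" vs. the actual increase in debt outstanding.
reported_budget_deficit = 0.984   # trillion USD, per Treasury accounting (from the article)
increase_in_debt = 1.203          # trillion USD, assumed rounding of the article's ~$1.2 trillion

# The gap is cash the government had to borrow beyond the official deficit.
gap = increase_in_debt - reported_budget_deficit
print(f"cash deficit exceeds budget deficit by about ${gap * 1e3:.0f} billion")
# -> roughly $219 billion, in line with the article's figure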

And this is the case year after year. The chart below shows the increase in the debt for each fiscal year (blue column) and the “deficit” as per government accounting, going back to 2002. Over these 18 years, there were only two years when the deficit was the same as or larger than the increase in the debt. For the remaining 16 years, the increase in the debt was far larger than the deficit. In total, over those 18 years, the increase in the debt has exceeded the “budget deficit” by $5 trillion:


The budget deficit – the much more benign figure, huge as it is – is what is being bandied about. Practically no one in government or the mainstream media bandies about the increase in the debt, though it is the more truthful figure that cannot be played with.

by Wolf Richter, Wolfstreet |  Read more:
Images: US Treasury Dept. and Wolfstreet
[ed. See also: What Will Stocks Do When “Consensual Hallucination” Ends? and, for head-shaking, throw-up-your-hands incredulity, Uber Loses Another $1.2 billion, Stock Dives Again (Wolfstreet).]
Björn Keller
via: