Tuesday, May 11, 2021

Is Capitalism Killing Conservatism?

The report Wednesday that U.S. birthrates fell to a record low in 2020 was expected but still grim. On Twitter the news was greeted, characteristically, by conservative laments and liberal comments implying that it’s mostly conservatism’s fault — because American capitalism allegedly makes parenthood unaffordable, work-life balance impossible and atomization inevitable.

This is a specific version of a long-standing argument about the tensions between traditionalism and capitalism, which seems especially relevant now that the right doesn’t know what it’s conserving anymore.

In a recent essay for New York Magazine, for instance, Eric Levitz argues that the social trends American conservatives most dislike — the rise of expressive individualism and the decline of religion, marriage and the family — are driven by socioeconomic forces the right’s free-market doctrines actively encourage. “America’s moral traditionalists are wedded to an economic system that is radically anti-traditional,” he writes, and “Republicans can neither wage war on capitalism nor make peace with its social implications.”

This argument is intuitively compelling. But the historical record is more complex. If the anti-traditional churn of capitalism inevitably doomed religious practice, communal associations or the institution of marriage, you would expect those things to simply decline with rapid growth and swift technological change. Imagine, basically, a Tocquevillian early America of sturdy families, thriving civic life and full-to-bursting pews giving way, through industrialization and suburbanization, to an ever-more-individualistic society.

But that’s not exactly what you see. Instead, as Lyman Stone points out in a recent report for the American Enterprise Institute (where I am a visiting fellow), the Tocquevillian utopia didn’t really yet exist when Alexis de Tocqueville was visiting America in the 1830s. The growth of American associational life largely happened during the Industrial Revolution. The rise of fraternal societies is a late-19th- and early-20th-century phenomenon. Membership in religious bodies rises across the hypercapitalist Gilded Age. The share of Americans who married before age 35 stayed remarkably stable from the 1890s till the 1960s, through booms and depressions and drastic economic change.

This suggests that social conservatism can be undermined by economic dynamism but also respond dynamically in its turn — through a constant “reinvention of tradition,” you might say, manifested in religious revival, new forms of association, new models of courtship, even as older forms pass away.

It’s only after the 1960s that this conservative reinvention seems to fail, with churches dividing, families failing, associational life dissolving. And capitalist values, the economic and sexual individualism of the neoliberal age, clearly play some role in this change.

But strikingly, after the 1960s, economic dynamism also diminishes as productivity growth drops and economic growth decelerates. So it can’t just be capitalist churn undoing conservatism, exactly, if economic stagnation and social decay go hand in hand.

One small example: Rates of geographic mobility in the United States, which you could interpret as a measure of how capitalism uproots people from their communities, have declined over the last few decades. But this hasn’t somehow preserved rural traditionalism. Quite the opposite: Instead of a rooted and religious heartland, you have more addiction, suicide and anomie.

Or a larger example: Western European nations do more to tame capitalism’s Darwinian side than America, with more regulation and family supports and welfare-state protections. Are their societies more fecund or religious? No, their economic stagnation and demographic decline have often been deeper than our own.

So it’s not that capitalist dynamism inevitably dissolves conservative habits. It’s more that the wealth this dynamism piles up, the liberty it enables and the technological distractions it invents let people live more individualistically — at first happily, with time perhaps less so — in ways that eventually undermine conservatism and dynamism together. At which point the peril isn’t markets red in tooth and claw, but a capitalist endgame that resembles Aldous Huxley’s “Brave New World,” with a rich and technologically proficient world turning sterile and dystopian.

by Ross Douthat, NY Times |  Read more:
Image: Carlos Javier Ortiz/Redux

Monday, May 10, 2021

The Truth about Painkillers

In October 2003, the Orlando Sentinel published "OxyContin under Fire," a five-part series that profiled several "accidental addicts" — individuals who were treated for pain and wound up addicted to opioids. They "put their faith in their doctors and ended up dead, or broken," the Sentinel wrote of these victims. Among them were a 36-year-old computer-company executive from Tampa and a 39-year-old Kissimmee handyman and father of three — the latter of whom died of an overdose.

The Sentinel series helped set the template for what was to become the customary narrative for reporting on the opioid crisis. Social worker Brooke Feldman called attention to the prototype in 2017:
Hannah was a good kid....Straight A student....Bright future. If it weren't for her doctor irresponsibly prescribing painkillers for a soccer injury and those damn pharmaceutical companies getting rich off of it, she never would have wound up using heroin.
Feldman, who has written and spoken openly about her own drug problem, knows firsthand of the deception embedded in the accidental-addict story. She received her first Percocet from a friend years after she'd been a serious consumer of marijuana, alcohol, benzodiazepines, PCP, and cocaine.

Indeed, four months after the original "OxyContin under Fire" story ran, the paper issued a correction: Both the handyman and the executive were heavily involved with drugs before their doctors ever prescribed OxyContin. Like Feldman, neither man was an accidental addict.

Yet one cannot overstate the media's continued devotion to the narrative, as Temple University journalism professor Jillian Bauer-Reese can attest. Soon after she created an online repository of opioid recovery stories, reporters began calling her, making very specific requests. "They were looking for people who had started on a prescription from a doctor or a dentist," she told the Columbia Journalism Review. "They had essentially identified a story that they wanted to tell and were looking for a character who could tell that story."

The story, of course, was the one about the accidental addict. But to what purpose?

Some reporters, no doubt, simply hoped to call attention to the opioid epidemic by showcasing sympathetic and relatable individuals — victims who started out as people like you and me. It wouldn't be surprising if drug users or their loved ones, aware that a victim-infused narrative would dilute the stigma that comes with addiction, had handed reporters a contrived plotline themselves.

Another theory — perhaps too cynical, perhaps not cynical enough — is that the accidental-addict trope was irresistible to journalists in an elite media generally unfriendly to Big Pharma. Predisposed to casting drug companies as the sole villain in the opioid epidemic, they seized on the story of the accidental addict as an object lesson in what happens when greedy companies push a product that is so supremely addictive, it can hook anyone it's prescribed to.

Whatever the media's motives, the narrative does not fit with what we've learned over two decades since the opioid crisis began. We know now that the vast majority of patients who take pain relievers like oxycodone and hydrocodone never get addicted. We also know that people who develop problems are very likely to have struggled with addiction, or to be suffering from psychological trouble, prior to receiving opioids. Furthermore, we know that individuals who regularly misuse pain relievers are far more likely to keep obtaining them from illicit sources rather than from their own doctors.

In short, although accidental addiction can happen, otherwise happy lives rarely come undone after a trip to the dental surgeon. And yet the exaggerated risk from prescription opioids — disseminated in the media but also advanced by some vocal physicians — led to an overzealous regime of pill control that has upended the lives of those suffering from real pain.

To be sure, some restrictions were warranted. Too many doctors had prescribed opioids far too liberally for far too long. But tackling the problem required a scalpel, not the machete that health authorities, lawmakers, health-care systems, and insurers ultimately wielded, barely distinguishing between patients who needed opioids for deliverance from disabling pain and those who sought pills for recreation or profit, or to maintain a drug habit.

The parable of the accidental addict has resulted in consequences that, though unintended, have been remarkably destructive. Fortunately, a peaceable co-existence between judicious pain treatment, the curbing of pill diversion, and the protection of vulnerable patients against abuse and addiction is possible, as long as policymakers, physicians, and other authorities are willing to take the necessary steps. (...)

Many physicians... began refusing to prescribe opioids and withdrawing patients from their stable opioid regimens around 2011 — approximately the same time as states launched their reform efforts. Reports of pharmacies declining to fill prescriptions — even for patients with terminal illness, cancer pain, or acute post-surgical pain — started surfacing. At that point, 10 million Americans were suffering "high impact pain," with four in five being unable to work and a third no longer able to perform basic self-care tasks such as washing themselves and getting dressed.

Their prospects grew even more tenuous with the release of the CDC's "Guideline for Prescribing Opioids for Chronic Pain" in 2016. The guideline, which was labeled non-binding, offered reasonable advice to primary-care doctors — for example, it recommended going slow when initiating doses and advised weighing the harms and benefits of opioids. It also imposed no cap on dosage, instead advising prescribers to "avoid increasing dosage to ≥90 MME per day." (An MME, or morphine milligram equivalent, is a basic measure of opioid potency relative to morphine: A 15 mg tablet of morphine equals 15 MMEs; 15 mg of oxycodone converts to about 25 mg morphine.)
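To make the guideline's arithmetic concrete: a daily dose is converted to MMEs by multiplying the milligrams taken each day by a per-drug conversion factor and summing the result. The sketch below is an illustration only, using the conversion factors from the CDC's published table (which round slightly differently than the figure above); it is not taken from the article itself.

```swift
import Foundation

// Illustrative MME arithmetic, as used by the CDC guideline's 90 MME/day threshold.
// Factors follow the CDC's published conversion table; tables vary slightly by source.
let mmeFactor: [String: Double] = [
    "morphine": 1.0,
    "hydrocodone": 1.0,
    "oxycodone": 1.5
]

func dailyMME(drug: String, mgPerDose: Double, dosesPerDay: Double) -> Double? {
    guard let factor = mmeFactor[drug] else { return nil }
    return mgPerDose * dosesPerDay * factor
}

// Three 15 mg oxycodone tablets a day ≈ 67.5 MME, under the 90 MME line that many
// payers, pharmacies, and health systems began treating as a hard cap.
if let mme = dailyMME(drug: "oxycodone", mgPerDose: 15, dosesPerDay: 3) {
    print("Daily dose ≈ \(mme) MME")
}
```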

Yet almost overnight, the CDC guideline became a new justification for dose control, with the 90 MME threshold taking on the power of an enforceable national standard. Policymakers, insurers, health-care systems, quality-assurance agencies, pharmacies, Department of Veterans Affairs medical centers, contractors for the U.S. Centers for Medicare and Medicaid Services, and state health authorities alike employed 90 MME as either a strict daily limit or a soft goal — the latter indicating that although exceptions were possible, they could be made only after much paperwork and delay.

As a result, prescribing fell even more sharply, in terms of both dosages per capita and numbers of prescriptions written. A 2019 Quest Diagnostics survey of 500 primary-care physicians found that over 80% were reluctant to accept patients who were taking prescription opioids, while a 2018 survey of 219 primary-care clinics in Michigan found that 41% of physicians would not prescribe opioids for patients who weren't already receiving them. Pain specialists, too, were cutting back: According to a 2019 survey conducted by the American Board of Pain Medicine, 72% said they or their patients had been required to reduce the quantity or dose of medication. In the words of Dr. Sean Mackey, director of Stanford University's pain-management program, "[t]here's almost a McCarthyism on this, that's silencing so many [health professionals] who are simply scared."

by Sally Satel, National Affairs |  Read more:
Image: uncredited
[ed. Finally, a voice in the wilderness. Relatedly, I'd like to see a more nuanced discussion on the topics of dependency and addiction. The author (like everyone else) assumes anything addictive is unquestionably bad (especially if it's something that makes you feel good). If you need insulin, are you dependent and an addict? Of course, but who would deny insulin to a diabetic, and what's the difference? How might the world look if people had a steady and reliable supply of medications, for whatever reasons? Couldn't be much worse than it is now and might solve a lot of social problems. RIP Tom Petty and Prince.]

Why Do All Records Sound the Same?

When you turn on the radio, you might think music all sounds the same these days, then wonder if you’re just getting old. But you’re right, it does all sound the same. Every element of the recording process, from the first takes to the final tweaks, has evolved with one simple aim: control. And that control often lies in the hands of a record company desperate to get their song on the radio. So they’ll encourage a controlled recording environment (slow, high-tech and using malleable digital effects).

Every finished track is then coated in a thick layer of audio polish before being market-tested and dispatched to a radio station, where further layers of polish are applied until the original recording is barely visible. That’s how you make a mainstream radio hit, and that’s what record labels want. (...)

When people talk about a shortage of ‘warm’ or ‘natural’ recording, they often blame digital technology. It’s a red herring, because copying a great recording onto CD or into an iPod doesn’t stop it sounding good. Even self-consciously old-fashioned recordings like Arif Mardin’s work with Norah Jones were recorded on two-inch tape, then copied into a computer for editing, then mixed through an analogue console back into the computer for mastering. It’s now rare to hear recently produced audio which has never been through any analogue-digital conversion—although a vinyl White Stripes album might qualify.

Until surprisingly recently—maybe 2002—the majority of records were made the same way they’d been made since the early 70s: through vast, multi-channel recording consoles onto 24 or 48-track tape. At huge expense, you’d rent purpose-built rooms containing perhaps a million pounds’ worth of equipment, employing a producer, engineer and tape operator. Digital recording into a computer had been possible since the mid 90s, but major producers were often sceptical.

By 2000, Pro Tools, the industry-standard studio software, was mature and stable and sounded good. With a laptop and a small rack of gear costing maybe £25,000 you could record most of a major label album. So the business shifted from the console—the huge knob-covered desk in front of a pair of wardrobe-sized monitor speakers—to the computer screen. You weren’t looking at the band or listening to the music, you were staring at 128 channels of wiggling coloured lines.

“There’s no big equipment any more,” says John Leckie. “No racks of gear with flashing lights and big knobs. The reason I got into studio engineering was that it was the closest thing I could find to getting into a space ship. Now, it isn’t. It’s like going to an accountant. It changes the creative dynamic in the room when it’s just one guy sitting staring at a computer screen.”

“Before, you had a knob that said ‘Bass.’ You turned it up, said ‘Ah, that’s better’ and moved on. Now, you have to choose what frequency, and the slope, and how many dBs, and it all makes a difference. There’s a constant temptation to tamper.”

What makes working with Pro Tools really different from tape is that editing is absurdly easy. Most bands record to a click track, so the tempo is locked. If a guitarist plays a riff fifty times, it’s a trivial job to pick the best one and loop it for the duration of the verse.

“Musicians are inherently lazy,” says John. “If there’s an easier way of doing something than actually playing, they’ll do that.” A band might jam together for a bit, then spend hours or days choosing the best bits and pasting a track together. All music is adopting the methods of dance music, of arranging repetitive loops on a grid. With the structure of the song mapped out in coloured boxes on screen, there’s a huge temptation to fill in the gaps, add bits and generally clutter up the sound. (...)

Once the band and producer are finished, their multitrack—usually a hard disk containing Pro Tools files for maybe 128 channels of audio—is passed on to a mix engineer. L.A.-based JJ Puig has mixed records for Black Eyed Peas, U2, Snow Patrol, Green Day and Mary J Blige. His work is taken so seriously that he’s often paid royalties rather than a fixed fee. He works from Studio A at Ocean Way Studios on the Sunset Strip. The control room looks like a dimly lit library. Instead of books, the floor-to-ceiling racks are filled with vintage audio gear. This is the room where Frank Sinatra recorded “It Was A Very Good Year” and Michael Jackson recorded “Beat It.”

And now, it belongs to JJ Puig. Record companies pay him to essentially re-produce the track, but without the artist and producer breathing down his neck. He told Sound On Sound magazine: “When I mixed The Rolling Stones’ A Bigger Bang album, I reckoned that one of the songs needed a tambourine and a shaker, so I put it on. If Glyn Johns [who produced Sticky Fingers] had done that many years ago, he’d have been shot in the head. Mick Jagger was kind of blown away by what I’d done, no-one had ever done it before on a Stones record, but he couldn’t deny that it was great and fixed the record.”

When a multitrack arrives, JJ’s assistant tidies it up, re-naming the tracks, putting them in the order he’s used to and colouring the vocal tracks pink. Then JJ goes through tweaking and polishing and trimming every sound that will appear on the record. Numerous companies produce plugins for Pro Tools which are digital emulations of the vintage rack gear that still fills Studio A. If he wants to run Fergie’s vocal through a 1973 Roland Space Echo and a 1968 Marshall stack, it takes a couple of clicks.

Some of these plugins have become notorious. Auto-Tune, developed by former seismologist Andy Hildebrand, was released as a Pro Tools plugin in 1997. It automatically corrects out-of-tune vocals by locking them to the nearest note in a given key. The L1 Ultramaximizer, released in 1994 by the Israeli company Waves, launched the latest round of the loudness war. It’s a very simple-looking plugin which neatly and relentlessly makes music sound a lot louder (a subject we’ll return to in a little while).
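The idea behind Auto-Tune is easier to grasp with the note-snapping step written out: estimate the pitch of the incoming vocal, find the nearest allowed note, and shift by the difference. The sketch below shows only that snapping step in equal temperament. It is a rough illustration, not Antares' actual algorithm, which also has to detect pitch in real time, restrict targets to the chosen key, and resynthesize the audio smoothly.

```swift
import Foundation

// Snap a detected frequency (Hz) to the nearest equal-tempered semitone.
// Real pitch correctors also track pitch over time, limit targets to a key,
// and glide between notes rather than jumping.
func snapToNearestSemitone(_ frequency: Double, a4: Double = 440.0) -> Double {
    let midiNote = 69.0 + 12.0 * log2(frequency / a4)   // continuous note number
    let nearest = midiNote.rounded()                     // lock to the nearest semitone
    return a4 * pow(2.0, (nearest - 69.0) / 12.0)        // back to a frequency in Hz
}

// A vocal drifting flat of A4 (432 Hz instead of 440 Hz) gets pulled back up to pitch.
print(snapToNearestSemitone(432.0))   // ≈ 440.0
```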

When JJ has tweaked and polished and trimmed and edited, his stereo mix is passed on to a mastering engineer, who prepares it for release. What happens to that stereo mix is an extraordinary marriage of art, science and commerce. The tools available are superficially simple—you can really only change the EQ or the volume. But the difference between a mastered and unmastered track is immediately obvious. Mastered recordings sound like real records. That is to say, they all sound a little bit alike.

by Tom Whitwell, Cuepoint | Read more:
Image: uncredited

Saturday, May 8, 2021

via:

Jack Nicklaus: Golf My Way

[ed. Can't believe it. Jack Nicklaus's Golf My Way on YouTube. One of the best golf instruction videos, ever.]


via:
[ed. Jingle and Sam]

Why Stocks Soared While America Struggled

You would never know how terrible the past year has been for many Americans by looking at Wall Street, which has been going gangbusters since the early days of the pandemic.

“On the streets, there are chants of ‘Stop killing Black people!’ and ‘No justice, no peace!’ Meanwhile, behind a computer, one of the millions of new day traders buys a stock because the chart is quickly moving higher,” wrote Chris Brown, the founder and managing member of the Ohio-based hedge fund Aristides Capital in a letter to investors in June 2020. “The cognitive dissonance is overwhelming at times.”

The market was temporarily shaken in March 2020, as stocks plunged for about a month at the outset of the Covid-19 outbreak, but then something strange happened. Even as hundreds of thousands of lives were lost, millions of people were laid off and businesses shuttered, protests against police violence erupted across the nation in the wake of George Floyd’s murder, and the outgoing president refused to accept the outcome of the 2020 election — supposedly the market’s nightmare scenario — for weeks, the stock market soared. After the jobs report from April 2021 revealed a much shakier labor recovery might be on the horizon, major indexes hit new highs.


The disconnect between Wall Street and Main Street, between corporate CEOs and the working class, has perhaps never felt so stark. How can it be that food banks are overwhelmed while the Dow Jones Industrial Average hits an all-time high? For a year that’s been so bad, it’s been hard not to wonder how the stock market could be so good.

To the extent that there can ever be an explanation for what’s going on with the stock market, there are some straightforward financial answers here. The Federal Reserve took extraordinary measures to support financial markets and reassure investors it wouldn’t let major corporations fall apart. Congress did its part as well, pumping trillions of dollars into the economy across multiple relief bills. Turns out giving people money is good for markets, too. Tech stocks, which make up a significant portion of the S&P 500, soared. And with bond yields so low, investors didn’t really have a more lucrative place to put their money.

To put it plainly, the stock market is not representative of the whole economy, much less American society. And what it is representative of did fine.

“No matter how many times we keep on saying the stock market is not the economy, people won’t believe it, but it isn’t,” said Paul Krugman, a Nobel Prize-winning economist and New York Times columnist. “The stock market is about one piece of the economy — corporate profits — and it’s not even about the current or near-future level of corporate profits, it’s about corporate profits over a somewhat longish horizon.”
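Krugman's point also connects to the bond-yield explanation above: if a stock's price reflects profits over a long horizon, then the rate used to discount those future profits matters enormously, and that rate is anchored to bond yields. A toy calculation (not from the article, and not anyone's valuation model) shows the mechanism:

```swift
import Foundation

// Present value of a constant stream of annual profits discounted at rate r.
// A toy model of "profits over a somewhat longish horizon," purely for illustration.
func presentValue(annualProfit: Double, rate: Double, years: Int) -> Double {
    (1...years).reduce(0.0) { sum, year in
        sum + annualProfit / pow(1.0 + rate, Double(year))
    }
}

// The same $100 a year of profits over 30 years is worth far more when yields are low,
// which is one mechanical reason falling rates push stock prices up.
print(presentValue(annualProfit: 100, rate: 0.05, years: 30))   // ≈ 1,537
print(presentValue(annualProfit: 100, rate: 0.015, years: 30))  // ≈ 2,402
```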

Still, those explanations, to many people, don’t feel fair. Investors seem to have remained inconceivably optimistic throughout real turmoil and uncertainty. If the answer to why the stock market was fine is basically that’s how the system works, the follow-up question is: Should it?

“Talking about the prosperous nature of the stock market in the face of people still dying from Covid-19, still trying to get health care, struggling to get food, stay employed, it’s an affront to people’s actual lived experience,” said Solana Rice, the co-founder and co-executive director of Liberation in a Generation, which pushes for economic policies that reduce racial disparities. “The stock market is not representative of the makeup of this country.”

Inequality is not a new theme in the American economy. But the pandemic exposed and reinforced how differently the wealthy and powerful experience what’s happening compared with those who have less power and fewer means — and forced the question of how the prosperity of those at the top could be better shared with those at the bottom. There are certainly ideas out there, though Wall Street might not like them.

by Emily Stewart, Vox |  Read more:
Image: Vox
[ed. See also: Counting the Chickens Twice; and, Always a Reckoning (Hussman Funds).]

Friday, May 7, 2021

Sam Middleton, 1927-2015, Untitled, Mixed media on paper.

Liberal Democratic Party (Japan)

The Liberal Democratic Party of Japan (自由民主党, Jiyū-Minshutō), frequently abbreviated to LDP or Jimintō (自民党), is a conservative political party in Japan.

The LDP has been in power almost continuously since its foundation in 1955—a period called the 1955 System—with the exception of a period between 1993 and 1994, and again from 2009 to 2012. In the 2012 election it regained control of the government. It holds 285 seats in the lower house and 113 seats in the upper house, and together with Komeito, its coalition partner since 1999, it commands a supermajority in both houses. Prime Minister Yoshihide Suga, former Prime Minister Shinzo Abe and many present and former LDP ministers are also known members of Nippon Kaigi, an ultranationalist and monarchist organization.

The LDP is not to be confused with the now-defunct Democratic Party of Japan (民主党, Minshutō), the main opposition party from 1998 to 2016, or the Democratic Party (民進党, Minshintō), the main opposition party from 2016 to 2017. The LDP is also not to be confused with the 1998-2003 Liberal Party (自由党, Jiyūtō) or the 2016-2019 Liberal Party (自由党, Jiyū-tō). (...)

Beginnings

The LDP was formed in 1955 as a merger between two of Japan’s political parties, the Liberal Party (自由党, Jiyūtō, 1945–1955, led by Shigeru Yoshida) and the Japan Democratic Party (日本民主党, Nihon Minshutō, 1954–1955, led by Ichirō Hatoyama), both right-wing conservative parties, as a united front against the then-popular Japan Socialist Party (日本社会党, Nipponshakaitō), now the Social Democratic Party (社会民主党, Shakaiminshutō). The party won the following elections, and Japan’s first conservative government with a majority was formed by 1955. It would hold majority government until 1993.

The LDP began with reforming Japan's international relations, ranging from entry into the United Nations, to establishing diplomatic ties with the Soviet Union. Its leaders in the 1950s also made the LDP the main government party, and in all the elections of the 1950s, the LDP won the majority vote, with the only other opposition coming from left-wing politics, made up of the Japan Socialist Party and the Japanese Communist Party.

Ideology

The LDP has not espoused a well-defined, unified ideology or political philosophy, due to its long-term government, and has been described as a "catch-all" party. Its members hold a variety of positions that could be broadly defined as being to the right of the opposition parties. The LDP is usually associated with conservatism and Japanese nationalism. The LDP traditionally identified itself with a number of general goals: rapid, export-based economic growth; close cooperation with the United States in foreign and defense policies; and several newer issues, such as administrative reform. Administrative reform encompassed several themes: simplification and streamlining of government bureaucracy; privatization of state-owned enterprises; and adoption of measures, including tax reform, in preparation for the expected strain on the economy posed by an aging society. Other priorities in the early 1990s included the promotion of a more active and positive role for Japan in the rapidly developing Asia-Pacific region, the internationalization of Japan's economy by the liberalization and promotion of domestic demand (expected to lead to the creation of a high-technology information society) and the promotion of scientific research. A business-inspired commitment to free enterprise was tempered by the insistence of important small business and agricultural constituencies on some form of protectionism and subsidies. In addition, the LDP opposes the legalization of same-sex marriage.

by Wikipedia |  Read more:
Image: Wikipedia
[ed. When countries are run like corporations (the LDP has been in power most of my life, how could I not have known this?). See also: the 1955 System link; and Did Japan’s Prime Minister Abe Serve Obama Beefsteak-Flavored Revenge for US Trade Representative Froman’s TPP Rudeness? (Naked Capitalism).]

via:

Mark Lanegan


[ed. See also: The Winding Sheet (full album).]

Thursday, May 6, 2021

Three Club Challenge

You probably have a club in your bag that you love. One that’s as reliable as Congress is dysfunctional. For Kevin Costner in Tin Cup it was his trusty seven iron. For someone like Henrik Stenson, probably his three wood. For me, it’s my eight iron. There are certain clubs, either through experience, ability or default just seem to stand out.

But then there are those clubs that just give us the willies. For example, unlike Henrik I’d put my three wood in that category. I’m convinced no amount of practice will ever make me better with that club. I invariably chunk it or thin it, rarely hitting it straight, but I keep carrying it around because I’m convinced I might need to hit a 165 yard blooper on a 210 yard approach. A friend of mine has problems with his driver. He carries it around off and on for weeks and never uses it because “it’s just not working”. Little wonder.

If you’ve been golfing for a while you’ve probably indulged in the ‘what if’ question. I’m not talking about the misery stories you hear in the clubhouse after a round, those tears-in-the-beers laments of ‘what if I’d only laid up instead of trying to cut that corner’, or, ‘what if I hadn’t bladed that bunker shot into the lake’? Bad decisions and bad breaks. Conversations like those will go on as long as golf exists and really aren’t all that interesting (except, perhaps, for the person drowning their sorrows).

What I’m talking about is a more existential question, one that goes to the heart of every golfer’s game: if you had only three clubs to play with, which ones would you choose? And why?

It’s a fun thought experiment because it makes you think about your abilities from a more distilled perspective: how well do I hit my clubs, and what’s the best combination to get around a course in the lowest possible score?

Maybe you’ve had the chance to compete in a three-club tournament. They’re out there. Once in a while someone puts one together and they sound like a lot of fun. I’ve never had the opportunity to play in one myself, but recently did get the chance to try my own three club experiment with some surprising results.

Caveat: I’m not here to suggest that there’s one right mix of clubs for everyone, but I will say that it’s possible to shoot par golf (or better) with only three golf clubs.

First, some background. I’m an old guy, a senior golfer who’s been playing the game for nearly 25 years. High-single to low-double-digit handicap (I’m guessing, since I don’t keep a handicap). Usually shoot in the low to mid-80s with an occasional excursion into the high 70s.

Lately I’ve been playing on a nice nine hole course that rarely sees more than a dozen golfers at any time, even on the weekends. It’s not an executive course or a goat-track. In fact it’s as challenging a course as any muni, if not more so, and definitely in better condition. The greenskeeping staff keeps it in excellent shape and shares resources with a nearby Nicklaus-designed course. It’s your average really nice nine hole course, and would command premium prices if expanded to 18 holes.

Anyway, because there’s hardly anyone around I usually play three balls, mainly for exercise and practice. I’ve always carried my bag, so it’s easy to drive up, unload my stuff, stick three balls in my pocket and take off.

A while back we had some strong winds. Stiff, persistent winds. I don’t mind playing in wind, but these were strong enough that my stand bag kept falling over when I set it down, and twisting around my body, throwing me off balance and making it hard to walk. I must have looked a bit like a drunk staggering up the fairways (not an uncommon sight on some of the courses I’ve played).

So I decided to dump the bag and play with three clubs.

But which ones? Keep in mind that everyone is different, so the clubs I selected are the ones I thought would work best for me.

To begin with, I realized that two are already taken. First, I’d need a putter. According to Golf Digest and Game Golf, you need a putter roughly 41 percent of the time on average. I don’t know about you, but I’m not going to try putting with a driver, three wood, or hybrid no matter how utilitarian they might be. It just feels too awkward. Perhaps it’s just personal preference, and if that’s not a big deal for you, go for it.

The next club I selected was something that could get me close from 120 yards out, help around the fringe, and get me out of a bunker. No brainer: sand wedge. I thought about a lob wedge but it didn’t have the distance, and a gap or pitching wedge was just too tough out of the sand and didn’t have enough loft for short flops to tight pins.

Finally, the last club in my arsenal. Six iron. Why the six? A number of reasons. First, and probably most important: I’m terrible with my six iron. Not as bad as my three wood, but for some reason the six has always given me problems. Maybe it’s because I’ve never been fitted for clubs and it always stood out as being more difficult than most of the others. I don’t know why, really. In any case, I thought, “Why not get a little more practice and see if I can get this guy under control?” It also has the distance. When I hit it well I can get it roughly 170 yards. Maybe. So that completed the set. My new streamlined self was ready for the wind.

Here’s where it gets interesting. Given that most Par 4s are generally in the 350 – 450 yard range (see here and here) and Par 5s generally about 450-690 yards (see here), it’s not that hard if you’re hitting a 170 yard six iron to get on the green in two shots on shorter Par 4s, and on in three for shorter Par 5s. Even on longer holes if you come up short, you’re still close enough that it’s a sand wedge into the green, usually pitching or chipping from 100 yards or less. Then it’s just a putt for par. Plus, that second or third shot is usually from the middle of the fairway, so there’s an excellent chance you’ll put your wedge shot in a good position. I’ve been pleasantly surprised to find how many pars I can make, and sometimes even a birdie or two, with just three clubs. It all depends on the length of the hole and the accuracy of my chipping and putting. And of course the wind. It’s a great way to get better at iron play and, especially, your short game from 100 yards in.

But there’s more, and here’s where it really gets fun. For various reasons, sometimes I’ll find myself somewhere in the 120 – 160 yard range coming into a green. Too long for a sand wedge but too short for a six iron, so I’ve had to learn to dial it back a bit. Hitting a six iron 140 yards is not that much different than hitting a half swing pitch, but with more control and easier effort. The fun thing is learning how much swing is needed for the various distances within that 40 yard gap. For a while, I’d frequently come up 10 yards short or 10 yards long of the green, but it’s getting better, and again, it’s been another opportunity to sharpen up my short game.

I’ve tried substituting a five iron and even a hybrid for more distance off the tee, but the second shot seems harder to control with less lofted clubs (particularly tough on short Par 3s). Maybe those clubs might work better for other golfers depending on their skill set, but dialing it back is the trickiest part for me. To each his own. The six iron just seemed to strike the right balance. The main thing is finding the right clubs that will give you the greatest accuracy, distance, and control.

Now I have a whole new perspective on the game. Besides being in the fairway more often, I’m hitting more greens in regulation and, when short, still chipping or pitching up to putt for par. I’ve also enjoyed the new sense of creativity. Too often in the past I’d just take whatever club was at the outer limits of my abilities and swing away, full blast (with variable directional and distance control). Now I don’t mind taking a lesser club and swinging easier. To top it off, my iron play and short game have improved considerably. My sand wedge used to be my go-to 80-90 yard club; now it tops out at 115. Six iron went from a shaky 170 to a reliable 170. My putting still stinks. Maybe the pros can dial in pinpoint accuracy with every club, but given the variability I have throughout my bag it’s been much more helpful to just focus on a few clubs and work on improving those. It also speeds up the game considerably.

So, last week I took my full bag out thinking I needed to tune up my driver, three wood and other clubs since I didn’t want those skills to get too rusty. Guess what? I shot worse than I did with my three club setup - mainly because I was all over the fairway and in the woods again. I’m not ready to give up my whole bag yet, but it is gratifying to know that there are still a few new ways to rediscover the game and enjoy new challenges. Give it a try sometime. You might find less is more.

by markk, Duck Soup
Image: markk

Rotation Of Earth Plunges Entire North American Continent Into Darkness

Millions of eyewitnesses watched in stunned horror Tuesday as light emptied from the sky, plunging the U.S. and neighboring countries into darkness. As the hours progressed, conditions only worsened.

At approximately 4:20 p.m. EST, the sun began to lower from its position in the sky in a westward trajectory, eventually disappearing below the horizon. Reports of this global emergency continued to file in from across the continent until 5:46 p.m. PST, when the entire North American mainland was officially declared dark.

As the phenomenon hit New York, millions of motorists were forced to use their headlights to navigate through the blackness. Highways flooded with commuters who had left work to hurry home to their families. Traffic was bottlenecked for more than two hours in many major metropolitan areas.

Across the country, buses and trains are operating on limited schedules and will cease operation shortly after 12 a.m. EST, leaving hundreds of thousands of commuters in outlying areas effectively stranded in their homes.

Despite the high potential for danger and decreased visibility, scientists say they are unable to do anything to restore light to the continent at this time.

"Vast gravitational forces have rotated the planet Earth on an axis drawn through its north and south poles," said Dr. Elena Bilkins of the National Weather Service. "The Earth is in actuality spinning uncontrollably through space."

Bilkins urged citizens to remain calm, explaining that the Earth's rotation is "utterly beyond human control."

"The only thing a sensible person can do is wait it out," she said.

Commerce has been brought to a virtual standstill, with citizens electing either to remain home with loved ones or gather in dimly lit restaurants and bars.

"I looked out the window and saw it getting dark when I was still at the office working," said Albert Serpa, 27, a lawyer from Tulsa, OK, who had taken shelter with others at Red's Bar and Grill. "That's when I knew I had to leave right away."

Ronald Jarrett, a professor of economics at George Washington University who left his office after darkness blanketed the D.C. metro area, summed up the fears of an entire nation, saying, "Look, it's dark outside. I want to go home," and ended the phone interview abruptly.

Businesses have shut their doors, banks are closed across the nation, all major stock exchanges have suspended trading, and manufacturing in many sectors has ceased.

Some television stations have halted broadcasting altogether, for reasons not immediately understood.

Law-enforcement agencies nationwide were quick to address the crisis.

Said NYPD spokesman Jake Moretti: "Low-light conditions create an environment that's almost tailor-made for crime. It's probably safe to say we'll make more arrests in the next few hours than we have all day."

Darkness victims describe hunger pangs, lassitude, and a slow but steady loss of energy, forcing many to lie down. As many as two-thirds of those believed afflicted have fallen into a state of total unconsciousness.

Many parents report that their younger children have been troubled, even terrified, by the deep darkness. To help allay such fears, some parents are using an artificial light source in the hallway or bedroom.

As of 2 a.m. EST, the continent was still dark, the streets empty and silent. However, some Americans remained hopeful, vowing to soldier on despite the crisis.

"I don't plan on doing anything any different," said Chicago-area hospice worker Janet Cosgrove, 51. "I'm going to get up in the morning and go to work."

by The Onion |  Read more:
Image: Satellite view at 4:50 p.m. EST shows sun disappearing from the sky. Houston-area victims flee their workplaces ahead of the growing wave of darkness.


Katherine Lam
via:

Bad Bunny

Tracking Transparency: The Problem With Apple’s Plan to Stop Facebook’s Data Collection

On Monday, Apple released iOS 14.5, the smartphone update that Facebook fears.

The operating system upgrade includes improvements like a Face ID that’s better attuned to face masks, as well as more convenient interoperability between Siri and various music apps. What’s generating the most discussion, though, is a tool called App Tracking Transparency, which allows users to prevent apps from sharing identifiable personal data with third parties. The tool, billed as a major step forward for user privacy, could roil the digital ads industry, whose major players often track users as they move between apps on their phones. The update should’ve shown up on your iPhone by now; if you weren’t prompted to download it, go to “Software Update” under your general settings. It’s a milestone for the consumer web and a possible blow to social media’s business model, which depends on selling highly personalized advertising. There’s one hitch: Even when you turn it on, you might not notice a single thing has changed.

Whenever you download an app using iOS 14.5, a notification will appear asking whether you want to allow it to “track your activity across other companies’ apps and websites.” You can either select “Ask App Not to Track” or “Allow.” You can also opt in or out of tracking for an app at any time by navigating to the “Privacy” menu in the device’s settings and clicking on “Tracking.” From there you’ll see a list of apps alongside switches you can toggle to turn the tool on or off. Asking an app not to track you means that it isn’t allowed to share any of the identifiable location, contact, health, browsing history, or other info it has collected on you with advertisers, data brokers, or anyone else who might be interested in learning more about you. This should prevent, say, Facebook from serving you ads for grills based on the fact that you were searching for them on Chrome. Apps won’t be able to combine data they gather on you with information collected elsewhere by third parties.
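On the developer side, that prompt is triggered through Apple's AppTrackingTransparency framework. Here is a minimal sketch of the call an app makes under iOS 14.5; the usage string and the handling of each status are illustrative choices, not anything prescribed by the article:

```swift
import AppTrackingTransparency
import AdSupport

// The app must declare an NSUserTrackingUsageDescription string in Info.plist
// explaining why it wants to track; the system shows that text in the prompt.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Only now may the app read the advertising identifier (IDFA)
            // and share data with third parties for tracking.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // The IDFA is returned as all zeros and cross-app tracking is off limits.
            print("Tracking not allowed")
        @unknown default:
            print("Unknown tracking authorization status")
        }
    }
}
```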

It might seem like a fairly unremarkable feature, but App Tracking Transparency has the potential to reorient users’ relationships with their personal data, primarily by making the tracking opt-in. Prior to iOS 14.5, users did have the ability to limit the data that apps shared, but the default was to allow tracking, and you had to proactively check the settings to turn it off. Having the apps themselves ask this question upfront is an important aspect of the shift. “It’s not just giving users the choice,” said Gennie Gebhart, acting activism director of the Electronic Frontier Foundation, the internet-rights advocacy group. “It’s forcing these app developers to ask permission and possibly just stop tracking preemptively so they don’t have this scary permission associated with their app.” There’s a whole industry built around targeting ads using personal data, and if enough people start regularly opting out of tracking, Apple’s new tool could frustrate many of the businesses in this space. Facebook, in particular, is expecting the tool to have a small but noticeable effect on its revenue and has been taking out full-page ads characterizing Apple’s move as hurting small businesses. During the company’s quarterly earnings call in January, Facebook CEO Mark Zuckerberg accused Apple of trying to “use their dominant platform position to interfere with how our apps and other apps work.”

However, if you turn tracking off for everything, will there be any actual differences in how you use your apps or what ads you see? According to Gebhart, it might not be so clear-cut. For instance, if you’re searching for grills on Chrome and then see ads for them on Facebook, that doesn’t necessarily mean that the tool isn’t doing its job. The ad-targeting ecosystem is so complex and multilayered that there are a number of ways that Facebook could determine that you want a grill just from your activity through first-party tracking on its own app. The platform could gather that you’re a middle-aged man who just bought a house in the suburbs at the beginning of the summer and determine that it’s likely that you’re in the market for a grill without ever looking at your Chrome browsing history. In other words, there probably won’t be any noticeable signs indicating that the tool is or isn’t working. You’ll know in theory that apps shouldn’t be swapping your data amongst themselves, but it likely won’t look all that different in practice.

This opacity is partly meant to make the user experience simpler. Being constantly exposed to the under-the-hood mechanics of the apps on your phone could be overwhelming. At the same time, though, keeping all this tracking hidden serves to obscure just how much of your personal info your apps are collecting and sharing. Apple’s new tool adds some more transparency—and thus friction—back into the equation, but there’s still a lot you won’t really be able to see. “There’s so much going on under the surface that advertisers and data brokers don’t want you to see, and all of the sudden Apple is forcing some of that above the surface, but it’s hard to say what to look for to know whether App Tracking Transparency is working,” Gebhart said, adding that users will continue to be at a disadvantage in trying to maintain their privacy online because of this dramatic information asymmetry. Still, one change that could result from Apple’s move is more chicanery from the data-hoovering business, which will need to find more creative ways to build profiles of internet users that can help advertisers target consumers. That’s why Gebhart says she’ll track which apps, if any, Apple decides to kick off its store for violating the tracking rules and any changes in strategy that companies in the digital ads industry are making.

With Apple acting as an unofficial regulator of user privacy, how will companies that want your data cope? It helps to understand how Apple’s update works.

by Aaron Mak, Slate |  Read more:
Image: Loic Venance/AFP via Getty Images
[ed. See also: Apple robbed the mob’s bank (Mobile Dev Memo).]

Wednesday, May 5, 2021

Decision Fatigue: Inside Netflix’s Quest to End Scrolling

Ten years ago, Netflix got the idea that its app should work more like regular TV. This was early on in its transition from DVD delivery to streaming on demand, and product engineers at the company were still figuring out how the platform’s user interface would work. Instead of having subscribers start their streaming sessions scrolling through rows and rows of content, they wondered what would happen if a show or movie simply began playing as soon as someone clicked into the app — you know, like turning on your dad’s boxy old Zenith.

“It was the early days of Netflix having a television user interface, and we saw this as a great possibility,” says Todd Yellin, who as vice-president of product for the streaming giant helps shape how users interact with the platform. They liked the idea so much, they quietly tested it out among a small slice of subscribers.

But users weren’t impressed. “It failed,” the 14-year Netflix vet tells me. “The technology wasn’t as good. And I don’t think consumers were ready for it.”

Netflix believes audiences are ready now. Today, the company is launching Play Something, a new viewing mode designed to make it easier for the indecisive among us to quickly find something to watch. As with those early forays into instant-playing content, the goal of this new shuffle feature is to eliminate, or at least ease, the Peak TV-era anxiety so many of us feel while trying to find something to watch. But unlike its past attempt, it won’t be automatic: You’ll have to opt in — either at start-up or when you’re browsing your home page. If you do, the usual page upon page of box art and show descriptions disappears. Instead, the Netflix matrix chooses something it thinks you’ll be into and just starts streaming it, along with an onscreen graphic briefly explaining why it chose that title. Don’t like what you see? A quick button press skips ahead to another selection. If you suddenly decide an earlier selection is actually a better pick, you can also go backward. (The feature will initially be available on all Netflix TV apps and, soon, on mobile for Android devices.)

by Josef Adalian, Vulture | Read more:
Image: Martin Gee

FDA Will Authorize Pfizer COVID Vaccine for Ages 12 and Older

The Food and Drug Administration is preparing to authorize the use of the Pfizer-BioNTech coronavirus vaccine for adolescent kids aged 12-15 years within a week, according to federal officials who spoke with the New York Times. If and when that happens, it would mark the first time any coronavirus vaccine has been authorized for emergency use for Americans under the age of 16 — a long-awaited development for countless parents in the U.S., as well as the beginning of a new phase in the country’s vaccine rollout.

Pfizer has previously reported that its vaccine was found to be as effective in the 12-15 age group as it was in adults, with no additional side effects, in its clinical trial involving adolescents. (The side effects were in line with what recipients aged 16-25 experienced.) FDA authorization of the vaccine for the new age group would likely be followed by a quick review from the Centers for Disease Control and Prevention’s vaccine advisory panel, as was the case for other previous vaccine authorizations. After the panel makes its recommendation, vaccine administration sites would likely be able to start giving out doses to adolescents immediately.

As the Times notes, amid consistent supply and weakening demand, there is currently a vaccine surplus in the U.S., including some 31 million doses of the Pfizer vaccine that have already been delivered throughout the country — by itself roughly enough to fully vaccinate every adolescent. While all people 16 and older are currently eligible to receive a COVID vaccine, it’s not clear if all states would immediately expand that eligibility to ages 12 and over following the FDA authorization of the Pfizer vaccine.

by Chas Danner, Intelligencer |  Read more:
Image: Frederic J. Brown/AFP via Getty Images

Higher Ed 2.0 (What We Got Right/Wrong)

Higher education is the most important industry in America. It’s the vaccine against the inequities of capitalism, the lubricant of upward economic mobility, and the midwife of gene therapies and search engines. College graduates lead longer, healthier, and more prosperous lives. University research provides the raw material for corporate innovation. Our institutions call to our shores the best and the brightest from around the world, many of whom stay to make America a stronger nation, and form connective tissue with their home countries.

Higher ed presents a promise for America and the world. But every year, my industry falls further from that promise. The underpinnings of that view are simple and static: Higher ed has increased its prices by 1,400 percent since the late 1970s, but the product has not appreciably changed — the biggest cost driver is administrative bloat. And, rather than catalyzing economic mobility, U.S. higher ed is stifling it. At 38 of the top 100 colleges in America, including five of the Ivies, there are more students from the top 1 percent of income earning households than there are from the bottom 60 percent.

Yet, higher ed’s brand strength (nobody donates $20 million to name a building on Apple’s campus), cash reserves (among elite institutions), and stranglehold on the American psyche have insulated it from disruption for decades. The litmus test for success — and for determining your role in our species — is if (and which) college your daughter graduates from. Suzy might be a heroin addict, but if she’s at Dartmouth, all is forgiven.

Last year, we predicted the pandemic would be the fist of stone that finally meets higher ed’s glass jaw. Were we correct regarding the coming storm? Or were we alarmist, underestimating institutional strength (or rather, inertia)? The answer is … yes.

K-Shaped (College) Recovery

As we begin to emerge from Covid-19 (God’s ears), American colleges look like the rest of America. Specifically, the fifth installment of the Hunger Games franchise, wherein our society engages in the idolatry of economic winners, everyone else hopes to survive, and many meet a gruesome end.

The most selective schools received 17 percent more applications than last year, and the elite of the elite saw even greater increases: Yale, 33 percent; Harvard, 43 percent; MIT, 66 percent. The rich are getting richer. (...)

Admission to a top school can be life changing, but in a country that graduates over 3.5 million people from high school every year, the 1,700-person freshman class at Harvard is immaterial. Over the past 30 years, the number of seats at Ivy League schools has increased only 14 percent, while the number of high school graduates has expanded by 44 percent. The Ivy League sits on a total endowment of $140B, and shares this wealth with just 17,000 freshmen each year. (...)


A year ago, I predicted that these schools would leverage their extraordinary brand strength and the forced adoption of distance learning technology to partner with big tech and radically expand their class sizes. “In 10 years,” I told New York magazine, “it’s feasible to think that MIT doesn’t welcome 1,000 freshmen to campus; it welcomes 10,000.”

I was wrong. The leadership and alumni of elite universities continue to register tremendous psychic reward from the Hermès-ification of their institutions, versus any remaining claim of public service.

Wildlings

While the lords of American higher ed fortify their walls, and the erasures of second-tier castles multiply, there is a gathering force about to ignite a fire the North has never seen. Last year, I believed the change would occur on the supply side, with expanded enrollments at the top schools. It now looks as if the demand side will change the game: Employers are rethinking certification.

From Elon to Apple, some of the most admired employers are dropping the college degree requirement from more and more jobs. Over 130 companies have committed to accepting Google-certified courses as equivalent to credits earned at traditional four-year colleges. More than 250,000 people took Google’s IT certificate program in its first two years — and 57 percent of them did not have a college degree.

Similarly, Amazon announced a partnership with Lambda School, launching a nine-month, full-time training program in software engineering, designed to put graduates in jobs with an average base salary of $80,000. Students are not obligated to pay anything upfront; instead, they pay based on their salary after graduation. Again: no college degree required.

Here in Florida, Governor DeSantis has proposed using $75 million in federal Covid relief funds to invest in vocational training programs. There’s a scarcity of skilled tradespeople, and those are good paying, secure jobs — it will be a long time before robots replace electricians and plumbers.

This is bad news for schools without the global brand equity of the elites. They are being unbundled, piece by piece, just as newspapers were dissected (classifieds, movie listings, news, sports) and sold for parts to benign billionaires. Today, we have a handful of elite newspapers that are thriving, a wasteland of dead/dying second-tier papers, and a roiling maelstrom of tech-enabled news sources serving the mass market. Since 2004, two thousand newspapers have shuttered. It could be as bloody among universities.

by Scott Galloway, No Mercy No Malice |  Read more:
Image: Prof G Analysis of Ivycoach Data

Tuesday, May 4, 2021

What “Denuclearization of the Korean Peninsula” Really Means.

The Biden administration has finally announced its North Korea policy after having completed a three-month long policy review. And the policy it has decided on is a bit of a surprise.

That’s because its stated end goal is apparently the same as Pyongyang’s: “The denuclearization of the Korean Peninsula.”

That specific phrase, “denuclearization of the Korean Peninsula,” is one the North Korean government likes to use a lot. To Pyongyang, it means that it is willing to dismantle its nuclear program if and only if South Korea also denuclearizes.

But South Korea doesn’t have any nuclear weapons. What it does have is what’s called the US “nuclear umbrella.” That basically means the US promises to defend South Korea from the North — up to and including the use of US nuclear weapons. (There are also currently 28,500 US troops stationed in South Korea to defend it from potential aggression from the North.)

So what North Korea understands from this phrase is, “Sure, we’ll give up our nukes, just as soon as you (President Biden) withdraw all US military support for South Korea.”

That is a very different end goal from the one the US had traditionally sought, at least until President Donald Trump came along: the “denuclearization of North Korea.” That phrasing implies Pyongyang is the only one that must make nuclear concessions. It’s an end goal that would see North Korea give up all of its nuclear weapons while South Korea is still under US nuclear protection.

If you’re Kim Jong Un, that’s one hell of a difference. In the first scenario, he gives up his nuclear arsenal, which makes his regime more vulnerable, but it’s offset by the US no longer supporting South Korea with its nukes, either. In the second scenario, he just gives up his nukes, making his regime more vulnerable. Period.

So the difference of a couple of words here isn’t just semantics — the wording really, really matters.

Which brings us to another question: Why would the Biden administration adopt wording on the perennially thorny nuclear issue that North Korea likes?

Partly because it might make Kim happy — and that’s potentially a good thing.

Three reasons why Biden probably adopted North Korea’s favorite nuclear phrasing

When Biden announced that US troops would be leaving Afghanistan, he noted that one reason for his decision was that former President Trump made a deal with the Taliban that a full withdrawal would happen.

“It is perhaps not what I would have negotiated myself, but it was an agreement made by the United States government, and that means something,” the president said.

Biden may have come to a similar conclusion here. In 2018, Trump met with Kim in Singapore and signed a declaration stating that they would “work towards the complete denuclearization of the Korean Peninsula.”

That may not be the formulation Biden would like, but it is the latest US-North Korea deal on the books, and so he may have honored that. Anything else would look like a unilateral backtrack in the diplomatic process, experts said.

“It’s the right formulation to use because both sides agreed to it,” said Vipin Narang, an expert on North Korea’s nuclear program at MIT.

South Korea’s President Moon Jae-in also adopted this phrase in his yearslong efforts to broker a deal between Pyongyang and Washington. He supported it in an April interview with the New York Times, noting that it was “clearly an achievement” for Trump and Kim to have met in Singapore and signed an agreement.

Moon is coming to the White House on May 21, and it would’ve been awkward if the US dropped the formulation he backs just weeks before his arrival.

by Alex Ward, Vox |  Read more:
Image: Brendan Smialowski/AFP/Getty Images
[ed. They should just get on with it. In this day and age, nobody is going to nuke anybody (Israel excluded). And if a nuclear war did break out, a broken treaty would be the least of our concerns. North Korea is suffering economically and this is a prime opportunity.]

Monday, May 3, 2021

FDA To Ban Menthol Cigarettes



FDA Commits to Evidence-Based Actions Aimed at Saving Lives and Preventing Future Generations of Smokers (FDA)
Image: The Onion

Why Power Is Getting Woke


The U.S. ambassador to the United Nations is one of the nation’s top diplomats, second only to the Secretary of State. And one of the most basic, fundamental duties of a diplomat, especially at such a high level, is to manage the image and reputation of a country throughout the rest of the world. And that’s no small or superficial task — a country’s reputation and public image are a core part of its foreign policy. Often referred to in more academic circles as “soft power,” the positive attitudes and feelings that both the leaders and average citizens of foreign nations have toward a country’s culture, people, and political system are one of its major tools for influencing global affairs and managing international conflicts. And it’s generally believed that the more soft power a country has, the less it has to rely on more coercive “hard power” measures such as military action and economic sanctions.

So with that in mind, consider the recent controversy-generating comments made by President Biden’s newly appointed UN ambassador, Linda Thomas-Greenfield, in a remote address:
I have seen for myself how the original sin of slavery weaved white supremacy into our founding documents and principles. But I also shared these stories to offer up an insight, a simple truth I’ve learned over the years: Racism is not the problem of the person who experiences it. Those of us who experience racism cannot, and should not, internalize it, despite the impact it can have on our everyday lives. Racism is the problem of the racist. And it is the problem of the society that produces the racist. And in today’s world, that’s every society. In America that takes many forms. It’s the white supremacy that led to the senseless killing of George Floyd, Breonna Taylor, Ahmaud Arbery, and so many other black Americans. It’s the spike in hate crimes over the past three years against Latino Americans, Sikh, Muslim Americans, Jewish Americans, and immigrants. And it’s the bullying, discrimination, brutality, and violence that Asian Americans face every day, especially since the outbreak of COVID-19. That’s why the Biden administration has made racial equity a top priority across the entire government. And I’m making it a real focus of my tenure at the U.S. Mission to the United Nations.
The reason I highlight these comments now is not to litigate them directly, however controversial they are. And they indeed remain controversial, on multiple levels, despite what feels like a thorough cultural revolution in every institutional power center in the country. But I think in this instance there is far more to be learned by setting aside that debate, resting your culture war trigger finger for a moment, and instead asking the question: What on earth could possess America’s UN ambassador to decide to broadcast the message that America is a deeply racist country down to its bones? Even if you were to accept some, or maybe even all, of what she’s saying as being true, what sort of powerful incentives could convince a top diplomat to engage in a highly controversial debate that not only has nothing to do with her job, but is in fact quite literally the opposite of her job description? How did we get here?

Wokeness isn’t radical, it’s repressive

One thing to know about Ambassador Thomas-Greenfield is that she isn’t exactly a radical activist storming the gates of power. She is power. Prior to becoming the UN ambassador, Thomas-Greenfield was a Senior Vice President at a “global business strategy” firm called Albright Stonebridge Group, an international lobbying and PR firm for multinational corporations and financial institutions, and perhaps the most powerful and influential of its kind. It was founded and is chaired by former Clinton Secretary of State Madeleine Albright. The main service it offers is to help the world’s wealthiest and most powerful corporate and non-governmental entities navigate political and regulatory hurdles in foreign countries (often very poor countries: Thomas-Greenfield headed their Africa practice).

But perhaps an even better example of this phenomenon is the very man who appointed Thomas-Greenfield: President Biden. In the nearly half century he spent in political office before assuming the presidency, Joe Biden wasn’t exactly known for being a social justice crusader. From his tough-on-crime focus, anti-busing stance, and drug-war hawkishness to his open affection for colleagues who were former segregationists and his trademark habit of making politically incorrect gaffes, Biden was about as unwoke a politician as they came. But that all now seems to have completely reversed: It feels as if not a day goes by when Biden isn’t heard decrying the country’s “systemic racism” while branding every new initiative as a step toward achieving the newly fashionable concept of “equity.”

And while he might be the most high-profile (and perhaps most consequential) example, Biden’s overnight conversion characterizes a broader elite “awokening” that seems to have infected nearly an entire leadership class. What could explain all of it?

It’s extremely common to find critics of “wokeness” and critical race theory decrying it as a radical activist- and academic-driven plot to upend the basic foundations of American society. And while it certainly does like to present itself that way, and is even believed to be exactly that by its most true believers, it’s an analysis that fails to explain why every Fortune 500 company, establishment politician, media executive, and entertainer has become an evangelist for these ideas. After all, radicalism is about threatening and upending existing power structures. What we’re seeing now is quite the opposite. Far from seeing these ideas as a threat, the existing power structure is enthusiastically adopting them as something of a ruling-class ideology. So unless you think all of these people are critical theory sleeper cells who are just now being awakened to carry out a plot decades in the making, the more likely explanation is that not only are these ideas compatible with power, but something about them must actually lend itself to protecting and even enhancing that power. In other words, it’s an ideology that seems much more suited not to radicalism, but to the opposite: repression.

The privatization of politics

One of the most conspicuous things about woke politics is that it politicizes everything. It inserts politics into every space, interaction, and relationship. It problematizes, deconstructs, and dismantles. It calls out and it cancels. And above all, it personalizes politics. But in doing so, it redefines politics itself away from something that takes place in the public sphere — as a way of taking collective action to solve public problems and hold powerful people and institutions accountable — and instead into a matter of personal morality, behaviors, and actions. It privatizes, diffuses, and decentralizes politics. Something we used to do collectively, with defined common purposes and clear objectives, is increasingly becoming something we do in the office, with our friends and family members, or while sitting alone at home on the internet.

Woke politics makes politics less about what powerful people do, and more about what everyone does. Sometimes it’s even about what dead people did, in which case we might take down a statue if there is one, or just call for a “reckoning” (whatever that is).

But at a certain point, this stops looking like politics at all, and instead a sort of “anti-politics” — something that diverts energy and attention away from traditional political activity and toward something completely different. And when you see the most powerful people in society, from CEOs to elected officials — the people for whom politics is explicitly an accountability and power-limiting mechanism — championing and encouraging this trend, it has to make you wonder at least a little bit: Maybe the point of politicizing everything is to make you forget what actual politics is?

by Shant Mesrobian, Inquire | Read more:
Image: uncredited
[ed. Interesting perspective (as are the comments to this article). Something is clearly up, although I'm not sure what, exactly. The late Harvard law professor Derrick Bell coined the term “interest convergence” to note that the interests of Black people in achieving racial equality are accommodated only when they converge with the interests of white people. So what are those interests now? Simple voter demographics? Virtue signaling, tribal identification, marketing, pacification, distraction, accommodation, redirection? Personally, I don't think policing speech or behavior and shaming everyone into submission is a good strategy for winning minds, but as the author suggests, it might be great for obfuscating the real levers of power (or pushing the pendulum in an opposite direction). See also (for a good laugh): Letter from former Jeopardy! contestants regarding offensive terminology and gesture.]

Rubén Blades

Rubén Blades, a Salsa Legend, Swings in a Different Direction: Jazz (NYT)

Sunday, May 2, 2021

Fulfillment: Winning and Losing in One-Click America

“Get big fast” was an early Amazon motto. The slogan sounds like a fratty refrain tossed around at the gym. Jeff Bezos had it printed on T-shirts. More than twenty-five years after Bezos left his position as a Wall Street hedge-fund executive to found Amazon, his size anxiety is long gone. (At least as it pertains to his company; the CEO’s Washington, DC, house has eleven bedrooms and twenty-five bathrooms, a bedroom-to-bathroom ratio that raises both architectural and scatological questions.) Bezos is now worth $180 billion. Amazon, were it a country, would have a larger GDP than Australia.

Such numbers are nonsensically large—there’s no way to make them stick. But in 2017, Bezos demonstrated what they mean. That was the year the company conducted a nationwide sweepstakes to choose a location for its second headquarters, or HQ2, as it was called. Seattle was already a company town: Amazon had more than 40,000 employees there, and as much of the city’s office space as the next forty largest employers combined. It was time to take over a new city.

Local and state governments raced to undercut each other. It wasn’t only tax credits that in some locations amounted to over $1 billion; the subsidies offered to Amazon were a display of abject creativity. Bezos is a Trekkie, so Chicago had Star Trek star William Shatner narrate the city’s pitch video. Tucson, Arizona, sent a giant saguaro cactus to Amazon headquarters. Sly James, Kansas City’s mayor at the time, bought and reviewed one thousand Amazon products, giving every item five stars. But the locations Bezos selected—New York City and Northern Virginia—were always going to win. Together, the chosen bids gave the company over $3 billion in tax incentives and grants.

The spectacle was about more than financial benefits; the company sought to flex its power over elected officials. Amazon had engaged in these displays before, such as when it threatened to move jobs out of Seattle after the city council passed a law that would’ve taxed large employers for each employee earning over $150,000 to fund homelessness-outreach services. Indeed, New York had its winning status revoked after a coalition of working-class organizations, left-leaning politicians, and pissed-off residents made too much noise about the downsides of hosting HQ2. The point is to beg on one’s knees; ingratitude is disqualifying.

But the contest was also about Amazon’s lifeblood: data. The company learned exactly what each location was willing to give up. It received a precise picture of the strengths, weaknesses, and points of resistance in each corner of the nation. How many NDAs would Alabama officials sign? What did Boston’s elected officials think the region’s future looked like? How many young people in Columbus were entering the workforce each year? How low would Orlando go? The HQ2 affair was a national demonstration of fealty to a private corporation by publicly elected officials. Sure, everyone already knew Amazon was powerful, but this was different: a corporate entity told politicians to jump, and they asked “How high?” How did this happen?

Alec MacGillis’s Fulfillment: Winning and Losing in One-Click America goes a long way toward answering that question. MacGillis, a reporter for ProPublica, investigates the country left in Amazon’s wake, crisscrossing the United States from what he calls the winning cities to those regions on the losing side. His contention is that corporate concentration creates geographic bifurcation. Places like Boston, New York, Washington, DC, the Bay Area, and Seattle win the lottery, attracting an influx of well-heeled residents. Capital divests from other areas—midsize cities in the Midwest, much of the Rust Belt—leaving them hollowed out. Low-wage jobs and drug addiction replace union benefits. But the division is internal to the winning regions too. Poor residents within the lucky cities suffer. Desperation reigns, expectations are lowered, and elected officials become increasingly willing to give Amazon anything in exchange for the promise of jobs.

Using the tech giant as a focal point allows MacGillis to show that this state of affairs was a choice, not an inevitability. It’s not that “good jobs left”; the transformation of work was engineered. Fulfillment meticulously documents how that process plays out, with the fate of millions haggled over by a handful of people in tucked-away conference rooms.

When Amazon wanted to build two new warehouses in 2015, it reached out to JobsOhio, a private nonprofit created by then-governor John Kasich to oversee negotiations over tax incentives with companies. “Every month,” writes MacGillis, “a board called the Ohio Tax Credit Authority approved the incentives negotiated by Jobs-Ohio.” On July 27, 2015, it was Amazon’s turn to meet with the tax board. The company promised two thousand full-time jobs. In exchange, it wanted a fifteen-year tax credit worth $17 million in addition to a $1.5 million cash grant from the state liquor-monopoly profits controlled by JobsOhio. The result? “The board approved the credit 4-0.”

Surveying Amazon’s operations in the state, MacGillis writes that “the company had, in a sense, segmented its workforce into classes and spread them across the map: there were its engineering and software-developer towns, there were the data-center towns, and there were the warehouse towns.” Amazon chose the Columbus area as its location for Amazon Web Services US East, and picked three towns north of the city for its centers. Hilliard, Dublin, and New Albany were “the right sort of exurban communities to target: wealthy enough to support good schools for employees’ kids, but also sufficiently insecure in their civic infrastructure and identity to be easy marks.” The company secured its standard extractions: a fifteen-year exemption from property taxes—worth around $5 million for each data center—accelerated building permits, and waived fees. It required the community to sign NDAs before negotiations could even begin. Dublin threw in sixty-eight acres of farmland for free, and a guarantee that the company did not have to contract with union labor.

The warehouses, by contrast, are south and east of the city, areas poorer than those in the north. The sites in Obetz and Etna are near I-270 and “close enough to the struggling towns of southern and eastern Ohio to be in plausible reach of a long commute for those desperate enough to undertake it.” Amazon’s warehouses come with the standard suite of exemptions—even though ambulances and fire trucks are often called to the locations, which one investigation showed have twice the rate of serious injuries as other warehouses, the company does its best to avoid paying taxes for emergency services.

Tax avoidance is foundational to the company’s empire. MacGillis enumerates a long, damning list of the company’s schemes:
There was the initial decision to settle in Seattle to avoid assessing sales tax in big states such as California. There was the decision to hold off as long as possible on opening warehouses in many large states to avoid the sales taxes there. Amazon employees scattered around the country often carried misleading business cards, so that the company couldn’t be accused of operating in a given state and thus forced to pay taxes there. In 2010, the company went so far as to close its only warehouse in Texas and drop plans for additional ones when state officials pushed Amazon to pay nearly $270 million in back sales taxes there, forcing the state to waive the back taxes. By 2017, the company had even created a secret internal goal of securing $1 billion per year in local tax subsidies.
This is predictable behavior for a company run by a man whose focus has always been on getting as rich as possible. But the government’s support for this cause testifies to its class character. A capitalist state takes its cues from executives. When a region has mostly low-paying work and little in the way of a social-welfare net, it’ll beg employers for jobs with a higher wage (even if that wage is below the industry average, as is true of Amazon’s warehouses). There is no neutrality, only officials groveling at Bezos’s feet, deferring to his fake-business-card-carrying minions. Working-class immiseration is the direct result.

by Alex Press, Bookforum |  Read more:
Image: Farrar, Straus and Giroux