Thursday, October 31, 2019

Joe Jackson


Pretty soon now, you know I'm gonna make a comeback
And like the birds and the bees in the trees it's a sure-fire smash
I'll speak, to the masses through the media
And if you got anything to say to me you can say it with cash

'Cause I've got the trash
And you got the cash
So baby we should get along fine
So give me all your money
'Cause I know you think I'm funny, yeah
Can't you hear me laughing
Can't you see me smile
I'm the man

Lyrics:

What’s Wrong With the New Figurative Painting?


To be clear, what bothered me was not the spectacle of bad work, of which there’s always plenty around. Instead, the problem had to do with good work that still didn’t seem as good as it could have been, art that engaged my interest but left me unsatisfied. “What’s wrong,” I kept asking myself. “What’s the problem?” My silent answer was, repeatedly, “This work is academic.” Or rather, “There’s something academic in this work.”

What’s Wrong With the New Figurative Painting? (The Nation)
Image: Doron Langberg, Daniel Reading (2019)

Peter Luger Used to Sizzle. Now It Sputters.

I can count on Peter Luger Steak House in Brooklyn to produce certain sensations at every meal.

There is the insistent smell of broiled dry-aged steak that hits me the minute I open the door and sometimes sooner, while I’m still outside on the South Williamsburg sidewalk, producing a raised pulse, a quickening of the senses and a restlessness familiar to anyone who has seen a tiger that has just heard the approach of the lunch bucket.

There is the hiss of butter and melted tallow as they slide down the hot platter, past the sliced porterhouse or rib steak and their charred bones, to make a pool at one end. The server will spoon some of this sizzling fat over the meat he has just plated, generally with some line like “Here are your vitamins.”

There is the thunk of a bowl filled with schlag landing on a bare wood table when dessert is served, and soon after, the softer tap-tap-tap of waxy chocolate coins in gold foil dropped one at a time on top of the check.

And after I’ve paid, there is the unshakable sense that I’ve been scammed.

The last sensation was not part of the Peter Luger experience when I started eating there, in the 1990s. I was acutely aware of the cost back then because I would settle the tab by counting out $20 bills; cash was the only way to pay unless you had a Peter Luger credit card. At the end of the night my wallet would be empty. Because a Peter Luger steak made me feel alive in a way that few other things did, I considered this a fair trade, although I could afford it only once a year or so.

I don’t remember when the doubts began, but they grew over time.

Diners who walk in the door eager to hand over literal piles of money aren’t greeted; they’re processed. A host with a clipboard looks for the name, or writes it down and quotes a waiting time. There is almost always a wait, with or without a reservation, and there is almost always a long line of supplicants against the wall. A kind word or reassuring smile from somebody on staff would help the time pass. The smile never comes. The Department of Motor Vehicles is a block party compared with the line at Peter Luger. (...)

The servers, who once were charmingly brusque, now give the strong impression that these endless demands for food and drink are all that’s standing between them and a hard-earned nap. Signals that a customer has a question or request don’t get picked up as quickly; the canned jokes about spinach and schlag don’t flow as freely.

Some things are the same as ever. The shrimp cocktail has always tasted like cold latex dipped in ketchup and horseradish. The steak sauce has always tasted like the same ketchup and horseradish fortified by corn syrup.

Although the fries are reasonably crisp, their insides are mealy and bland in a way that fresh-cut potatoes almost certainly would not be. The sole — yes, I’m the person who ordered the sole at Peter Luger — was strangely similar: The bread crumbs on top were gold and crunchy, but the fish underneath was dry and almost powdery.

Was the Caesar salad always so drippy, the croutons always straight out of a bag, the grated cheese always so white and rubbery? I know there was a time the German fried potatoes were brown and crunchy, because I eagerly ate them each time I went. Now they are mushy, dingy, gray and sometimes cold. I look forward to them the way I look forward to finding a new, irregularly shaped mole.

Lunch one afternoon vividly demonstrated the kitchen’s inconsistency: I ordered a burger, medium-rare, at the bar. So had the two people sitting to my right, it turned out. One of them got what we’d all asked for, a midnight-dark crust giving way to an evenly rosy interior so full of juices it looked like it was ready to cry. The other one got a patty that was almost completely brown inside. I got a weird hybrid, a burger whose interior shaded from nearly perfect on one side to gray and hard on the other. (...)

Luger is not the city’s oldest steakhouse, but it’s the one in which age, tradition, superb beef, blistering heat, an instinctive avoidance of anything fancy and an immensely attractive self-assurance came together to produce something that felt less like a restaurant than an affirmation of life, or at least life as it is lived in New York City. This sounds ridiculously grand. Years ago I thought it was true, though, and so did other people.

The restaurant will always have its loyalists. They will laugh away the prices, the $16.95 sliced tomatoes that taste like 1979, the $229.80 porterhouse for four. They will say that nobody goes to Luger for the sole, nobody goes to Luger for the wine, nobody goes to Luger for the salad, nobody goes to Luger for the service. The list goes on, and gets harder to swallow, until you start to wonder who really needs to go to Peter Luger, and start to think the answer is nobody.

by Pete Wells, NY Times |  Read more:
Image: Ellen Silverman for The New York Times
[ed. I've never been to New York City or heard of Luger's, and I don't particularly care. I just like reading a good Pete Wells takedown (although all his restaurant reviews are generally interesting). See also: How a Food Critic Plots His Pans (NY Times).]

New Atheism: The Godlessness That Failed

Thucydides predicted that future generations would underestimate the power of Sparta. It built no great temples, left no magnificent ruins. Absent any tangible signs of the sway it once held, memories of its past importance would sound like ridiculous exaggerations.

This is how I feel about New Atheism.

If I were to describe the power of New Atheism over online discourse to a teenager, they would never believe me. Why should they? Other intellectual movements have left indelible marks in the culture; the heyday of hippiedom may be long gone, but time travelers visiting 1969 would not be surprised by the extent of Woodstock. But I imagine the same travelers visiting 2005, logging on to the Internet, and holy @#$! that’s a lot of atheism-related discourse what is going on here?

My first forays onto the Internet were online bulletin boards about computer games. They would have a lot of little forums about various aspects of the games, plus two off-topic forums. One for discussion of atheism vs. religion. And the other for everything else. This was a common structure for websites in those days. You had to do it, or the atheism vs. religion discussions would take over everything. At the time, this seemed perfectly normal.

In 2005, a college student made a webpage called The Church Of The Flying Spaghetti Monster. It was a joke based on the idea that there was no more scientific evidence for God or creationism than for belief in a flying spaghetti monster. The monster’s website received tens of millions of visitors, 60,000 emails (“about 95 percent” supportive), and was covered in The New York Times, The Washington Post, and The Daily Telegraph. Six publishing companies entered a bidding war for the rights to the spaghetti monster’s “gospel”, with the winner, Random House, offering an $80,000 advance. The book was published to massive fanfare, sold over 100,000 copies, and was translated into multiple languages. Putin’s thugs broke up a pro-Flying-Spaghetti-Monster demonstration in Russia. At the time, this seemed perfectly normal.

People compiled endless lists of arguments and counterarguments for or against atheism. The Talk.Origins newsgroup created a Dewey-Decimal-system-esque index of almost a thousand creationist arguments, from CA211.1 (“Karl Popper said that Darwinism is not testable”), to CD011.1 (“Variable C-14/C-12 ratio invalidates carbon dating”), through CH508 (“Chinese treasure ships show Noah’s Ark was feasible”) – and painstakingly debunked all of them; in case that wasn’t enough they linked 133 other sites doing similar work. Their arch-enemies, creationist site True.Origin, then went through and debunked all of their debunkings. Another atheist group created the Skeptics’ Annotated Bible, a version of the Bible highlighting everything bad or wrong in it. For example, if for some reason you need a hit job on the second chapter of the Book of Malachi in particular, you can look up its SAB page and find that Malachi 2:11 castigates Judah for “marrying the daughter of a strange god” (which is intolerant), Malachi 2:17 accuses the Israelites of “wearying the Lord with your words” (which is absurd since God cannot be wearied), and Malachi 2:3 says that God will spread dung upon the faces of unbelievers (which is gross). This last entry includes a link to a 2007 YouTube video “God Wants To Smear Dung On Your Face” with 21,947 views. And the video links to a store selling Malachi-2:3-says-God-wants-to-put-dung-on-your-face-related t-shirts, bumper stickers, keychains, and coffee mugs. At the time, this seemed perfectly normal.

Whatever media you liked, there were atheism-themed versions of it. Obviously if you liked webcomics you would never be able to finish all the different atheist options from Russell’s Teapot through Jesus & Mo through The Sheeples. If you liked TV, there were atheist TV shows like John Safran vs. God or The Atheist Experience. If you liked pithy quotes, you could read the top 10,000 atheist quotations in order of popularity. If you just liked discussion, you could go to the now-infamous r/atheism subreddit, which at the time was one of Reddit’s highest-ranked, beating topics like “news”, “humor”, and – somehow – “sex”. At the time, this seemed perfectly normal.

But these still don’t quite make my point, because the defining feature of this period wasn’t just that there were a lot of atheism-focused things. It was how the religious-vs-atheist conflict subtly bled into everything. Read enough old articles and blogs from this period and you’ll spot it. Some travel writer going on about how the boring small town he ended up in is probably full of fundies who hate gays and think the Earth is six thousand years old. Some logician giving an example of circular arguments: “I know the Bible is true because it says so in the Bible.” Some political writer saying a stupid policy is only to be expected in a country where X% of people still get their ethics from Bronze Age superstitions. At the time, this seemed perfectly normal.

It seemed perfectly normal because religion vs. atheism was the most important issue, maybe the only issue. How could you run a 21st century democracy with half the population believing in science and compassion, and the other half believing whatever they read in a 3000 year old book about a magic sky father? To truly understand the spirit of the time, you can’t just think of religion as evil. You have to think of it as the ur-evil, without which no other evil would exist. Homophobia? Only there because the Bible says to stone gay people. War? It’s all holy war of one sort or another, whether it’s Arabs vs. Israelis, Sunnis vs. Shias, or the Christian/Muslim “clash of civilizations”. Environmental devastation? Only there because religious people believe God elevated Adam over the animals and told him to exploit them for his own purposes. Poverty? Only because religious people believe in the prosperity gospel that says people get what they deserve. (...)

Between the first stirrings of internet atheism in 2000 and the beginning of the end in 2015, the percent of Americans identifying as Christian dropped about 10%; the percent identifying as no religion increased about the same amount. There are many different ways of looking at the data: self-reported affiliation, church attendance, even polls on whether religion can answer all of today’s problems, but they all show the same story of slow, steady decline.

By the numbers, the decline is slight: from 80% Christian / 15% atheist in 2000 to 70% Christian / 25% atheist in 2018. This could hide wider social changes. The number of gay people has barely changed since 2000, but society’s attitude toward them has totally transformed. Likewise, although religion has barely declined, and nonbelief barely risen, Christianity no longer seems to command quite the same level of political power, nor does atheism provoke quite as much revulsion.

But the sudden fall of New Atheism didn’t feel like a process of gradual social change and eventual acceptance. It felt like a movement certain of its own victory burning out spectacularly over the course of a few short years, followed by mysterious yet near-total contempt from the very people it thought it had convinced. (...)

To get an intuitive feel for the first category, look at the two sites involved. Talk Origins is almost perfectly preserved, a time capsule from an era when people really wanted to debate creationism. Internet Infidels has decayed a bit more, but even its ruins are impressive: a database of forty videotaped atheist-vs-theist debates, an online library of uploaded works by about two hundred atheist authors, and the obligatory list of several hundred Biblical contradictions. Who does that these days?

This exercise is gradually bringing back memories of just how intellectual the Internet was around the turn of the millennium. You would go to bulletin boards, have long and acrimonious debates over whether or not the Gospels were based on pagan myths. Then someone would check the Vast Apologetics Library tektonics.org and repost every one of their twenty-eight different articles about all the pagan myths the Gospels weren’t based on, from Adonis (“yet another unprofitable proposition for the copycat theorist”) to Zalmoxis (“there is no comparison, other than by illicit collapsing of terminology and by unsubstantiated speculation”). Both sides had these vast pre-built armories full of facts and arguments to go to.

At some point, in a way unrelated to the fall of New Atheism, the Internet stopped being like this. The topics that interest people today don’t get debated in the same way. People dunk on each other on Twitter, occasionally even have back-and-forth exchanges, but the average person doesn’t post long screeds and get equally long responses fisking each of their points. There’s less need for giant databases containing every fact you might need to win a particular argument, organized Dewey-Decimal-style by which argument you are trying to win. People just stopped caring.

I’m not sure why this happened. Maybe it took about ten years from the founding of the Internet for people to really internalize that online arguments didn’t change minds. The first Internet pioneers, starting their dial-up modems and running headfirst into people outside their filter bubbles, must have been so excited. For the first time in human history, people interested in debating a subject could do so 24-7 out in a joint salon-panopticon with all of the information of the human race at their fingertips. Bible Belt churchgoers for whom atheists had been an almost-fictional bogeyman, and New York atheists who thought of the religious as unsophisticated yokels, came together for the first time thinking “Convincing these people is going to be so easy”. The decade or so before they figured out that it wasn’t was a magical time, of which the great argument-arsenals of the past are almost the only remaining monument.

Or maybe it was something else. Maybe it was that getting online was actually pretty hard in those days, you needed to be technically inclined or attending a college or both, and so netizens were just more educated. Maybe the sort of people who interrupt any attempt at intellectual discussion with words like “rationalbro” or “mansplaining” or “well acktually” were still stuck in their caves, fruitlessly banging AOL CDs against rocks trying to create fire. Maybe it was something as simple as Wikipedia not existing yet, leaving the intellectual world in a sort of state of nature with every man for himself. Maybe it was just that the bulletin board format was more conducive to this than the later social media style fora.

Whatever it was, the decline of this culture started no later than 2000, and is reflected in the fate of argument-related search terms like “biblical contradictions” and “creationism”, and in the fading of the great argument-armories like Talk Origins and Infidels.

But the “atheism” search term keeps rising for another decade. What happened? (...)

I think it seamlessly merged into the modern social justice movement.

by Scott Alexander, Slate Star Codex |  Read more:
[ed. Wow. Some days I thank God/No God for the internet.]

Wednesday, October 30, 2019

How the Rich Get Away With It

Emmanuel Saez and Gabriel Zucman’s The Triumph of Injustice: How the Rich Dodge Taxes and How To Make Them Pay will make you angry. If you want to get a better understanding of what we on the left are actually talking about when we talk about wealth inequality, you should read it. It will show you how taxes work and who pays them, and you’ll come away thinking more clearly about concepts like “redistribution” and “progressive taxation.”

Saez and Zucman destroy some important myths about taxation. One of the helpful things they do is look at all taxes together, so that we can better understand what we’re paying in total. This makes it easy to understand why Mitt Romney’s comment that 47 percent of Americans pay no federal income tax was both technically true and also misleading and despicable. Everyone pays taxes. Saez and Zucman break it down so that we can see what kinds of taxes people pay in total, both federal and state:


The X-axis here shows percentiles: so P0-10 are the poorest ten percent, etc. On the right, Saez and Zucman have broken things down further so that you can see how this changes for the very richest. As you can see, rich people tend to pay a greater percentage of their income in income taxes while poor people pay more of it in payroll and consumption taxes. But here’s one of their striking findings: the rich do not pay a greater share of their income in taxes than the poor, and the very rich actually pay a smaller share than the poor do. Warren Buffett pays a smaller percentage of his income in taxes than I do, even though 25% of his income means much less to him than 25% of my income means to me. Saez and Zucman therefore argue that the U.S. tax system is quite obviously regressive, since billionaires pay lower tax rates than the working class.

In fact, drawing our attention back to the “working class” is one of the values of the Saez-Zucman book. They say that among the 122 million adults in the lower half of the American income distribution, the average annual income is $18,500, before taxes and transfers. People talk a lot in politics about the supposed “vanishing middle class,” but Saez and Zucman say that “the striking fact about the American economy is not that the middle class is vanishing. It’s how little income the working class makes.” (This, by the way, is why young people are not very open to free market arguments about how “capitalism creates prosperity.” Millions and millions of people can see with their own eyes that their pay is low. It’s worthless to tell someone earning $8 an hour—nearly 200,000 workers here in Louisiana would see a pay rise if the minimum wage were $8.50 instead of $7.25—that this system is fair and is creating prosperity. You can’t fool them. They can see your cars.)

A critical point Saez and Zucman make is about the impact of health insurance. They show very effectively why it’s so objectionable for people like Pete Buttigieg and Joe Biden to tell voters that “Medicare for All will raise your taxes.” Saez and Zucman encourage you to think of your health insurance as a “tax” you are already paying. You’re just paying it to a for-profit corporation instead of to the government.

Buttigieg and Biden are encouraging you to ignore the giant amount of money you spend on private insurance. Their framework is deeply conservative: money to the government is a bad tax, but money to corporations is good. Saez and Zucman, like Bernie Sanders, say what we need to do is look at the bottom line: if we switched health insurance from a tax you pay for private insurance to a tax you pay for socialized insurance, you would almost certainly save a lot of money without losing anything in services, because for-profit insurance is not designed to give you as much healthcare as you need, it’s designed to soak you for as much of your paycheck as possible. Saez and Zucman say that if we use an income tax to pay for healthcare, over 90% of workers who currently have employer-sponsored health insurance would come out ahead.

One valuable contribution Saez and Zucman make is in their extensive discussion of tax evasion/tax avoidance by the rich. We almost treat it as a joke: everyone knows that Google is not actually based in Bermuda, even though it shifted $23 billion there in 2017. Today, “close to 60% of the—large and rising—amount of profits made by US multinationals abroad are booked in low-tax countries,” primarily Ireland and Bermuda. Saez and Zucman say that we need to stop treating this as inevitable. In fact, it’s criminal. It’s simple tax evasion. Google knows that it should actually be paying tax on this money under U.S. law, so it is fraudulently pretending that it makes its money somewhere else. This shouldn’t be considered “using a loophole,” it should be considered “violating the law,” because it requires defrauding the government. “Ah, but we’re technically based in Palau.” Technically my ass. You’re not based there. It doesn’t count. If you ask Google why it has billions of dollars in Bermuda, there is only one answer: to avoid paying the taxes it would owe if it told the truth about where it made its money. The rich always try to draw a distinction between “tax avoidance” (taking advantage of legal loopholes) and tax evasion (“not paying the taxes you are legally required to pay”) but the whole point of stuffing wealth in Panama and the Cayman Islands and pretending you’re headquartered in a place where you just have a PO Box is “not paying the taxes you are legally required to pay.” Shell corporations are shells, i.e. fake, fraudulent, a lie. We need to stop treating Apple’s constant elaborate attempts to avoid contributing to the tax base of the countries it operates in as anything other than a crime.

Now, you might say “well, under existing U.S. law…” But let’s remember how law works: law is vague and slippery, that’s the entire reason why these companies are able to craft elaborate excuses for not paying their fair share. Law also empowers prosecutors and judges. In fact, the job of a good judge is to enforce justice and the principle of the law. If the attorneys for a company have found some badly worded part of the tax code they think they can take advantage of, but it would result in billions being kept out of the public treasury that belongs there, judges need to prioritize fairness over technicalities that create obvious lies and absurdities. If Barack Obama had gotten serious about cracking down on corporate tax evasion, if he had threatened some prosecutions, you might see some corporate behavior changing rapidly.

Making corporations pay the tax they’re already supposed to pay could bring in hundreds of billions of dollars. Remember what this means. (...)

Saez and Zucman propose a new wealth tax, taxing dividends and capital gains, a 30% effective corporate tax rate, and an increased income tax, which they say can be used to fully fund universal health care, universal public child care and education, free tuition at public universities, and the elimination of regressive sales taxes. (Their wealth tax is quite modest. People who are billionaires would still be billionaires, but multi-multi billionaires would just become multi-billionaires. Mark Zuckerberg, for instance, would be worth $21 billion instead of $61. Is having to live life with only 20 billion dollars unjust?) Of course, one of the main points raised in response to proposals like these is: it won’t raise the expected revenue, because the wealthy will just “find ways to dodge the tax.” One reason there is so much effective tax evasion by the rich is that their wealth is mobile: they can just move it to countries where they are taxed less. So you can impose a wealth tax, but just as with today’s corporate taxes, nobody will actually end up paying it.

One of the excellent contributions of Saez and Zucman’s book is that it successfully rebuts this sort of “futility” argument. Actually, there’s nothing to stop the United States from taxing money regardless of where people keep it. It’s just a matter of willingness to enforce the policy. There has been a widespread consensus that countries are trapped in a “race to the bottom” on taxes, because they now have to compete with each other to keep businesses in their country. Saez and Zucman point out that for the United States, there is no reason this need be the case. Amazon operates in the United States; we can tax it how we please and choose whether or not to recognize tax shelters as legitimate. Saez and Zucman say that countries have an interest in agreeing to levy uniform minimum tax rates (they suggest a 25% corporate tax) but there is much they can do unilaterally. As they say:

The choice is ours. The race to the bottom that rages today is a decision we’ve collectively made—perhaps not fully consciously or explicitly, certainly not a choice that was debated transparently and democratically, but a choice nonetheless… We could have chosen to prevent multinationals from booking profits in low-tax places, but we let them do it. We can make other choices, starting today.

So, no, don’t accept that the wealthy will “just evade” new taxes. They won’t if they’re prosecuted for it. This is a matter of political willpower. We can make the wealthy pay their fair share, if we’re willing to enforce the laws as they are written and say that the U.S. corporate tax rate is the U.S. corporate tax rate, period.

by Nathan J. Robinson, Current Affairs |  Read more:
Image: Emmanuel Saez and Gabriel Zucman
[ed. I'd think the one thing liberals and conservatives (those making less than the top 10%) could rally around would be fair taxation - a relatively straightforward proposition, unlike climate change, another issue that ought to be bipartisan. Where are the riots in the streets? (I'd join one). The only obstacle: a furious and predictable brainwashing campaign by corporations, media and politicians against the idea. People will someday get over that.]

The CBS Golf Crew On What You Don't See on TV

In a roundtable discussion with Jim Nantz, Nick Faldo, Peter Kostis, Gary McCord, Dottie Pepper, Ian Baker-Finch and Amanda Balionis, we covered a variety of topics from what fans don't see on TV to their favorite tournaments to how they handle feedback from fans and players. Here's a glimpse into the minds of some of golf's most recognizable voices and how this close-knit group interacts when the cameras aren't rolling. (...)

What are some challenges to broadcasting golf that the average fan sitting at home doesn’t realize?

Pepper: Everything. It’s a city that gets put up and broken down every week and relocated. The logistics of it are incredibly thick. If you just think about the towers that go up however many stories, they have to have an occupancy certificate. So this is so far in advance, but people think you just flick the light switch and it magically happens. There’s miles and miles of cable, and trucks, and it’s a city that moves every time we move. It’s not like you can just plug it in and press play.

Kostis: There’s also the fact that when they watch the show they see this seamless transition of Jim to Nick talking, to throwing it to 16 to this and that, and it’s very soothing. What they fail to realize is what’s going on in our earpieces. An announcer’s first rule is never, ever let what’s going in your ear come out your mouth. Ever. So while it seems like it’s a slow-moving sport, in our world, our lives exist in six-second increments, and we’ve got two people talking to us in our headphones while we’re trying to say something on air. From our perspective, it can be chaotic at times, even if it comes across as anything but.

Pepper: There are no timeouts. There’s a ball in the air all the time, even if we’re not on the air live. So that has to be covered as well.

How do you handle the various roles you play on the broadcast, and what’s your weekly prep schedule like?

Faldo: I still like to prepare myself as a golfer. I always go out and see the golf course, I still draw my own greens to make sure I know when someone misses the green long right, I can blurt out, “He’s dead,” and be 99-percent right. I like to chat to players to get an inside scoop on where they are, what they’re working on. You can know where their confidence level is. So I do all of that and then when I jump in the tower, Jim tees us up and I react to a picture.

Balionis: If it’s a course I haven’t been to, I try to come in on a Thursday as opposed to a Friday and walk the course to get a feel for it. It’s not like I’m doing any analysis, but it makes me feel more comfortable with my whereabouts. And then it’s a lot of watching the broadcast Thursday/Friday, reading every article possible, and really keeping an eye on social media because we’ve found players are less likely sometimes to talk to a reporter than they are to directly connect with fans. But I also go to the range to talk to coaches and caddies to make sure I can ask informed questions. You can never have too much information going in.

Kostis: Like Nick said, it really isn’t much different from playing in the sense you do all your practicing, you do all your work, you do all your drills, but when you get to the first tee, you leave all that behind and you just play. For Dottie and I, we often don’t even know which groups we’re going to be walking with. So you prepare for a whole bunch of stuff, but you can’t guarantee that you’re prepared for that group because you never saw them coming when we were in rehearsal.

Pepper: You have a bit of a safety net because you’ve prepared for so many, but if I had to put the math to it, I think I use about 3 percent of what I’ve prepared for on the air.

Nantz: Times have changed, though. Early in our careers, we didn’t have cable coverage that came right up to within 15 minutes or 30 minutes or whatever the window is until we take over the broadcast. It used to be, in Chirkinian’s world, really important to nail the rehearsal. We seriously rehearsed.

McCord: Yep, half hour. Solid.

Nantz: You’d go through golf action, cutting from hole to hole. He wanted everyone to get into a flow so we’d do the full broadcast like you were doing a rehearsal for a play. Then we’d turn around and go on the air. And if you screwed around, he would yell at you. So by the time you came on the air, it was just a continuum of what you’d done. You were duped into feeling you’d already been on the air. Obviously, those days are long gone now. It was different. (...)

How do you balance covering players while maintaining good relationships with them? Peter, how much blowback have you gotten for breaking down players' swings?

Kostis: Other than the fact Tiger wouldn’t talk to me for a year-and-a-half.

Faldo: Only a year-and-a-half?

Nantz: That’s pretty good.

Kostis: I don’t think I’m being critical when I discuss their swings. I’m being honest and evaluating. Nine times out of 10, if he’s hit the ball right, I’m figuring out what he did that caused that. There’s too many different golf swings out there to say this one is no good and that one is. I just describe what happened, and I’ve had way more players come up to me and say I was right than have complained about what I said. I’ve had less than a handful of players complain. (...)

Anyone else have run-ins with players?

Nantz: You know, they’re really decent guys. And I have the context of seeing a lot of different sports, and for golfers, it’s different for me. By and large, your stars of the sport are around for a long time, and you travel in the same circles. We see them at our hotels, we see them at restaurants, a great many of us, full disclosure, see them at outings together away from tournament weeks. You get to know the player, the family, their children. My wife gets to know some of the wives. And this is not an exception, we all travel in the same circles. You can’t put a wall up and say “I’ve got to cover them, I can’t really get close to them or get to know them.” Part of what makes it so good is we have really great relationships with the subjects we’re covering. And I think that comes out on the air every week.

What are your thoughts on the amount of coverage Tiger Woods gets during a telecast?

Kostis: It’s a cliche, but he doesn’t move the needle, he is the needle right now. You’ve got to cover him. He’s got a tremendous amount of stories to tell, too, with all the injuries and surgeries. Look, half the people who watch Tiger hate him and hope he shoots 90, and the other half hopes he wins by 15. So you’re always going to make someone unhappy with your coverage, too much or too little. Hey, he’s changed the game.

Faldo: He’s probably the most intriguing sportsman on the planet so you have to be there whatever he does. Everything is a story with him. People will start talking about Augusta and what he’ll be able to do there; he’s like no athlete ever in any sport.

by Alex Meyers, Golf Digest |  Read more:
Image: uncredited
[ed. I'd imagine it's basically similar for all sports broadcasts.]

Underwater


Rising Seas Will Erase More Cities by 2050, New Research Shows (NY Times)
Image: Population data from WorldPop and building footprints from OpenStreetMap

Tuesday, October 29, 2019

All Right Already

Precisely how full of shit is Mark Zuckerberg? Does he believe in anything beyond some vague coder’s logic of efficiency and scale? For years I’ve parsed his robotic utterances, and I’m still not sure. Based on his recent public appearances—a soft-focus Fox News interview by former Bush spokesperson Dana Perino; a dead-on-arrival bit of humanitarian philosophizing in front of a Georgetown audience that included Tiffany Trump; a typically awkward grilling before a Congressional subcommittee—it doesn’t seem that Zuckerberg knows either.

In public, Zuckerberg lobs useless bromides about free speech and connection while privately he dines with far-right media figures and frothing South Carolina senator Lindsey Graham and girds his team for battle against Elizabeth Warren. (While Zuckerberg has claimed he has dinners with “lots of people across the spectrum,” that spectrum doesn’t appear to include the left.) His company donates substantial amounts to conservative politicians, and conservative media does extraordinarily well on Facebook, with Ben Shapiro, Breitbart, and other right-wing luminaries often ranking among the top shared articles. (As of Friday, Breitbart is also a “trusted partner” of Facebook News, the company’s long-gestating feed of verified news stories.) A handful of top company executives, including Joel Kaplan, Facebook’s vice president of global public policy and a prominent Brett Kavanaugh supporter, are Republican political operatives. (Kaplan was even considered for a Trump cabinet position.) Recently, in order to investigate whether the platform exhibited bias against conservatives, Facebook hired former Republican senator Jon Kyl, whose report gently chided Facebook as insensitive to conservatives’ concerns while ultimately exonerating the company of anything more serious. When defending Libra, Facebook’s embattled cryptocurrency project, before Congress, Zuckerberg made a plea of economic nationalism that would appeal to any conservative, arguing that the United States must fend off a digital Chinese renminbi.

If it’s not clear already, then it must be said: Facebook is a right-wing company, hostage to conservative ideas about speech and economics, its fortunes tied to its allies in Republican politics, including the president, whose campaign spends millions on Facebook ads. Offering support to some of the worst figures in American political life, Facebook is as nihilistic as an oil company and just as willing to dump its pollution on all of us. That it has come to so thoroughly dominate our public sphere is a tragic indictment of American civic life and American techno-capitalism, which has confused the pitiless surveillance of today’s internet with utopian empowerment.

But an unfeeling right-wing ideology is not the image that Facebook wishes to project. Promoting a kind of beneficent nonpartisanship, Facebook’s latest line is that the company stands for free expression and “Voice” (the capitalization intentional, as is the lack of a definite article). That neither of these, in Zuckerberg’s perambulating formulations, is ever clearly defined doesn’t much matter. Zuckerberg—who, because he reportedly wanted to “maximize for sincerity,” wrote his Georgetown speech without the editorial help customarily afforded to tycoons—has never been one for intellectual rigor or originality, and the official Facebook origin story has long been malleable, eliding early accusations of betrayal and intellectual property theft. Now Zuckerberg claims that Facebook was created to hash out the political division and powerlessness felt over the launch of the Iraq War. Never mind that this is fiction as well as functionally impossible, given that Facebook’s predecessor, the sophomoric Hot-Or-Not rip-off known as FaceMash, was created after the war began. (Early versions of Facebook, too, did not feature walls, news feeds, and discussion groups as the site does now.)

Bearing the perverse logic known only to authoritarian state propaganda, Zuckerberg wishes us to believe that Facebook is a benevolent sovereign, a gateway to flourishing connectivity and public discourse, instead of an all-seeing surveillance apparatus that attempts to predict our needs, guide our behaviors, and monetize our dearest relationships and communications for obscene profits. It may not be the death knell to democracy that some claim, but it would be dubious to say that targeted advertising—and the coercion that attends it—has done anything to improve our lives. Nevertheless, Zuckerberg argues otherwise. Part and parcel of the new rhetoric is that Facebook’s technologically enabled users represent “a fifth estate,” a new member of the public sphere. “People no longer have to rely on traditional gatekeepers in politics or media to have their voices heard,” Zuckerberg told the Georgetown crowd.

This speech was a shabby defense of his own blinkered version of free expression, one which Facebook, with its billions of customers and quasi-nation-state status, has been deputized to guard. But while Zuckerberg speaks of the masses and of quashing division, he offers absolutely no specifics, no hint of authentic belief. “When people don’t feel like they can express themselves,” he warned vaguely, “they lose faith in democracy and are more likely to support populist parties that support specific policy outcomes over the health of our democratic and civil norms.”

Imagine people supporting “specific policy outcomes”—the horror!

What Zuckerberg ignores is that his form of rigorous nonpartisanship, his refusal to take any stand at all, is itself a political act, especially when Donald Trump is president. A specific type of right-wing populist movement is currently in power, with a specific, iniquitous ideology, and it got there in part by leveraging the Facebook platform, yet Zuckerberg refuses to give it a name. Nor will he even consent to fact-checking or blocking the deliberately misleading ads that Trump’s campaign and its allies regularly run on Facebook. This passivity is not a form of even-handedness or a devotion to free expression. He has taken a side, and it happens to be occupied by some of the most malign forces in American political life.

But the tech billionaire continues to live in cosseted denial, even as his airless remarks would have you believe that he has been reading from the right-wing playbook that says we are all just too sensitive. During this era of dissensus, he warned the Georgetown crowd, “a popular impulse is to pull back from free expression.” But the social networking boy-king wishes otherwise. “I believe we must continue to stand for free expression,” he said, even if “free expression has never been absolute.” That last part, of course, is the rub, and Facebook, which has no constitutional obligation to free speech, polices speech all the time, using cadres of traumatized and poorly paid contractors to remove pornography, violence, and other unacceptable content from the platform. (When questioned by Rep. Katie Porter last week whether he’d sit in as a content moderator, Zuckerberg demurred, saying he wasn’t sure it’d be a good use of his time.)

Facebook, in other words, has quite specific rules about speech. It just doesn’t happen to prohibit politicians paying to spread lies. Facebook says that political advertising is a fraction of its overall ad haul, but it has an interest in continuing to accept political ads—it helps the bottom line and keeps politicians dependent on the platform for advertising, communication, and constituent outreach. For many American politicians, there is no alternative to Facebook.

by Jacob Silverman, The Baffler | Read more:
Image: I.Robot
[ed. See also: Trump, Zuckerberg & Pals Are Breaking America (NY Times).]

Miss Alexa Stirling


[ed. The "Empress of Golf." Friend of Bobby Jones and one of the first great women golfers to play the game.]

Carole Bellaiche
via:

‘So Alien! So Other!'

How Western TV Gets Japanese Culture Wrong

It just feels so alien! So other! So extraordinarily strange!” So said Sue Perkins as she walked across Tokyo’s most crowded zebra crossing in the opening sequence of her travelogue. But shouldn’t this all be more familiar by now?

After all, BBC One’s Japan With Sue Perkins, which aired last month, was only the latest in a long run of British TV programmes inviting us to boggle at the east Asian country. These shows always feature a shot of the aforementioned Shibuya Crossing, items on AI and sumo wrestling, and a concerned interview with an undersexed young man (sometimes called otaku) and/or an overexcited young woman (something to do with kawaii). Only rarely do they offer fresh insight.

At least the upcoming Queer Eye: We’re in Japan! on Netflix and BBC Two’s drama Giri/Haji have obeyed the most basic rule of making British TV about Japan: don’t name it after that slightly racist 1980s hit about masturbation. That’s where Channel 5’s Justin Lee Collins: Turning Japanese went wrong. Or rather, it was the first of many wrong turns in which the since-disgraced comic’s 2011 travelogue erred.

TV’s other orientalist missteps are less daft, but more common. The premise of Queer Eye – five sophisticates make over a sad-sack – puts Karamo Brown in danger of doing an accidental “Lawrence of Arabia” when he arrives in Tokyo. That is, updating the colonial yarn of the westerner who is eventually accepted by an alien community and then asserts his inherent superiority by embodying the culture better than the locals. Happily, Queer Eye has addressed that risk by including Kiko Mizuhara, a Japanese-American model and Tokyo resident, as its guide.

The illuminating presence of Mizuhara is, however, unusual. “British television programmes have a tendency to represent Japanese people as stereotypically odd or kooky, without explaining the cultural context,” says Professor Perry R Hinton, an expert in intercultural communication.

This kind of othering reveals a narrow-mindedness. As Shinichi Adachi, the Japanese-British film-maker behind YouTube culinary series The Wagyu Show explains, Japanese culture isn’t particularly strange, just more accepting of humanity’s strangeness. “They respect people, even if they don’t understand them. People don’t really care if others have weird hobbies.” (...)

“To attract viewers, it’s understandable,” says Chiho Aikman, of the Daiwa Anglo-Japanese Foundation in London, “but the reality of Japanese culture is quite different.” She suggests architecture, regional cuisine (“We don’t just eat sushi!”) and the spread of hate speech as topics that don’t get enough attention. Instead, shows about hikikomori (modern-day hermits) and 40-year-old virgins with huge hentai (manga/anime porn) collections give the impression that subcultures typify an entire nation. In truth, such selections often say more about the audience than they do about the subject. So, if anyone comes out of this looking like socially inadequate, culturally insular, sex-obsessed pervs, well, it’s not the Japanese, is it?

by Ellen E Jones, The Guardian | Read more:
Image: Composite: ITV; Alamy Stock Photo; BBC
[ed. See also: Can We Ever Make It Suntory Time Again? (Longreads).]

The Everything Bubble

I’m not going to call it “tech,” because most of the startups in that so-called tech space aren’t tech companies. They’re companies in mundane businesses. And many of these companies aren’t startups anymore but mature companies that have been in business for over a decade and now have tens of thousands of employees. And then there is the entire shale-oil and gas space that has turned the US into the largest oil and gas producer in the world.

They all share two things in common:
  • One, they’re fabulously efficient, finely tuned, and endlessly perfected cash-burn machines.
  • And two, investors in these companies count on new cash from new investors to bail out and remunerate the existing investors.

This scheme is a fundamental part of the Everything Bubble, and there is a huge amount of money involved, and it has a big impact on the real economy in cities where this phenomenon has boomed, and everyone loves it, until these hoped-for new investors start seeing the scheme as what it really is, and they’re suddenly reluctant to get cleaned out, and they refuse to bail out and remunerate existing investors. And suddenly the money runs out. Then what?

Calling these companies “tech” is a misnomer, designed to create hype about them and drive up their “valuations.” They engage in mundane activities such as leasing office space, running taxi operations, doing meal delivery, producing and selling fake-meat hamburgers and hot dogs, providing banking and brokerage services, providing real estate services, and renting personal transportation equipment, such as e-bikes and e-scooters.

And let’s just put this out there right now: e-scooters appeared in public for the first time in the late 1800s, along with electric cars and trucks.

Then there is the endless series of new social media platforms, in addition to the old social media platforms of Facebook, Twitter, WhatsApp, Instagram, and the like, where people post photos, videos, promos, and messages about whatever.

That’s the “tech” sphere mostly today.

There are some tech startups in that group, however. And that technology is about spying on Americans and others and datamining their personal events, purchases, and thoughts to be used by advertisers, government intelligence agencies, law enforcement agencies, political parties and candidates running for office, and whoever is willing to pay for it.

And there is some real tech work going on in the automation scene, which includes self-driving vehicles, but most of this work isn’t done by startups these days – though some of it is – but by big companies such as Google, big chipmakers such as Nvidia, and just about all global automakers.

And there is a slew of big publicly-traded companies that stopped being startups years ago, that are burning huge amounts of cash to this day, and that need to constantly get even more cash from investors to have more fuel to burn. This includes Tesla, which succeeded in extracting another $2.7 billion in cash in early May from investors. Tesla duly rushed to burn this cash. And it includes Netflix, which extracted another $2.2 billion in April. From day one, these companies – just Netflix and Tesla – have burned tens of billions of dollars in cash and continue to do so, though they’re mature companies.

And it includes Uber which received another $8 billion from investors during its IPO in May, which it is now busy burning up in its cash-burn machine.

Don’t even get me started about the entire shale oil-and-gas space – though there is some real technology involved.

That entire space has burned a mountain of cash. Many of these shale oil companies are privately owned, including by private equity firms, and it’s hard to get cash-flow data on them. But for example, just to get a feel for the magnitude, by sorting through 29 publicly traded shale oil companies, the Institute for Energy Economics and Financial Analysis found that between 2010 through 2018, $181 billion in cash was burned. In 2019, they’re burning an additional pile of cash because oil prices have plunged again. And shale drilling started on a large scale before 2010. Plus, there’s the cash burned by the privately held companies. So, the total cash burned is likely in the neighborhood of several hundred billion bucks.

These companies and industries are “disruptive.” They claim that they change, and some of them actually do change, the way things used to be done.

But they have not figured out how to have a self-sustaining business model, or how to actually make money doing it. It’s easy to quote-unquote “disrupt” an industry if you can lose billions of dollars a year, if you keep getting funded by new investors, while everyone else in this industry would go bankrupt and disappear if they used a similar business model.

The only reason these companies have had such growth is because investors didn’t care about the business model, profits, and positive cash flows. (...)

This scheme is a key feature of the Everything Bubble. And it has had a large impact on the real economy.

When a company has a negative cash flow, which these companies all do, it means that they spend more investor money in the real economy than they take out. This acts like a massive stimulus of the local economy and even of the broader economy.

They’re paying wages, and these employees spend those wages on rent or house payments, on cars, electronics, food, craft beer, shoes, and they’re becoming bank customers and buy insurance and go to restaurants and pay taxes at every twist and turn. Few of those employees end up saving much. Most of them spend most of their wages, and this money goes to other companies and their employees, and it gets recycled over and over again, allowing for more hiring and more wages and more consumption to percolate through the economy.

Some of this money that is circulating comes from revenues, and is thereby extracted from the economy to be recycled. But the rest of the money – the amount that companies spend that exceeds their revenues, so the negative cash flow – comes from investors. And this is pure stimulus.

This is how the $10 billion that Softbank sank into WeWork was and will be recycled via salaries and office leases and purchases, and via local taxes, and purchases of furniture and decorations and rehabbing offices whereby the money was recycled by construction crews and electricians and flooring suppliers. Softbank’s money was routed via WeWork into the various local economies where WeWork is active. And it helped pump up commercial real estate prices and office rents along the way.

The shale oil-and-gas sector spends a lot of the negative cash flow in the oil patch, but also the locations where the equipment they buy is manufactured, such as sophisticated computer equipment, the latest drilling rigs, big generators, high-pressure pumps, and the like.

So an oil driller in Texas will transfer some investor money to manufacturers in distant cities. And the employees at these manufacturing plants buy trucks and boats and used cars, and they buy houses, and all kinds of stuff, and all of those hundreds of billions of dollars that investors plowed into the industry got transferred and recycled endlessly.

This is the multiplier effect of investors plowing their cash into money-losing negative cash-flow operations.

So what happens when investors figure out that this money is gone, and that any new money they might give these companies will also be gone?

by Wolf Richter, Wolfstreet |  Read more:

Monday, October 28, 2019

Taylor Swift

Why We Can't Tell the Truth About Aging

Reading through a recent spate of books that deal with aging, one might forget that, half a century ago, the elderly were, as V. S. Pritchett noted in his 1964 introduction to Muriel Spark’s novel “Memento Mori,” “the great suppressed and censored subject of contemporary society, the one we do not care to face.” Not only are we facing it today; we’re also putting the best face on it that we possibly can. Our senior years are evidently a time to celebrate ourselves and the wonderful things to come: travelling, volunteering, canoodling, acquiring new skills, and so on. No one, it seems, wants to disparage old age. Nora Ephron’s “I Feel Bad About My Neck” tries, but is too wittily mournful to have real angst. Instead, we get such cheerful tidings as Mary Pipher’s “Women Rowing North: Navigating Life’s Currents and Flourishing as We Age,” Marc E. Agronin’s “The End of Old Age: Living a Longer, More Purposeful Life,” Alan D. Castel’s “Better with Age: The Psychology of Successful Aging,” Ashton Applewhite’s “This Chair Rocks: A Manifesto Against Ageism,” and Carl Honoré’s “Bolder: Making the Most of Our Longer Lives”—five chatty accounts meant to reassure us that getting old just means that we have to work harder at staying young. (...)

These authors aren’t blind to the perils of aging; they just prefer to see the upside. All maintain that seniors are more comfortable in their own skins, experiencing, Applewhite says, “less social anxiety, and fewer social phobias.” There’s some evidence for this. The connection between happiness and aging—following the success of books like Jonathan Rauch’s “The Happiness Curve: Why Life Gets Better After 50” and John Leland’s “Happiness Is a Choice You Make: Lessons from a Year Among the Oldest Old,” both published last year—has very nearly come to be accepted as fact. According to a 2011 Gallup survey, happiness follows the U-shaped curve first proposed in a 2008 study by the economists David Blanchflower and Andrew Oswald. They found that people’s sense of well-being was highest in childhood and old age, with a perceptible dip around midlife.

Lately, however, the curve has invited skepticism. Apparently, its trajectory holds true mainly in countries where the median wage is high and people tend to live longer or, alternatively, where the poor feel resentment more keenly during middle age and don’t mind saying so. But there may be a simpler explanation: perhaps the people who participate in such surveys are those whose lives tend to follow the curve, while people who feel miserable at seventy or eighty, whose ennui is offset only by brooding over unrealized expectations, don’t even bother to open such questionnaires.

One strategy of these books is to emphasize that aging is natural and therefore good, an idea that harks back to Plato, who lived to be around eighty and thought philosophy best suited to men of more mature years (women, no matter their age, could not think metaphysically). His most famous student, Aristotle, had a different opinion; his “Ars Rhetorica” contains long passages denouncing old men as miserly, cowardly, cynical, loquacious, and temperamentally chilly. (Aristotle thought that the body lost heat as it aged.) These gruff views were formed during the first part of Aristotle’s life, and we don’t know if they changed before he died, at the age of sixty-two. The nature-is-always-right argument found its most eloquent spokesperson in the Roman statesman Cicero, who was sixty-two when he wrote “De Senectute,” liberally translated as “How to Grow Old,” a valiant performance that both John Adams (dead at ninety) and Benjamin Franklin (dead at eighty-four) thought highly of.

Montaigne took a more measured view. Writing around 1580, he considered the end of a long life to be “rare, extraordinary, and singular . . . ’tis the last and extremest sort of dying: and the more remote, the less to be hoped for.” Montaigne, who never reached sixty, might have changed his mind upon learning that, in the twenty-first century, people routinely live into their seventies and eighties. But I suspect that he’d still say, “Whoever saw old age, that did not applaud the past, and condemn the present times?” No happiness curve for him.

There is, of course, a chance that you may be happier at eighty than you were at twenty or forty, but you’re going to feel much worse. (...)

In short, the optimistic narrative of pro-aging writers doesn’t line up with the dark story told by the human body. But maybe that’s not the point. “There is only one solution if old age is not to be an absurd parody of our former life,” Simone de Beauvoir wrote in her expansive 1970 study “The Coming of Age,” “and that is to go on pursuing ends that give our existence a meaning—devotion to individuals, to groups, or to causes—social, political, intellectual, or creative work.” But such meaning is not easily gained. In 1975, Robert Neil Butler, who had previously coined the term “ageism,” published “Why Survive? Being Old in America,” a Pulitzer Prize-winning study of society’s dereliction toward the nation’s aging population. “For many elderly Americans old age is a tragedy, a period of quiet despair, deprivation, desolation and muted rage,” he concluded. (...)

A contented old age probably depends on what we were like before we became old. Vain, self-centered people will likely find aging less tolerable than those who seek meaning in life by helping others. And those fortunate enough to have lived a full and productive life may exit without undue regret. But if you’re someone who—oh, for the sake of argument—is unpleasantly surprised that people in their forties or fifties give you a seat on the bus, or that your doctors are forty years younger than you are, you just might resent time’s insistent drumbeat. Sure, there’s life in the old boy yet, but certain restrictions apply. The body—tired, aching, shrinking—now quite often embarrasses us. Many older men have to pee right after they pee, and many older women pee whenever they sneeze. Pipher and company might simply say “Gesundheit” and urge us on. Life, they insist, doesn’t necessarily get worse after seventy or eighty. But it does, you know. I don’t care how many seniors are loosening their bedsprings every night; something is missing.

It’s not just energy or sexual prowess but the thrill of anticipation. Even if you’re single, can you ever feel again the rush of excitement that comes with the first brush of the lips, the first moment when clothes drop to the floor? Who the hell wants to tear his or her clothes off at seventy-five? Now we dim the lights and fold our slacks and hope we don’t look too soft, too wrinkled, too old. Yes, mature love allows for physical imperfections, but wouldn’t we rather be desired for our beauty than forgiven for our flaws? These may seem like shallow regrets, and yet the loss of pleasure in one’s own body, the loss of pleasure in knowing that one’s body pleases others, is a real one.

I can already hear the objections: If my children are grown and happy; if my grandchildren light up when they see me; if I’m healthy and financially secure; if I’m reasonably satisfied with what I’ve accomplished; if I feel more comfortable now that I no longer have to prove myself—why, then, the loss of youth is a fair trade-off. Those are a lot of “if”s, but never mind. We should all make peace with aging. And so my hat is off to Dr. Oliver Sacks, who chose to regard old age as “a time of leisure and freedom, freed from the factitious urgencies of earlier days, free to explore whatever I wish, and to bind the thoughts and feelings of a lifetime together.” At eighty-two, he rediscovered the joy of gefilte fish, which, as he noted, would usher him out of life as it had ushered him into it.

“No wise man ever wished to be younger,” Swift asserted, never having met me. But this doesn’t mean that we have to see old age as something other than what it is. It may complete us, but in doing so it defeats us. “Life is slow dying,” Philip Larkin wrote before he stopped dying, at sixty-three—a truth that young people, who are too busy living, cavalierly ignore. Should it give them pause, they’ll discover that just about every book on the subject advocates a “positive” attitude toward aging in order to maintain a sense of satisfaction and to achieve a measure of wisdom. And yet it seems to me that a person can be both wise and unhappy, wise and regretful, and even wise and dubious about the wisdom of growing old.

by Arthur Krystal, New Yorker | Read more:
Image: Joost Swarte
[ed. See also: Put down the self-help books. Resilience is not a DIY endeavour (The Globe and Mail).]

Sunday, October 27, 2019

Hillary Clinton Spoils the Party

In the middle of October, Hillary Clinton managed to perform a minor political miracle. By baselessly speculating that Rep. Tulsi Gabbard, D-Hawaii, was a “favorite of the Russians” and preparing to run as an independent, she revived one of the more quixotic, eccentric and moribund campaigns of this election cycle while spoiling a primary that has proved shockingly substantive for a major party in the United States.

Gabbard “clapped back,” tweeting that Hillary was “the queen of warmongers, embodiment of corruption, and personification of the rot that has sickened the Democratic Party.” The congresswoman then proceeded to parlay Clinton’s political anti-genius for hauling feckless enemies out of political obscurity and crowning them with a notoriety they’d never be able to achieve on their own, into a brief turn in the media spotlight. Gabbard even went on the eponymous Fox News show “Hannity,” which makes Tucker Carlson’s white power hour look like the School of Athens, to complain about her treatment by a woman she blames for the last two decades of American wars, and to echo Republican procedural complaints about the ongoing impeachment inquiry into Donald Trump.

Clinton’s record as secretary of state speaks for itself. Her avid cheerleading for the disastrous “intervention” in Libya alone should be tattooed on her forehead and carved onto her eventual monument as a warning to the next hundred generations. But Gabbard’s own anti-war bona fides are themselves questionable, appealing to suckers and desperate contrarians alike. Scratch the surface and her foreign policy reveals itself as little more than pre-Bush realpolitik, with a Kissinger-ian preference for an archipelago of U.S.-aligned strongman governments to keep the dual threats of “Islamic Terrorism” and pan-Arabism in line. That foreign policy includes robust American expeditionary forces and drone warfare capabilities to prosecute the so-called War on Terror.

Clinton, meanwhile, seems constitutionally incapable of letting go of the bogus narrative that she lost to Donald Trump in 2016 not because she ran a lousy campaign that couldn’t turn out the vote in critical states, but because of Jill Stein’s third-party run, which garnered less than one third of the votes of fellow third-party candidate Gary Johnson. Combined with the still-nebulous conspiracy of “Russian interference,” of which Jill Stein is and is not a part, depending on the theorist, this keeps getting Clinton in trouble.

Much like Trump himself, the Clintons have long surrounded themselves with a coterie of slavish hangers-on, so it follows that there is no one left in their inner circle to say, Mrs. Clinton, maybe you’d better not. Ironically, in picking this fight with Gabbard, Hillary could be recapitulating the very error that she and her husband made in 2015, when Bill infamously encouraged Trump to run as a Republican spoiler, inadvertently elevating the one character Hillary was least equipped to confront and defeat.

Gabbard is no Trump: she lacks his odious magnetism, his greedy horniness for fame and notoriety. And unlike Trump, for whom a tacky, gross American ordinariness is a huge part of his successful public charm, she is a genuine eccentric—a bundle of personal and political contradictions totally out of keeping with the aggressive someone-oughtta-do-something resentments of the angry America that elected our current president.

But Hillary Clinton is no Hillary Clinton; not anymore. And on the vastly diminished stage of Twitter spats and cable media hits, she cannot hope to win here. Even were she to manage to make some political enemy look small, she can only look smaller, this figure who could have retired to a life of philanthropy, for which she would have been feted by cultural tastemakers, and out of which she might have actually engendered the very sentiment for which she is so obviously and ineffectively clamoring now: a sentimental, hypothetical nostalgia for that which might have been had she won.

This makes all the more grotesquely poignant the recent New York Times report that a “half-dozen Democratic donors” had gathered in Manhattan at the Whitby Hotel, “a celebration of contemporary art and design . . . on the doorstep of some of New York’s leading restaurants, galleries and museums, including MoMA.” (Including MoMA! Lord save us from the Manhattan provincialism of the stupidly rich.) These donors were getting together to ask themselves seemingly the only question their wealth and privilege will allow them about the Democratic primary: “Is there anyone else?” Could they, in other words, draft some other centrist sucker into the race: the already-abandoned Howard Schultz? Former Attorney General Eric Holder? The perennial will-he/won’t-he billionaire, Michael Bloomberg? Hillary?

by Jacob Bacharach, Truthdig | Read more:
Image: Julio Cortez/AP
[ed. : )  Intramurals.]

What the End of Modern Philosophy Would Look Like

Philosophy is, no doubt, the slowest-moving branch of human inquiry. The best proof of this can be seen in its peculiar use of the word “modern.”

When musicians speak of “modern jazz,” they are generally referring to the emergence of bop in the 1940s, as in the music of Charlie Parker and Dizzy Gillespie, and thus to a period of music that is still honored today even if supplanted by later developments. Modern architecture looks further back in time, to the early-twentieth-century rejection of the Beaux-Arts and Neoclassical styles. Although modern architecture still produces original variants even now, there would be an inherent challenge in arguing that “modernism” is still an accurate description of that field today. Modern art goes back even further, and is often traced to Édouard Manet’s canvases of 1863. Here, there is wider consensus that modernism is dead, replaced by a “postmodern” period identified as running from the 1960s through the present.

Is modern philosophy a thing of the past, in the way that one might argue for modern jazz, modern architecture, and modern painting? It may be surprising for readers to learn that “modern philosophy” is taken to begin with René Descartes (1596-1650), who abandoned traditional Aristotelianism in favor of regrounding the discipline in the immediate evidence of the thinking human subject: “I think, therefore I am.” A nuance is usually added: Descartes and his fellow seventeenth-century thinkers are often qualified as “early moderns,” while modern philosophy proper (the kind that is still practiced today) is defined temporally by the ideas of the Scottish skeptic David Hume (1711-1776) and the pivotal German thinker Immanuel Kant (1724-1804). No philosopher is likely to be taken seriously if they attempt to return to the period prior to Hume. A case in point is the twentieth-century English philosopher Alfred North Whitehead (1861-1947), who, while widely respected as a mathematician, is not universally recognized as part of the canon of great philosophers. The reason for this can be found largely in his rejection of the basic presuppositions of the Hume/Kant modernism. By and large, those who wish to be taken seriously in academic departments of philosophy need to accept these presuppositions.

This article takes a contrary view. As I see it, we are long overdue for a revision of what counts as an acceptable starting point for philosophy, and, therefore, for a decisive parting with modernism, a break that the other fields mentioned above made as much as half a century ago.

The reason for Descartes’ famous principle (“I think, therefore I am”) was his wish to ground philosophy in a rigorous starting point worthy of mathematics or the natural sciences. To do this, he undertook his famous method of radical doubt. Am I really so sure that I am not dreaming or deluded at this very moment? This question has become the basis of much that popular culture considers “philosophical”: in films ranging from The Wizard of Oz to Fight Club to The Matrix, the philosophically minded director is thought to be one who challenges our commonsense notion of reality, to the point that this has become a cliché: “it was all just a dream.” But the science fiction writer J.G. Ballard claimed the opposite: given that we are now surrounded by fictions in everyday life (propaganda, advertisements, conspiracy theories), the role of the artist has been reversed, and should now involve creating realities able to hold together the many fictions that perplex us.

In any case, Descartes asks us to imagine the worst-case scenario of an evil God who deceives us about absolutely everything, so that even my body and the facts of my everyday life are “fake news.” But even under this nightmare scenario, Descartes held, I must still be thinking in order to be deceived. If I did not at least exist as a thinker, the evil God would have had no one to deceive. Therefore, to repeat: “I think, therefore I am.” From there, Descartes provides further arguments that strike most contemporary readers as more naïve, and, therefore, as not fully modern (but just “early modern”) whereas Hume and Kant can still be called modern in the full-fledged sense. Namely, Descartes said that since I have an idea of perfection in my mind but no experience of anything perfect, the idea of perfection must have been put in my mind from the outside. This must have been done by a perfect God, who (since He is perfect) could not be deceiving us constantly, and, as a result, we cannot be experiencing a world of sheer illusion. We do make many errors, of course, but for Descartes these errors result only from an improper use of our reasoning powers. If we use our reason correctly, the truth is well within our grasp. Aside from his philosophical work, Descartes was also a pioneer in the use of mathematical reasoning in physics, and is a key figure in the scientific revolution no less than in modern philosophy itself.

Hume and Kant strike us as more fully modern because they do not resort to the argument from God, but base their philosophies on the evidence of immediate experience. For Hume, all we see in experience are perceptions, qualities, or ideas, not objects. The apple sitting before me need not be an independent object called an “apple,” since all we really experience are its shifting qualities: red, ripe, spherical, shiny, and so forth. In other words, the apple is just a “bundle of qualities” rather than an object, and by analogy, what I take to be my mind is really just a “bundle of perceptions” with no evidence of an identical “soul” or even “mind” that persists from birth to death and beyond. By the same token, we have no evidence of causal relations between things outside the mind. Although it seems as if every time we touch fire it burns and hurts us—meaning that we would do well never to put our hands in fire—there is no way to prove that the next time we touch fire it will not freeze us instead. All we have is experience. Enduring objects, our own enduring minds, and the apparent causal powers of both, are known only from experience and cannot be established as existing outside it.

Kant wrote that Hume’s writings awoke him “from his dogmatic slumber.” Hume, he held, was basically right, but with disastrous consequences for human knowledge. Accordingly, Kant created a famous divide between what he called the thing-in-itself (the real world outside us) and appearances (the world as it seems in experience). Although he agreed with Hume that we could never prove the existence of individual objects or souls, let alone causal relations outside human experience, he denied that knowledge of such things was impossible. Instead, we merely need to accept that no knowledge is possible of the thing-in-itself outside experience, but that knowledge can be attained of the unvarying structures of human experience itself. Examples of such structures include the fact that time moves in one direction from past to future, that space is experienced as having exactly three dimensions, and that human understanding functions according to twelve “categories” which are so basic that we cannot even imagine a kind of experience that would not follow them. For Kant, cause and effect is one of these categories: perhaps in the world of the thing-in-itself things happen randomly, or do not happen at all, but for human experience it is an absolute law that everything happens according to a law of cause and effect.

Now, philosophers since then have by no means accepted everything said by these two thinkers. Many do not accept Hume’s notion that mathematics deals with logic alone, and practically no one accepts Kant’s idea of a thing-in-itself beyond all human experience. Nonetheless, Hume and Kant still feel like contemporaries, because nearly every self-respecting philosopher accepts the two pillars of philosophy borrowed from them.

I call these pillars “Taxonomical.” Usually, taxonomy refers to the classification of the various types of anything that exists: the numerous kinds of birds, fish, or berries there are in the world. But Taxonomy is very simple in the case of modern philosophy, which recognizes two, and only two, kinds of things: (1) human thought, and (2) everything else. While this may look absurd at first glance (why should a fragile and recent animal species like human beings deserve to fill up half of philosophy?), we recall that Descartes gave what looked like a good reason for it. Namely, human thought is immediately given to me and must exist, since otherwise I cannot even be dreaming or hallucinating, while everything else (including God and the world) is only known derivatively by comparison with thought.

In short, Modern Taxonomy can be rewritten as: (1) that which is immediately given, and (2) that which is known only in mediated fashion. This gave rise to what might be called the intellectual division of labor in the modern world, in which philosophy is left to puzzle over the thought-world relation, while the relation between any two parts of the world itself is reserved for natural science.

This, in turn, leads us to another variant of Modern Taxonomy: (1) restriction of philosophy to the thought-world relation, and (2) science-worship when it comes to the relation between inanimate objects. By “science-worship” I mean something very specific: the notion that in treating topics such as causality, time, space, and individuals, philosophy must not stray far from the current state of “the best science we have.” We saw that Hume and Kant set the general horizon for what counts as respectable philosophizing today. The best way to look like a philosophical crackpot is to reject one or both pillars of Modern Taxonomy: if you say that we can think about the features belonging to all relations and not just the relation between thought and world, or if you claim that philosophy has something to add about “nature” that the natural sciences are not already saying better, you are likely to look suspicious, even if you are as great a speculative thinker as Whitehead himself.

In this spirit, and returning to the title of this article, what would the end of modern philosophy look like? It would amount to the end of the two pillars of Taxonomy: (1) philosophy is primarily concerned with the relation between world and thought, and (2) science-worship as concerns the relation between anything in the world outside thought. If we reject these two principles, have we not automatically become pre-modern crackpots? Not at all. Let’s take them one at a time.

by Graham Harman, The Philosophical Salon | Read more:
Image: uncredited

Saturday, October 26, 2019

When GoFundMe Gets Ugly

GoFundMe has become the largest crowdfunding platform in the world—50 million people gave more than $5 billion on the site through 2017, the last year fundraising totals were released. The company used to take 5 percent of each donation, but two years ago, when Facebook eliminated some charges for fundraisers, GoFundMe announced that it would do the same and just ask donors for tips. (Company officials wouldn’t say whether this model is profitable, though the site does have other sources of revenue, such as selling its online tools to nonprofits; the “grand ambition,” Solomon told me, is to have all internet charity, whether initiated by individuals or large organizations, flow through GoFundMe.)

The spectacularly fruitful GoFundMes are the ones that make the news—$24 million for Time’s Up, Hollywood’s legal-defense fund to fight sexual harassment; $7.8 million for the victims of the Pulse nightclub shooting in Orlando—but most efforts fizzle without coming close to their financial goals. Comparing the hits and misses reveals a lot about what matters most to us, our divisions and our connections, our generosity and our pettiness. And even the blockbuster successes, the stories that make the valedictory lap that is GoFundMe’s homepage, are much more complicated than any viral marketer would care to admit. (...)

GoFundMe campaigns that go viral tend to follow a template similar to Chauncy’s Chance: A relatively well-off person stumbles upon a downtrodden but deserving “other” and shares his or her story; good-hearted strangers are moved to donate a few dollars, and thus, in the relentlessly optimistic language of GoFundMe, “transform a life.” The call-and-response between the have-nots and the haves poignantly testifies to the holes in our safety net—and to the ways people have jerry-rigged community to fill them. In an era when membership in churches, labor unions, and other civic organizations has flatlined, GoFundMe offers a way to help and be helped by your figurative neighbor.

What doesn’t fit neatly into GoFundMe’s salvation narratives are the limits of private efforts like Matt White’s. GoFundMe campaigns blend the well-intentioned with the cringeworthy, and not infrequently bring to mind the “White Savior Industrial Complex”—the writer Teju Cole’s phrase for the way sentimental stories of uplift can hide underlying structural problems. “The White Savior Industrial Complex is not about justice,” Cole wrote in 2012. “It is about having a big emotional experience that validates privilege.” (...)

Search the GoFundMe site for cancer or bills or tuition or accident or operation and you’ll find pages of campaigns with a couple thousand, or a couple hundred, or zero dollars in contributions. While the platform can be a stopgap solution for families on the financial brink—one study estimated that it prevented about 500 bankruptcies from medical-related debt a year, the most common reason for bankruptcy in the U.S.—the average campaign earns less than $2,000 from a couple dozen donors; the majority don’t meet their stated goal. (...)

Part of the allure of GoFundMe is that it’s a meritocratic way to allocate resources—the wisdom of the crowd can identify and reward those who most need help. But researchers analyzing medical crowdfunding have concluded that one of the major factors in a campaign’s success is who you are—and who you know. Which sounds a lot like getting into Yale. Most donor pools are made up of friends, family, and acquaintances, giving an advantage to relatively affluent people with large, well-resourced networks. A recent Canadian study found that people crowdfunding for health reasons tend to live in high-income, high-education, and high-homeownership zip codes, as opposed to areas with greater need. As a result, the authors wrote, medical crowdfunding can “entrench or exacerbate socioeconomic inequality.” Solomon calls this “hogwash.” The researchers made assumptions based on “limited data sets,” he said, adding that GoFundMe could not give them better information, because of privacy concerns.

The Roys did not have a robust social-media network, or real-life one, for that matter. A native of England, Richard has no family nearby, and his wife’s only relatives are her aging mother and a sister. Laila had deleted her Facebook account not long after her twins’ premature birth, a tense, precarious time when vague well wishes and “likes” from acquaintances only made her feel more alone. Richard worked from home and had only a couple hundred Facebook friends. “Maybe if he worked for a large local company and I worked for a large local company, maybe if we were churchgoers—that’s another network. But I don’t go to church, and he doesn’t either,” Laila said. “I have been told explicitly by social workers that you should go to church just to network. But I try not to be a hypocrite.”

What’s wrong with you also influences whether you score big with medical crowdfunding, according to the University of Washington at Bothell medical anthropologist Nora Kenworthy and the media scholar Lauren Berliner, who have been studying the subject since 2013. Successful campaigns tend to focus on onetime fixes (a new prosthetic, say) rather than chronic, complicated diagnoses like Laila’s. Terminal cases and geriatric care are also tough to fundraise for, as are stigmatized conditions such as HIV and addiction- or obesity-related problems.

“It’s not difficult to imagine that people who are traditionally portrayed as more deserving, who benefit from the legacies of racial and social hierarchies in the U.S., are going to be seen as more legitimate and have better success,” Kenworthy told me. At the same time, the ubiquity of medical crowdfunding “normalizes” the idea that not everyone deserves health care just because they’re sick, she said. “It undermines the sense of a right to health care in the U.S. and replaces it with people competing for what are essentially scraps.”

by Rachel Monroe, The Atlantic | Read more:
Image: Akasha Rabut