Monday, December 17, 2018
The Orange Trapper
Fishing from a canoe in the Delaware River, I like to ship the paddle and let the boat go where it will. I watch the stony bottom, which flies by under fast-moving water. This is not Philadelphia. This is two hundred river miles above Philadelphia, where the stream-rounded rocks are so clear they look printed. Shoving the rocks, anadromous lampreys have built fortress nests, which are spread around the river like craters of the moon. Mesmerized, I watch the rocks go by. Fly-casting for bass, I see golf balls.
From shallows in the Merrimack in Manchester, New Hampshire, I once picked up a ball that bore the logo of a country club two and a half miles upstream. If the river brought it there, the ball had come through deep water and then over the Amoskeag Dam. In the Connecticut River above Northampton, Massachusetts, I’ve seen golf balls by the constellation—too deep to reach and too far from any upstream golf course for their presence to make sense unless people hit them off their lawns. Compulsions are easy to come by and hard to explain. Mine include watching for golf balls, which I do with acute attention, the fact notwithstanding that I quit golf cold when I was twenty-four. These days, my principal form of exercise is on a bicycle, which I ride a good bit upward of two thousand miles a year. I go past golf courses. How could I not? I live in New Jersey, which has a golf-course density of five per hundred square miles, or twice the G.C.D. of Florida, which has more golf courses than any other state. Moreover, the vast undeveloped forests of the southern part of New Jersey tend to shove the densities toward and beyond Princeton, in whose environs I ride my bike. The woods that lie between public roads and private fairways remind me of the dry terrain between a river levee and the river itself. In Louisiana along the Mississippi this isolated and often wooded space is known as the river batture. If you’re in Louisiana, you pronounce it “batcher.” From my bicycle in New Jersey, if I am passing a golf-links batture, my head is turned that way and my gaze runs through the woods until a white dot stops it, which is not an infrequent occurrence. I get off my bike and collect the ball.
The Delaware is less accommodating. When you are flying along on fast current, you don’t just get off your canoe and prop it up on a kickstand in order to pick up a golf ball. Over time, seeing so many golf balls in the river was such a threatening frustration that I had to do something about it. Research led to the telephone number of a company then in Michigan. A real person answered and was even more than real. She understood me. She knew what I was asking and did not call 911. Instead, she had questions of her own: What was the speed of the current? What was the depth of the river? Was the bottom freestone? Sand? Clay? Silt? After completing the interview, she said, “You want the Orange Trapper.”
“The Orange Trapper?”
“The Orange Trapper.”
It came in various lengths. I said I thought the nine-footer would do. The nine might be stiffer in the current than the twelve, the fifteen, the eighteen, the twenty-one, or the twenty-four. Besides, nine (actually, 9.6) just felt right. It was the length of my fly rods.
What came in the mail was only twenty-one inches long, with an orange head, a black grip, and a shaft that consisted of ten concentric stainless tubes with a maximum diameter of five-eighths of an inch. You could conduct an orchestra with it. It was beautiful. The orange head was a band of industrial-strength plastic, as obovate as a pear and slightly wider than a golf ball. A depression in its inside top was there to secure one side of a ball, but the genius of the device was in a working part, a bevelled “flipper” that came up through the throat and would waggle into place on the other side of the ball. The Orange Trapper worked two ways. It had no upside or downside. You could surround a golf ball with either side, then lift it up as if you were playing lacrosse with no strings. You could turn the head over—a hundred and eighty degrees—and the ball would generally stay put. But flip the thing over once more and the ball would always roll free. Made by JTD Enterprises, it could have been designed by Apple.
Even so, finesse was required to trap a ball in shallow current. After seeing one, and swinging around, and going hard upstream, and shipping the paddle, you had about five seconds to place the head of the Trapper over the ball. I missed as often as not. It wasn’t the Trapper’s fault. My average would have been higher chasing hummingbirds with a butterfly net. The river is an almost endless sequence of shallows, riffles, rapids, and slow pools. For the real action, I went below some white water into a long deep pool with Don Schlaefer in his johnboat. Don is a fishing pal. He plays golf. He had no interest in the balls in the river, but he could put his boat right over them and hold it there while I fished with the Orange Trapper. I picked up a dozen golf balls in half an hour.
Marvelling at the craziness, Don said, “Why are you doing this? They’re only golf balls. Golf balls are cheap.”
I said, “Money has nothing to do with it.”
A Titleist Pro V1, currently the Prada golf ball, costs four or five dollars on the Internet and more in a pro shop. If a person of Scottish blood says money has nothing to do with that, he is really around the corner. True, I don’t find balls of such quality often in the river. But they’re a high percentage of what I pick up in the roadside woods of New Jersey. Titleist makes about a million balls a day. In the United States, for all qualities and brands, a present estimate is that golfers lose three hundred million golf balls a year.
Why? Ask George Hackl, who grew up playing golf on courses around Princeton, now lives in central New Hampshire, and is a member of Bald Peak, Yeamans Hall, Pine Valley, and the Royal and Ancient Golf Club of St. Andrews.
Hackl: “It is an indication of the vast disparity of wealth in this country that golfers in some places can hit seven-dollar balls into woods and thickets and not even bother to look for them.”
There is less to it than that. Golfers have egos in the surgeon range. They hit a drive, miss the fairway, and go looking for the ball thirty yards past where it landed. When their next drive goes into timber and sounds like a woodpecker in the trees, there is no way to know the vector of the carom, so they drop another ball and play on. It must be said, in their defense, that various pressures concatenate and force them to keep moving, no matter the cost in golf balls. The foursome behind is impatient. A major issue is how long it takes to play. It is infra dig to cause “undue delay.” In the Rules of Golf, there’s a five-minute time limit on looking for lost balls. The rule may be unknown to some golfers and by others ignored, but five minutes or less is what most golfers give to finding lost balls. The rest are mine.

You get off your bike, pick up a ball, and sometimes are able to identify the species it hit. Pine pitch makes a clear impression. Tulip poplars tend to smear. An oak or hickory leaves a signature writ small and simple. A maple does not leave maple syrup. At your kitchen sink, you can tell how long a ball sat on the ground by the length of time required to take the ground off the ball.
With felt-tip pens and indelible ink, golfers decorate balls to individualize them beyond the markings of the manufacturer. If more than one player is using a Callaway 3 HX hot bite or a Pinnacle 4 gold FX long—or, far more commonly, there’s a coincidence of Titleists—you need your own pine tree. Some golfers’ graffiti are so elaborate that they resemble spiderwebs festooned with Christmas ornaments. Golfers also draw straight, longitudinal lines that serve as gunsights in putting. It is possible to mark a ball with a ballpoint pen, but some golfers actually believe that the weight of ballpoint ink, altering the pattern of flight, will affect the precision of their shots. It is tempting to say that the prevalence of this belief is in direct proportion to handicap.
In the frenzy of marketing, golf balls are sold in such complex variety that golf’s pro shops are not far behind fishing’s fly shops, where line weights and rod weights and tip flex and reel seats are sold in so many forms for so many different capabilities and so many different situations that people’s basements are forested with tackle. And, as with fishing equipment, the spectrum of subtlety in golf balls includes price. The difference is not among manufacturers but within the product lines of manufacturers. You can buy a dozen Titleist DT SoLos for less than twenty dollars. I know a golfer who has spoken as follows about looking for a wayward ball: “If you don’t find yours but find another of the same quality, you’re even. If you find a ball that’s not up to your standards, you leave it there for a lower class of golfer.” How he happened to get into the woods in the first place was not a topic he addressed. He reminded me of a pirate in the Guayas River near Guayaquil. With six other pirates, he came off a needle boat and over the stern of a Lykes Brothers merchant ship. They were armed mainly with knives. One of them held a hacksaw blade at a sailor’s throat while others tied him to a king post. A pirate pointed at the sailor’s watch, and said, “Give me.” The sailor handed over the watch. The pirate looked at it and gave it back.
by John McPhee, New Yorker
Image: Phillip Toledano

Are You Ready for the Financial Crisis of 2019?
For moneyed Americans, most of the past year has felt like 1929 all over again — the fun, bathtub-gin-quaffing, rich-white-people-doing-the-Charleston early part of 1929, not the grim couple of months after the stock market crashed.
After a decade-long stock market party, which saw the stocks of the S. & P. 500 index create some $17 trillion in new wealth, the rich indulged in $1,210 cocktails at the Four Seasons hotel’s Ty Bar in New York, in $325,000 Rolls-Royce Cullinan sport-utility vehicles in S.U.V.-loving Houston and in nine-figure crash pads like Aaron Spelling’s 56,000-square-foot mansion in Los Angeles (currently on the market for $175 million, more than double what it fetched just five years ago).
Will it last? Who knows? But in recent months, the anxiety that we could be in for a replay of 1929 — or 1987, or 2000, or 2008 — has become palpable not just for the Aspen set, but for any American with a 401(k).
Overall, stocks are down 1.5 percent this year, after hitting dizzying heights in early October. Hedge funds are having their worst year since the 2008 crisis. And household debt recently hit another record high of $13.5 trillion — up $837 billion from the previous peak, which preceded the Great Recession.
After a decade of low interest rates that fueled a massive run-up in stocks, real estate and other assets, financial Cassandras are not hard to find. Paul Tudor Jones, the billionaire investor, recently posited that we are likely in a “global debt bubble,” and Jim Rogers, the influential fund manager and commentator, has forewarned of a crash that will be “the biggest in my lifetime” (he is 76).
What might prove the pinprick to the “everything bubble,” as doomers like to call it? Could be anything. Could be nothing. Only time will tell if the everything bubble is a bubble at all. But, just a decade after the last financial crisis, here are five popular doom-and-gloom scenarios.
Happy holidays!
5. Student Debt
Remember how the 2008 crisis was triggered by a bunch of people, who probably should not have been lent giant amounts of money in the first place, not making their mortgage payments? That was just the precipitating factor, but go back and stream “The Big Short” if none of this rings a bell.
Then fast-forward to 2018, where bad mortgages may not be the problem. Consider, instead, the mountain of student debt out there, which is basically a $1.5 trillion bet that a generation of underemployed young people will ever be able to pay off a hundred grand in tuition loans in an economy where even hedge funders are getting creamed. Already, a lot of them aren’t paying and can’t pay. In a climate where “there are massive amounts of unaffordable loans being made to people who can’t pay them,” as Sheila Bair, the former head of the Federal Deposit Insurance Corporation, described the student debt problem in Barron’s earlier this year, nearly 20 percent of those loans are already delinquent or in default. That number could balloon to 40 percent by 2023, according to a report earlier this year by the Brookings Institution.
Now, lots of that debt is owed to the federal government, so it’s unlikely to poison the banking system, as mortgages did a decade ago. But this burden of debt is already beginning to wipe out the next generation of home buyers and auto purchasers. As a result, a generation of well-educated and underemployed millennials, told to value a college education above all, could drag down an economy that never seemed to want them in the first place.
4. China
You know who has racked up even more debt than hopeful 20-something ceramics-studies grads in the United States? Here’s a hint: It’s a not-exactly-Communist country in Asia that has been on such a wild debt-fueled building spree that it somehow used more cement in just three years earlier this decade than the United States did in the entire 20th century. Think about that. Now think about it some more. Over the past decade, China devoted mountains of cash to build airports, factories and entire would-be cities — now known as “ghost” cities, since the cities are populated by largely empty skyscrapers and apartment towers — all in the name of economic growth. And grow it did.
The result is a country with a supersized population (1.4 billion people) and supersized debt. Where things go from here is anyone’s guess. Optimists might argue that those trillions bought a 21st-century Asian equivalent of the American dream. Pessimists describe that massive debt as a “mountain,” a “horror movie,” a “bomb” and a “treadmill to hell,” all in the same Bloomberg article. One thing seems certain, though: If the so-called “debt bomb” in China explodes, it’s likely to sprinkle the global economy with ash. And with President Trump teasing a trade war that already seems to be threatening China’s massive, export-based economy, we may have our answer soon.
3. The End of Easy Money
Say you lived in the suburbs, and one day your neighbor suddenly pulled up her driveway in a new $75,000 Cadillac Escalade. A week later, she was tugging a new speedboat. A few weeks after that, it was Jet Skis. You might either think, “Wow, she’s rolling in it,” or “Golly, she hates glaciers.” (Hatred of glaciers may prove, actually, to be the real spark of the financial end times.) But what if it turned out that she bought all of those carbon-dioxide-spewing toys on credit, at crazy-low interest rates? And what if those rates suddenly started to spike? The result would likely be good news for the polar ice caps and bad news for her, when the repo man (not to cave to gender stereotypes about repo-persons) came calling.
O.K., overstretched metaphor alert: The “neighbor” is us. Ever since the Federal Reserve started printing money in the name of “quantitative easing” to pull us out of the last financial crisis, money has been cheap, and seemingly any American with a pulse and a credit line has been able to fake “rich” by bingeing on all sorts of indulgences — real estate (despite tighter lending standards), fancy watches and awesome gaming systems, to say nothing of the debt that corporations were racking up, which some market analysts think might be the biggest threat of all.
by Alex Williams, NY Times
Image: The New York Times; Spencer Platt/Getty Images

The Annals of Flannel
Three years ago, Bayard Winthrop, the chief executive and founder of the clothing brand American Giant, started thinking about a flannel shirt he wore as a kid in the 1970s. It was blue plaid and bought for him by his grandmother, probably at Caldor, a discount department store popular in the northeast back then. The flannel was one of the first pieces of clothing Mr. Winthrop owned that suggested a personality.
“I thought it looked great,” he said, “and I thought it said something about me. That I was cool and physical and capable and outdoorsy.”
Since 2011 American Giant, or AG, has mass-produced everyday sportswear for men and women, like the Lee jeans or Russell sweatshirts once sold in stores like Caldor — from the ginned cotton to the cutting and sewing — entirely in the U.S. Mr. Winthrop, a former financier who had run a snowshoe firm, made it the company’s mission to, in his words, “bring back ingenuity and optimism to the towns that make things.” He’s been very successful, especially with a full-zip sweatshirt Slate called “the greatest hoodie ever made.” AG has introduced denim, leggings and socks, among other products.
But Mr. Winthrop’s madeleine of a garment proved elusive. “We kept asking around and hearing, ‘Not flannel. You can do all these other things here, maybe. Flannel is gone,’” he said.
L.L. Bean, Woolrich, Ralph Lauren and Pendleton all made their reputations on rugged, cozy flannel shirts, but not one of those brands makes them domestically today. In fact, “flannel hasn’t been made in America for decades,” said Nate Herman, an executive for the American Apparel & Footwear Association, a Washington, D.C.-based trade group. (...)
Bringing its manufacture back to America, Mr. Winthrop thought, could be deeply symbolic — both of the capability of U.S. manufacturing and of the need for big fashion brands to invest here again. It was a quixotic artisanal project, perhaps, but one with potentially high business stakes.
Flannel 101
“Forty years ago, we were able to make great shirts here, great jeans here, sold at a price that made sense to mainstream consumers,” Mr. Winthrop said at the outset of his project. “We’ve lost that capability in 40 years? We can’t make a flannel shirt in America? I’m not going to accept that answer.”
“Made in America” has become a marketing catchphrase espoused by both Brooklyn $400 selvage denim enthusiasts and Trump isolationists. And brands like American Apparel have led a renaissance of sorts in domestic manufacturing. But producing clothes in the U.S. today is exceedingly complicated. Over the last 30 years, the textile industry has been decimated by outsourcing and unfavorable trade deals, shedding 1.4 million jobs in the process, said Augustine Tantillo, president of the National Council of Textile Organizations.
Communities that produced clothes for generations, like Fort Payne, Ala., the former sock capital of the world, were mortally wounded when mills closed. Sometimes the expertise or work force have dissipated. Sometimes it’s the machinery, the looms, that have gone overseas.
Each time AG develops a new product, Mr. Winthrop must patch together its supply chain from what remains. To help him navigate the process, he relies on “old dogs in the industry,” he said, though AG is based in San Francisco and runs like a tech start-up, with sales almost entirely online.
For flannel, he called James McKinnon.
At 50, Mr. McKinnon is not that old (Mr. Winthrop is 49). But he is the third McKinnon to run Cotswold Industries, the textile manufacturer his grandfather started in 1954. Cotswold made the woven fabric for headliners inside Ford cars. Later, the firm manufactured pocket linings for Lee, Wrangler and Levi jeans. Cotswold still handles pocketing business for many U.S. brands, part of a diverse portfolio that includes making fabrics for culinary apparel. The fabrics are woven at its mill in Central, S.C.
Mr. Winthrop called Mr. McKinnon at his office in midtown Manhattan and ran through the list of questions. Why is flannel gone? What would it take to bring it back? How would you do it?
Mr. Winthrop specified that he wanted to make yarn-dyed flannel, not flannel in which the pattern is simply printed onto the fabric. (...)
Shirting in general is more complicated than a T-shirt or fleece because it’s woven rather than knit. Wovens typically require more needlework, which means higher labor costs, which means that they have been outsourced more aggressively than knits or denim. And a flannel is a very complicated woven shirt.
For a T-shirt, raw material is fed into a circular knitting machine and a roll of fabric is cranked out and dyed red or blue or purple. But flannel requires the dyeing of each individual yarn, which is what gives it the patterned look of, say, Buffalo plaid.
Those dyed yarns are put on a weaving machine, or loom. There are lengthwise, or warp, yarns and crosswise, or weft, yarns. To get the famous red and black squares even and blended, the warping must be done precisely right. And the more intricate the pattern or numerous the colors, the more complex the warping and the harder the weave.
As anyone who loves one knows, flannel shirts are soft, which is achieved through a finishing process called napping.
“Flannel, of all the things in your wardrobe, is the one thing that you know intuitively if you like or not,” Mr. Winthrop said. “It has to feel right in your hand.”
“I thought it looked great,” he said, “and I thought it said something about me. That I was cool and physical and capable and outdoorsy.”

But Mr. Winthrop’s madeleine of a garment proved elusive. “We kept asking around and hearing, ‘Not flannel. You can do all these other things here, maybe. Flannel is gone,’” he said.
L.L. Bean, Woolrich, Ralph Lauren and Pendleton all made their reputations on rugged, cozy flannel shirts, but not one of those brands makes them domestically today. In fact, “flannel hasn’t been made in America for decades,” said Nate Herman, an executive for the American Apparel & Footwear Association, a Washington, D.C.-based trade group. (...)
Bringing its manufacture back to America, Mr. Winthrop thought, could be deeply symbolic, both of the capability of U.S. manufacturing and of the need for big fashion brands to invest here again. It was a quixotic artisanal project, perhaps, but one with potentially high business stakes.
Flannel 101
“Forty years ago, we were able to make great shirts here, great jeans here, sold at a price that made sense to mainstream consumers,” Mr. Winthrop said at the outset of his project. “We’ve lost that capability in 40 years? We can’t make a flannel shirt in America? I’m not going to accept that answer.”
“Made in America” has become a marketing catchphrase espoused both by Brooklyn enthusiasts of $400 selvage denim and by Trump isolationists. And brands like American Apparel have led a renaissance of sorts in domestic manufacturing. But producing clothes in the U.S. today is exceedingly complicated. Over the last 30 years, the textile industry has been decimated by outsourcing and unfavorable trade deals, shedding 1.4 million jobs in the process, said Augustine Tantillo, president of the National Council of Textile Organizations.
Communities that produced clothes for generations, like Fort Payne, Ala., the former sock capital of the world, were mortally wounded when mills closed. Sometimes the expertise or the work force has dissipated. Sometimes it’s the machinery, the looms, that have gone overseas.
Each time AG develops a new product, Mr. Winthrop must patch together its supply chain from what remains. To help him navigate the process, he relies on “old dogs in the industry,” he said, though AG is based in San Francisco and runs like a tech start-up, with sales almost entirely online.
For flannel, he called James McKinnon.
At 50, Mr. McKinnon is not that old (Mr. Winthrop is 49). But he is the third McKinnon to run Cotswold Industries, the textile manufacturer his grandfather started in 1954. Cotswold made the woven fabric for headliners inside Ford cars. Later, the firm manufactured pocket linings for Lee, Wrangler and Levi jeans. Cotswold still handles pocketing business for many U.S. brands, part of a diverse portfolio that includes making fabrics for culinary apparel. The fabrics are woven at its mill in Central, S.C.
Mr. Winthrop called Mr. McKinnon at his office in midtown Manhattan and ran through the list of questions. Why is flannel gone? What would it take to bring it back? How would you do it?
Mr. Winthrop specified that he wanted to make yarn-dyed flannel, not flannel in which the pattern is simply printed onto the fabric. (...)
Shirting in general is more complicated than a T-shirt or fleece because it’s woven rather than knit. Wovens typically require more needlework, which means higher labor costs, which means that they have been outsourced more aggressively than knits or denim. And a flannel is a very complicated woven shirt.
For a T-shirt, raw material is fed into a circular knitting machine and a roll of fabric is cranked out and dyed red or blue or purple. But flannel requires the dyeing of each individual yarn, which is what gives it the patterned look of, say, Buffalo plaid.
Those dyed yarns are put on a weaving machine, or loom. There are lengthwise, or warp, yarns and crosswise, or weft, yarns. To get the famous red and black squares even and blended, the warping must be done precisely right. And the more intricate the pattern or numerous the colors, the more complex the warping and the harder the weave.
As anyone who loves one knows, flannel shirts are soft, which is achieved through a finishing process called napping.
“Flannel, of all the things in your wardrobe, is the one thing that you know intuitively if you like or not,” Mr. Winthrop said. “It has to feel right in your hand.”
by Steven Kurutz, NY Times | Read more:
Image: Travis Dove
Sunday, December 16, 2018
Giant Steps
via: Vox and YouTube
[ed. I don't fully get it either. But it does give you a sense of where genius finds expression.]
Am I ‘Old’?
A few years ago at a college reunion, I listened transfixed as the silver-haired philanthropist David Rubenstein urged us “to accelerate” as we entered the last chapters of our lives. Pick up the pace? So many of my contemporaries were stopping — if not stooping — to smell the roses.
With his admonition in mind, I recently spoke with Mr. Rubenstein, now 69, and asked him if he considers himself old. “Sixty-nine seems like a teenager to me,” he replied. Coincidentally, just a few days earlier, a 68-year-old poet I know, in between surgeries to help her mend after a fall, told me point blank, “I am an old lady now.”
What makes one sexagenarian identify as old when another doesn’t? And what is “old,” anyway?
Having turned 61, I find this question very much on my mind — and it’s likely to be on the minds of the 70 million baby boomers who are 50-plus (yes, even the tail end of the boom is now “middle-aged” or “old”). Dinner conversations are now hyper-focused on how to stay young, or at least delay old.
Certainly the definition of “old” is changing, as life spans have grown longer. “Someone who is 60 years old today is middle-aged,” said Sergei Scherbov, the lead researcher of a multiyear study on aging. When does old begin? I asked.
Dr. Scherbov says for Americans, it’s roughly 70 to 71 for men and 73 to 74 for women, though, as he has written, “your true age is not just the number of years you have lived.”
“The main idea of the project,” he told me, “is that an old age threshold should not be fixed but depend on the characteristics of people.” Factors such as life expectancy, personal health, cognitive function and disability rates all play a role, he said, and today’s 65-year-old is more like a 55-year-old from 45 years ago.
As with beauty, the meaning of “old” also depends on the person you ask. Millennials, now in their 20s and 30s, say that old starts at 59, according to a 2017 study by U.S. Trust. Gen Xers, now in their 40s — and no doubt with a new appreciation for just how close they are to entering their 50s — say 65 is the onset of old. Boomers and the Greatest Generation pegged 73 as the beginning of old. Clearly, much depends on the perspective of who’s being asked to define “old.”
To that very point, I was curious to see how my friends who are 50-plus defined old — and asked them on Facebook. Among the dozens of responses, two made me smile: “Old is my current age + 4.” And this: “Tomorrow. Always tomorrow. Never today.” Perhaps the one most difficult to hear: “When you get called ‘ma’am’ instead of ‘miss.’” (That will never happen to me, although I’m constantly called “sir” these days.)
Other friends pointed to various physical milestones as the visible line in the sand. A colleague posted: “When you can’t jog a 15-minute mile.” Another friend said, “When I have to stop playing tennis.” Others ominously noted cognitive benchmarks: “When you stop being interested in new information and experiences.” Many focused on “memory issues” as defining the onset of old.
The bottom line: “old” is subjective, a moving target.
That’s why David Rubenstein, 69, the board chairman of both the Kennedy Center for the Performing Arts and the Smithsonian Institution and co-founder and co-executive chairman of the Carlyle Group, can claim he’s not old, while my poet friend, a year younger than he is, refers to herself as old. Recently, because of problems getting around, she had to bring in a home health aide for assistance, only deepening her increased dependence on others. Indeed, as Dr. Scherbov discovered, loss of independence and mobility are among the characteristics that define “old.”
For his book “Healthy Aging,” Dr. Andrew Weil, now 76, asked people to list attributes associated with “old.” Among those most frequently cited: ancient, antiquated, dated, dried up, frail, passé, shriveled, used up, useless and withered, worthless and wrinkled. Nice stereotypes, huh?
“Negative ageist attitudes toward older people are widespread,” a 2015 analysis by the World Health Organization confirmed in a survey. Nearly two-thirds of the respondents, 83,000 people of all ages in 57 countries, did not respect older people, with the lowest levels of respect reported in high-income countries like the United States. Even more damning: These views adversely “affect older people’s physical and mental health.”
by Steven Petrow, NY Times | Read more:
Image: Stuart Bradford
[ed. I still feel like a teenager (although my body tells me otherwise). In Hawaii, younger people often address older folks as "Uncle" (older women as "Auntie"). But when they start calling you "Papa san," well, you know you're probably getting pretty old. See also: Retiring Retirement (Nautilus).]
War on Cash: State and City Governments Push Back
In a Q and A with New York City council member Richie J. Torres, Grub Street notes that in addition to New Jersey, politicians in some eastern cities – including New York City, Philadelphia, and Washington, D.C. – are also mulling restrictions on cashless stores. Another recent Grub Street piece, More Restaurants and Cafés Refuse to Accept Cash — That’s Not a Good Thing (“Just because you don’t have a piece of plastic, you can’t get a sandwich?”), describes the cashless trend in more detail.
Torres regards cash bans as both classist and racist:
Why do you think cashless business models “gentrify the marketplace”?
On the surface, cashlessness seems benign, but when you reflect on it, the insidious racism that underlies a cashless business model becomes clear. In some ways, making a card a requirement for consumption is analogous to making identification a requirement for voting. The effect is the same: It disempowers communities of color.
These are public accommodations. The Civil Rights Act established a framework for prohibiting discrimination in matters of housing, employment, and public accommodations. If you’re intent on a cashless business model, it will have the effect of excluding lower-income communities of color from what should be an open and free market.
And we’ll start to attach a certain stigma to people who pay for things with cash?
Exactly, in the same way that one might stigmatize [Electronic Benefit Transfer (EBT)] cards. When I was growing up, I remember the embarrassment that surrounded the use of food stamps. We live in a society where it’s not enough to stigmatize poverty; we are also going to stigmatize the means with which poor people pay for goods and services.
More Consumers Abandon Cash
Despite these pushback measures, last week the Pew Research Center reported in More Americans are making no weekly purchases with cash that roughly 29% of US adults say they make no purchases using cash during a typical week – up from 25% in 2015. At the same time, the share of those who claim to make all or almost all of their weekly purchases with cash has dropped from 25% in 2015 to 18% today. (...)
As Figure 2 makes clear, declining use of cash correlates heavily with income. Adults with an annual household income of $75,000 or more are more than twice as likely as those earning less than $30,000 a year to eschew cash purchases in a typical week (41% compared to 18%), whereas more than four times as many lower-income Americans report they make all or almost all of their purchases using cash, compared to higher-income Americans (29% vs. 7%).
Last week, the Washington Post noted in The global cashless movement meets its foe: Local government that one reason for the higher reliance of lower-income Americans on cash is their restricted access to financial services:
According to FDIC estimates, 6.5 percent of American households were unbanked in 2017, meaning they did not have an account with an insured financial institution. Another 18.7 percent of households in the United States have a checking or savings account but still relied on financial services outside of a traditional bank — such as payday loans or check-cashing businesses — the estimate showed.
Over to Grub Street and Torres again for a trenchant summary of the main issue:
What do you make of the claim, “But these days everyone has a card!”
People who say that are living in a bubble of privilege — they look around and all their friends have cards. In response I say, “Does it occur to you that your world is pretty unrepresentative?” There are hundreds of thousands of New Yorkers who may have no permanent address or home, and many New Yorkers who are underbanked, either because of poverty or because they lack documentation. Requiring a card is erecting a barrier for low-income New Yorkers — period — and it’s coming from the very communities that claim to be progressive, as if, “Well, I am all for racial justice just so long as it doesn’t come at the expense of my own privilege.”
I think that many of these places actively want to keep a certain type of person out.
Of course! Earlier I said that no matter what the intention was, its effect is discriminatory, but I do think that it can also be intentional where the idea is to filter out the deplorables.
Even advocates of cashless transactions concede critics have a point – but reject the stark conclusion that the purpose of cashless policies is to exclude certain types of customers.
by Jerri-Lynn Scofield, Naked Capitalism | Read more:
Image: Pew Research Center
[ed. Not only that, but in an emergency if the electrical grid goes down the only thing that works is cash (no ATMs). Also, it's harder for government (and banks) to do funny things with your money if it's not just in bits and bytes.]
How the Seahawks Dismantled the Legion of Boom and Still Thrived
The Seahawks are this year’s surprise outfit. It feels like a long, long time since Seattle went through the will-they-won’t-they Earl Thomas dating game; since John Schneider and Pete Carroll detonated the Legion of Boom era and kicked Richard Sherman and Michael Bennett to the curb; since Cliff Avril and Kam Chancellor were forced to retire; since Thomas flipped off his own sideline in an act of understandable insubordination.
Seattle entered the season with few expectations. Vegas odds placed their chances at a Super Bowl a hair ahead of the Browns, and any Seahawks discussion elicited a shrug. Unless, of course, you wanted to talk about the glory days and how different (read: boring) this year was going to be.
Except Carroll hasn’t had a blah team in almost two decades and, like Andy Dufresne, the 2018 Seahawks have emerged triumphant on the other side of all the melodrama. They’re 8-5, heading for the playoffs and peaking at the right time. They’re eighth in weighted DVOA, which assesses a team’s most recent performances to indicate how well they are playing right now rather than over the course of the entire season. They’re one of only eight teams with a point differential over 70, ahead of the Patriots, Cowboys, and Steelers, despite playing in seven one-score games.
Carroll and company transitioned the organization from one led by its defense to one led by Russell Wilson and the offense. It makes sense, too: having a long-term franchise quarterback is more stable than consistently fielding an elite defense — players get hurt, free agency saps talent, age and attrition begin to take over. A very good quarterback – which Russell Wilson is – can overcome some of those problems on his side of the ball.
Carroll doubled down on his belief that a ground-and-pound, power-running game can still succeed in the era of pace-and-space. It’s worked. Seattle are second in the league in power-run success, trailing just the Ravens’ rush-only offense. While Carroll deserves serious coach of the year consideration, his supporting cast have been impressive too. Offensive coordinator Brian Schottenheimer has done a brilliant job (stunning, I know) coaching around the limitations on the team’s offense. Mike Solari replaced Tom Cable, a man who makes Brick Tamland look like Jean-Paul Sartre, as offensive-line coach, and the unit, predictably, improved (under Cable the Raiders’ offensive line has submarined, for what it’s worth).
Seattle haven’t relied wholly on their rushing game though. They’ve benefited from Wilson’s rare brand of escape magic to create plays on the fly, and his connection with Tyler Lockett has been the most efficient quarterback-receiver partnership in recent years. Doug Baldwin is the guy who makes the whole thing sing, though. Wilson is a different quarterback when Baldwin is on the field. With Baldwin in 2018, he has a touchdown-to-interception ratio of 11.5 (23-2). Without Baldwin that number collapses to 1.5 (6-4), his completion percentage drops by seven points; his passer rating by 41. Almost as importantly, Wilson’s average yards per target drops from 8.68 to 6.96. To put it simply: without Baldwin Wilson goes from an excellent quarterback to an average one. (...)
Perhaps most importantly, Pete Carroll has reignited the sense of camaraderie that had dissipated in recent years. Fans loved the early bombast of the Legion of Boom; they grew tired of it by the end – and the players grew tired of the organization itself. Meanwhile, Seattle’s 2018 band of upstart free-agent castoffs and young pups seem to be relishing the chance to just play. There’s no drama.
by Oliver Connolly, The Guardian | Read more:
Image: Joe Nicholson/USA Today Sports
[ed. They are a surprise this year, but one of the main reasons isn't even mentioned in this article: Bobby Wagner. The second most important man on the team and one of the best middle linebackers to ever play the game (and possible future Hall of Famer). Go Hawks!]
Saturday, December 15, 2018
What the Media Gets Wrong About Opioids
After Jillian Bauer-Reese created an online collection of opioid recovery stories, she began to get calls for help from reporters. But she was dismayed by the narrowness of the requests, which sought only one type of interviewee.
“They were looking for people who had started on a prescription from a doctor or a dentist,” says Bauer-Reese, an assistant professor of journalism at Temple University in Philadelphia. “They had essentially identified a story that they wanted to tell and were looking for a character who could tell that story.”
Although this profile doesn’t fit most people who become addicted, it is typical in reporting on opioids. Often, stories focus exclusively on people whose use started with a prescription; take this, from CNN (“It all started with pain killers after a dentist appointment.”), and this, from New York’s NBC affiliate (“He started taking Oxycontin after a crash.”).
Alternatively, reporters downplay their subjects’ earlier drug misuse to emphasize the role of the medical system, as seen in this piece from the Kansas City Star. The story, headlined “Prescription pills; addiction ‘hell,’” features a woman whose addiction supposedly started after surgery, but only later mentions that she’d previously used crystal meth for six months.
“They were looking for people who had started on a prescription from a doctor or a dentist,” says Bauer-Reese, an assistant professor of journalism at Temple University in Philadelphia. “They had essentially identified a story that they wanted to tell and were looking for a character who could tell that story.”

Alternatively, reporters downplay their subjects’ earlier drug misuse to emphasize the role of the medical system, as seen in this piece from the Kansas City Star. The story, headlined “Prescription pills; addiction ‘hell,’” features a woman whose addiction supposedly started after surgery, but only later mentions that she’d previously used crystal meth for six months.
The “relatable” story journalists and editors tend to seek—of a good girl or guy (usually, in this crisis, white) gone bad because pharma greed led to overprescribing—does not accurately characterize the most common story of opioid addiction. Most opioid patients never get addicted and most people who do get addicted didn’t start their opioid addiction with a doctor’s prescription. The result of this skewed public conversation around opioids has been policies focused relentlessly on cutting prescriptions, without regard for providing alternative treatment for either pain or addiction.
While some people become addicted after getting an opioid prescription for reasons such as a sports injury or wisdom teeth removal, 80 percent start by using drugs not prescribed to them, typically obtained from a friend or family member, according to surveys conducted for the government’s National Survey on Drug Use and Health. Most of those who misuse opioids have also already gone far beyond experimentation with marijuana and alcohol when they begin: 70 percent have previously taken drugs such as cocaine or methamphetamine.
Conversely, a 2016 review published in the New England Journal of Medicine and co-authored by Dr. Nora Volkow, director of the National Institute on Drug Abuse, put the risk of new addiction at less than 8 percent for people prescribed opioids for chronic pain. Since 90 percent of all addictions begin in the teens or early 20s, the risk for the typical middle-aged or older adult with chronic pain is actually even lower.
This does not in any way absolve the pharmaceutical industry. Companies like Purdue Pharma, the maker of OxyContin, profited egregiously by minimizing the risks of prescribing in general medicine. Purdue also lied about how long OxyContin’s effects last (a factor that affects addiction risk) and literally gave salespeople quotas to push doctors to push opioids.
The industry flooded the country with opioids and excellent journalism has exposed this part of the problem. But journalists need to become more familiar with who is most at risk of addiction and why—and to understand the utter disconnect between science and policy—if we are to accurately inform our audience.
The Innocent Victim Narrative
The reporters who called Bauer-Reese were not ill-intentioned in seeking the most sympathetic addiction stories; it is genuinely altruistic to want to portray those who are suffering in a way that is most likely to move readers and viewers to act compassionately. But such cases can have an unintended side effect: highlighting “innocent” white people whose opioid addiction seems to have begun in a doctor’s office sets up a clear contrast with the “guilt” of people whose addiction starts on the streets.
This is a result of racist drug policies that began decades ago. The war on drugs declared by Richard Nixon in 1971 was part of the Republican “Southern strategy,” which used code words like “drugs,” “crime,” and “urban” to signal racist white voters that the party was on their side. When Ronald Reagan doubled down on harsh law enforcement during the crack years, he merely intensified that strategy. (...)
Now that the problem is seen as “white,” however, socioeconomic factors and other reasons that people turn to drugs are more commonly discussed. The result is that today’s white drug users are portrayed as inherently less culpable than the black people who were caught up in the crack epidemic of the ’80s and ’90s.
Craig Reinarman, professor of sociology emeritus at the University of California, Santa Cruz, has documented biased coverage of addiction since before the crack era. “Now that the iconic user is white and middle class, the answer is no longer a jail cell for every addict, it’s a treatment bed,” he says. The biased coverage ends up perpetuating a public perception that some drug use, usually by African Americans, is criminal while other drug use, usually by white people, is not. (...)
It’s important for journalists to understand that criminalization is not some sort of natural fact, and laws are not necessarily made for rational reasons. Our system does not reflect the relative risks of various drugs; legal ones are among the most harmful in terms of their pharmacological effects. With the exception of the legislation that resulted in the creation and maintenance of the FDA, our drug laws were actually born in a series of racist panics that had nothing to do with the relative harms of actual substances.
In order to do better, journalists must recognize that addiction is not simply a result of exposure to a drug, and that “innocence” isn’t at issue. The critical risk factors for addiction are child trauma, mental illness, and economic factors like unemployment and poverty. The “innocent victim” narrative focuses on individual choice and ignores these factors, along with the dysfunctional nature of the entire system that determines a drug’s legal status. (...)
The critical difference between addiction and dependence becomes clear when you look at specific drugs. Crack cocaine, for example, doesn’t cause severe physical withdrawal symptoms, but it’s one of the most addictive drugs known. Antidepressants like Prozac, meanwhile, don’t produce compulsive craving the way cocaine can, but some have severe withdrawal syndromes.
Needing opioids for pain alone, then, doesn’t meet the criteria for addiction. If the consequences of drug use are positive and the benefits outweigh the harm from side effects, then that use is no different from taking any other daily medication. Dependence in and of itself isn’t a problem unless the drug isn’t working or is more harmful than it is helpful.
Unfortunately, while the scientific understanding has changed to reflect these facts, the press hasn’t caught up. The Washington Post conducted a poll of pain patients on opioids that labeled one third of them as addicted after they responded “yes” to a question that asked whether they were “addicted or dependent,” without defining either term. A CBS affiliate in Chicago talked about treating “opioid dependence” when they actually meant “addiction”; this CNN story has the same problem.
This would be a mere semantic issue if it didn’t have such awful effects on policy. Conflating addiction and dependence results in harm to pain patients, children exposed to opioids in utero, and people who take medication to treat addiction.
by Maia Szalavitz, CJR | Read more:
Image: Pixabay
Springsteen on Netflix
With Netflix’s faithful film version of “Springsteen on Broadway,” there’s no need to re-review the show itself. What my colleague Jesse Green wrote when it opened in October 2017 still stands: “As portraits of artists go, there may never have been anything as real — and beautiful — on Broadway.”
Bruce Springsteen’s solo monologue-plus-concert was sold out far in advance during its entire run, with an average face-value ticket price of around $500. For the last performance (on Saturday, the day before Netflix is releasing the film), resale tickets are currently running from $3,000 to well over $40,000 each. Making the show available for the cost of a streaming subscription is an unqualified boon, a greater contribution to the public good than our civic institutions seem capable of at the moment.
Admittedly, the feeling of being in the audience at the Walter Kerr Theater, sharing the distinct but equally electric currents of an unplugged rock show, a cadenced sermon and a shrewdly theatrical entertainment, can’t be replicated. The live experience is inimitable, and the post-show emotional high as you walk out of the theater probably can’t be duplicated, either.
But the film, directed by Thom Zimny and shot by Joe DeSalvo at two private performances this year, has its own compensations. “Springsteen on Broadway” has sold out on the strength of its star’s connection with his huge fan base, and the opportunity to see him do a clutch of his best-known songs in a relatively small setting. But the show’s revelation — and the reason it actually worked so well — was his ability to take the stagecraft he’d honed in rock clubs and arenas and transfer it so effortlessly to the theater.
It’s a master class in pacing, dynamics, modulation of volume and tone, and the film brings you right up onstage with Springsteen, giving you a more intimate view of his technique — understated, seemingly casual but absolutely controlled — than you could get in the theater. Each expression, gesture, artful hesitation and sly punch line is zeroed in on, framed for our appreciation.
Zimny, who served as his own editor, presents the show unadorned, almost entirely without directorial intervention — it’s just Springsteen onstage, joined for two songs by his wife and fellow E Street Band member, Patti Scialfa. The one noticeable strategy Zimny employs has to do with the audience, which is unseen during the first half of the film, when Springsteen delivers a series of vignettes about his childhood and his beginnings as a musician. Zimny films these highly personal anecdotes, and their accompanying songs, in close-ups and medium shots that don’t stray beyond the stage.
In the show’s second half, as Springsteen’s text opens up (and loses some of its poetic intensity) to encompass themes like fatherhood, relationships and the current political moment, Zimny gradually opens up, too, showing us hints of the audience members. They finally appear in full during the rousing closing performance of “Born to Run,” and the film ends on a note of community, with the Boss reaching across the lights to shake hands with his fans.
by Mike Hale, NY Times | Read more:
Image: Kevin Mazur/Netflix
Friday, December 14, 2018
A New Connection between the Gut and Brain
It is well known that a high salt diet leads to high blood pressure, a risk factor for an array of health problems, including heart disease and stroke. But over the last decade, studies across human populations have reported the association between salt intake and stroke irrespective of high blood pressure and risk of heart disease, suggesting a missing link between salt intake and brain health.
Interestingly, there is a growing body of work showing that there is communication between the gut and brain, now commonly dubbed the gut–brain axis. The disruption of the gut–brain axis contributes to a diverse range of diseases, including Parkinson’s disease and irritable bowel syndrome. Consequently, the developing field of gut–brain axis research is rapidly growing and evolving. Five years ago, a couple of studies showed that high salt intake leads to profound immune changes in the gut, resulting in increased vulnerability of the brain to autoimmunity—when the immune system attacks its own healthy cells and tissues by mistake, suggesting that perhaps the gut can communicate with the brain via immune signaling.
Now, new research shows another connection: immune signals sent from the gut can compromise the brain’s blood vessels, leading to deteriorated brain health and cognitive impairment. Surprisingly, the research unveils a previously undescribed gut–brain connection mediated by the immune system and indicates that excessive salt might negatively impact brain health in humans by impairing the brain’s blood vessels, regardless of its effect on blood pressure.
This research proposes new therapeutic targets for countering stroke—the second leading cause of death worldwide—and cognitive dysfunction. Reducing salt intake is applicable to people around the globe, as nearly every adult consumes too much salt: on average 9–12 grams per day or around twice the recommended maximum level of intake (5 grams) by the World Health Organization. (...)
The implications of this newly identified gut–brain connection extend to several autoimmune disorders, including multiple sclerosis, rheumatoid arthritis, psoriasis, and inflammatory bowel disease, that have been shown to activate the same immune signaling pathway implicated in this study. These autoimmune disorders have a high stroke risk and are linked to poorly functioning blood vessels in the nervous system. This research is also a demonstration that what we eat affects how we think, and that seemingly isolated parts of the body can play vital roles in brain health. These results motivate research on how everyday stressors to our digestive systems and blood vessels might change the brain and, consequently, how we see, and experience, the world.
by Jonathan D. Grinstein, Scientific American | Read more:
Image: Getty
Dave Matthews & Tim Reynolds
[ed. Ok Dave, next time you take the hard part...]
What We Don't See
Breaking News! -- as NBC Nightly News anchor Lester Holt often puts it when beginning his evening broadcast. Here, in summary, is my view of the news that’s breaking in the United States on just about any day of the week:
Trump. Trump. Trump. Trump. Trump.
Or rather (in the president’s style):
Trump! Trump! Trump! Trump! Trump!!!!!!!! (...)
After all, as hard as it may still be to believe, HE looms over our lives, our planet, in a way no other human being ever has, not even a Joseph Stalin or a Mao Zedong, whose images were once plastered all over the Soviet Union and China. Even the staggering attention recently paid to an otherwise less than overwhelming dead president, one George H.W. Bush, could only have occurred because, in his relative diffidence, he seemed like the un-Trump of some long gone moment. The blanket coverage was, in other words, really just another version of Trump! Trump! Trump! Trump! Trump!!!!!!!!
All in all, check off these first two presidential years of his as a bravura performance, which shouldn’t really surprise any of us. What was he, after all, but a whiz of a performer long before he hit the White House? And what are we -- the media and the rest of us -- but (whether we like it or not, whether we care to be or not) his apprentices?
Now, for a little breaking news of another sort! Unbelievably enough, despite all evidence to the contrary, there’s still an actual world out there somewhere, even if Donald Trump’s shambling 72-year-old figure has thrown so much of it into shadow. I’m talking about a world -- or parts of it, anyway -- that doesn’t test well in focus groups and isn’t guaranteed, like this American president, to keep eyes eternally (or even faintly) glued to screens, a world that, in the age of Donald Trump, goes surprisingly unnoted and unnoticed.
So consider the rest of this piece the most minimalist partial rundown on, in particular, an American imperial world of war and preparations for the same, that is, but shouldn’t be, in the shadows; that shouldn’t be, but often is dealt with as if it existed on the far side of nowhere.
What We Don’t See
Let’s start with the only situation I can recall in which Donald Trump implicitly declared himself to be an apprentice. In the wake of the roadside-bomb deaths of three American soldiers in Afghanistan (a fourth would die later) -- neither Donald Trump nor anyone else in Washington gives a damn, of course, about the escalating numbers of dead Afghans, military and civilian -- the president expressed his condolences in an interview with the Washington Post. He then went on to explain why he (and so we) were still in Afghanistan (14,000 or so U.S. military personnel, a vast array of American air power, and nearly 27,000 private contractors). “We’re there,” he said, “because virtually every expert that I have and speak to say[s] if we don’t go there, they’re going to be fighting over here. And I’ve heard it over and over again.”
Those “experts” are undoubtedly from among the very crew who have, over the last 17-plus years, helped fight the war in Afghanistan to what top U.S. commanders now call a “stalemate,” which might otherwise be defined as the edge of defeat. In those years, before Donald Trump entered the Oval Office threatening to dump the longest war in American history, it had largely disappeared from American consciousness. So had much else about this country’s still-spreading wars and the still-growing war state that went with them.
In other words, none of what’s now happening in Afghanistan and elsewhere is either unique to, or even attributable to, the Trumpian moment. This president has merely brought to a head a process long underway in which America’s never-ending war on terror, which might more accurately be thought of as a war to spread terror, had long ago retreated to the far side of nowhere.
Similarly, the war state in Washington, funded in a fashion that no other set of countries on this planet even comes close to, and growing in preeminence, power, and influence by the year, continues to go largely unnoticed. Today, it is noted only in terms of Donald Trump, only to the degree that he blasts its members or former members for their attitudes toward him, only to the degree to which his followers denounce “the deep state.” Meanwhile, ex-CIA, ex-NSA, and ex-FBI officials he’s excoriated suddenly morph into so many liberal heroes to be all-but-worshipped for opposing him. What they did in the “service” of their country -- from overseeing torture, warrantless wiretapping, wars, and drone assassination programs to directly intervening for the first time in an American election -- has been largely forgiven and forgotten, or even turned into bestsellerdom.
Yes, American troops (aka “warriors,” aka “heroes”) from the country’s all-volunteer force, or AVF, continue to be eternally and effusively thanked for their service in distant war zones, including by a president who speaks of “my generals” and “my military.” However, that military has essentially become the U.S. equivalent of the French Foreign Legion, an imperial police force fighting wars in distant lands while most Americans obliviously go about their business.
And who these days spends any time thinking about America’s drone wars or the assassin-in-chief in the Oval Office who orders “targeted killings” across significant parts of the planet? Yes, if you happened to read a recent piece by Spencer Ackerman at the Daily Beast, you would know that, under President Trump, the already jacked-up drone strikes of the Obama era have been jacked-up again: 238 of them in Yemen, Somalia, and Pakistan alone in the first two years of Trump’s presidency (and that doesn't even include Libya). And keep in mind that those figures also don’t include far larger numbers of drone strikes in Syria, Iraq, and Afghanistan. The numbers of dead from such strikes (civilian as well as terrorist) are essentially of no interest here.
And here’s another crucial aspect of Washington’s militarized global policies that has almost completely disappeared into the shadows. If you read a recent piece by Nick Turse at the Intercept, you would know that, across the continent of Africa, the U.S. now has at least 34 military installations, ranging from small outposts to enormous, still expanding bases. To put this in the context of the much-ballyhooed new great power struggle on Planet Earth, the Chinese have one military base on that continent (in Djibouti near the biggest U.S. base in Africa, Camp Lemonnier) and the Russians none.
In the Greater Middle East, from Afghanistan to Turkey, though it’s hard to come up with a good count, the U.S. certainly has 50 or more significant garrisons (in Afghanistan, Bahrain, Egypt, Iraq, Jordan, Israel, Oman, Qatar, and Turkey, among other places); Russia two (in Syria); and China none. In fact, never has any country garrisoned the planet in such an imperial and global fashion. The U.S. still has an estimated 800 or so military bases spread across the globe, ranging from tiny “lily pads” to garrisons the size of small American towns in what Chalmers Johnson once called its “empire of bases.” And the American high command is clearly still thinking about where further garrisons might go. As the Arctic, for instance, begins to melt big time, guess who’s moving in?
And yet, in the age of Trump, when on any given day the New York Times has scads of employees focused on the president, neither that paper nor any other mainstream media outlet finds it of interest to cover developments in that empire of bases. In other words, for the media as for the American public, one of the major ways this country presents itself to others, weapons in hand, essentially doesn’t exist.
Trump. Trump. Trump. Trump. Trump.
Or rather (in the president’s style):
Trump! Trump! Trump! Trump! Trump!!!!!!!! (...)
After all, as hard as it may still be to believe, HE looms over our lives, our planet, in a way no other human being ever has, not even a Joseph Stalin or a Mao Zedong, whose images were once plastered all over the Soviet Union and China. Even the staggering attention recently paid to an otherwise less than overwhelming dead president, one George H.W. Bush, could only have occurred because, in his relative diffidence, he seemed like the un-Trump of some long gone moment. The blanket coverage was, in other words, really just another version of Trump! Trump! Trump! Trump! Trump!!!!!!!!

Now, for a little breaking news of another sort! Unbelievably enough, despite all evidence to the contrary, there’s still an actual world out there somewhere, even if Donald Trump’s shambling 72-year-old figure has thrown so much of it into shadow. I’m talking about a world -- or parts of it, anyway -- that doesn’t test well in focus groups and isn’t guaranteed, like this American president, to keep eyes eternally (or even faintly) glued to screens, a world that, in the age of Donald Trump, goes surprisingly unnoted and unnoticed.
So consider the rest of this piece the most minimalist partial rundown on, in particular, an American imperial world of war and preparations for the same, that is, but shouldn’t be, in the shadows; that shouldn’t be, but often is dealt with as if it existed on the far side of nowhere.
What We Don’t See
Let’s start with the only situation I can recall in which Donald Trump implicitly declared himself to be an apprentice. In the wake of the roadside-bomb deaths of three American soldiers in Afghanistan (a fourth would die later) -- neither Donald Trump nor anyone else in Washington gives a damn, of course, about the escalating numbers of dead Afghans, military and civilian -- the president expressed his condolences in an interview with the Washington Post. He then went on to explain why he (and so we) were still in Afghanistan (14,000 or so U.S. military personnel, a vast array of American air power, and nearly 27,000 private contractors). “We’re there,” he said, “because virtually every expert that I have and speak to say[s] if we don’t go there, they’re going to be fighting over here. And I’ve heard it over and over again.”
Those “experts” are undoubtedly from among the very crew who have, over the last 17-plus years, helped fight the war in Afghanistan to what top U.S. commanders now call a “stalemate,” which might otherwise be defined as the edge of defeat. In those years, before Donald Trump entered the Oval Office threatening to dump the longest war in American history, it had largely disappeared from American consciousness. So had much else about this country’s still-spreading wars and the still-growing war state that went with them.
In other words, none of what’s now happening in Afghanistan and elsewhere is either unique to, or even attributable to, the Trumpian moment. This president has merely brought to a head a process long underway in which America’s never-ending war on terror, which might more accurately be thought of as a war to spread terror, had long ago retreated to the far side of nowhere.
Similarly, the war state in Washington, funded in a fashion that no other set of countries on this planet even comes close to, and growing in preeminence, power, and influence by the year, continues to go largely unnoticed. Today, it is noted only in terms of Donald Trump, only to the degree that he blasts its members or former members for their attitudes toward him, only to the degree to which his followers denounce “the deep state.” Meanwhile, ex-CIA, ex-NSA, and ex-FBI officials he’s excoriated suddenly morph into so many liberal heroes to be all-but-worshipped for opposing him. What they did in the “service” of their country -- from overseeing torture, warrantless wiretapping, wars, and drone assassination programs to directly intervening for the first time in an American election -- has been largely forgiven and forgotten, or even turned into bestsellerdom.
Yes, American troops (aka “warriors,” aka “heroes”) from the country’s all-volunteer force, or AVF, continue to be eternally and effusively thanked for their service in distant war zones, including by a president who speaks of “my generals” and “my military.” However, that military has essentially become the U.S. equivalent of the French Foreign Legion, an imperial police force fighting wars in distant lands while most Americans obliviously go about their business.
And who these days spends any time thinking about America’s drone wars or the assassin-in-chief in the Oval Office who orders “targeted killings” across significant parts of the planet? Yes, if you happened to read a recent piece by Spencer Ackerman at the Daily Beast, you would know that, under President Trump, the already jacked-up drone strikes of the Obama era have been jacked-up again: 238 of them in Yemen, Somalia, and Pakistan alone in the first two years of Trump’s presidency (and that doesn’t even include Libya). And keep in mind that those figures also don’t include far larger numbers of drone strikes in Syria, Iraq, and Afghanistan. The numbers of dead from such strikes (civilian as well as terrorist) are essentially of no interest here.
And here’s another crucial aspect of Washington’s militarized global policies that has almost completely disappeared into the shadows. If you read a recent piece by Nick Turse at the Intercept, you would know that, across the continent of Africa, the U.S. now has at least 34 military installations, ranging from small outposts to enormous, still expanding bases. To put this in the context of the much-ballyhooed new great power struggle on Planet Earth, the Chinese have one military base on that continent (in Djibouti near the biggest U.S. base in Africa, Camp Lemonnier) and the Russians none.
In the Greater Middle East, from Afghanistan to Turkey, though it’s hard to come up with a good count, the U.S. certainly has 50 or more significant garrisons (in Afghanistan, Bahrain, Egypt, Iraq, Jordan, Israel, Oman, Qatar, and Turkey, among other places); Russia two (in Syria); and China none. In fact, never has any country garrisoned the planet in such an imperial and global fashion. The U.S. still has an estimated 800 or so military bases spread across the globe, ranging from tiny “lily pads” to garrisons the size of small American towns in what Chalmers Johnson once called its “empire of bases.” And the American high command is clearly still thinking about where further garrisons might go. As the Arctic, for instance, begins to melt big time, guess who’s moving in?
And yet, in the age of Trump, when on any given day the New York Times has scads of employees focused on the president, neither that paper nor any other mainstream media outlet finds it of interest to cover developments in that empire of bases. In other words, for the media as for the American public, one of the major ways this country presents itself to others, weapons in hand, essentially doesn’t exist.
by Tom Engelhardt, Tom Dispatch | Read more:
Image: Wikipedia
[ed. See also: Wall Street, Banks, and Angry Citizens and Biological Annihilation.]