Friday, June 15, 2018
A Walk in the Woods
On the afternoon of July 5, 1983, three adult supervisors and a group of youngsters set up camp at a popular spot beside Lake Canimina in the fragrant pine forests of western Quebec, about eighty miles north of Ottawa, in a park called La Verendrye Provincial Reserve. They cooked dinner and, afterwards, in the correct fashion, secured their food in a bag and carried it a hundred or so feet into the woods, where they suspended it above the ground between two trees, out of the reach of bears.
About midnight, a black bear came prowling around the margins of the camp, spied the bag, and brought it down by climbing one of the trees and breaking a branch. He plundered the food and departed, but an hour later he was back, this time entering the camp itself, drawn by the lingering smell of cooked meat in the campers’ clothes and hair, in their sleeping bags and tent fabric. It was to be a long night for the Canimina party. Three times between midnight and 3:30 A.M. the bear came to the camp.
Imagine, if you will, lying in the dark alone in a little tent, nothing but a few microns of trembling nylon between you and the chill night air, listening to a 400-pound bear moving around your campsite. Imagine its quiet grunts and mysterious snufflings, the clatter of upended cookware and sounds of moist gnawings, the pad of its feet and the heaviness of its breath, the singing brush of its haunch along your tent side. Imagine the hot flood of adrenaline, that unwelcome tingling in the back of your arms, at the sudden rough bump of its snout against the foot of your tent, the alarming wild wobble of your frail shell as it roots through the backpack that you left casually propped by the entrance—with, you suddenly recall, a Snickers in the pouch. Bears adore Snickers, you’ve heard.
And then the dull thought—oh, God—that perhaps you brought the Snickers in here with you, that it’s somewhere in here, down by your feet or underneath you or—oh, shit, here it is. Another bump of grunting head against the tent, this time near your shoulders. More crazy wobble. Then silence, a very long silence, and—wait, shhhhh…yes! —the unutterable relief of realizing that the bear has withdrawn to the other side of the camp or shambled back into the woods. I tell you right now, I couldn’t stand it.
So imagine then what it must have been like for poor little David Anderson, aged twelve, when at 3:30 A.M., on the third foray, his tent was abruptly rent with a swipe of claw and the bear, driven to distraction by the rich, unfixable, everywhere aroma of hamburger, bit hard into a flinching limb and dragged him shouting and flailing through the camp and into the woods. In the few moments it took the boy’s fellow campers to unzip themselves from their accoutrements—and imagine, if you will, trying to swim out of suddenly voluminous sleeping bags, take up flashlights and makeshift cudgels, undo tent zips with helplessly fumbling fingers, and give chase—in those few moments, poor little David Anderson was dead.
Now imagine reading a nonfiction book packed with stories such as this—true tales soberly related—just before setting off alone on a camping trip of your own into the North American wilderness. The book to which I refer is Bear Attacks: Their Cause and Avoidance, by a Canadian academic named Stephen Herrero. If it is not the last word on the subject, then I really, really, really do not wish to hear the last word. Through long winter nights in New Hampshire, while snow piled up outdoors and my wife slumbered peacefully beside me, I lay saucer-eyed in bed reading clinically precise accounts of people gnawed pulpy in their sleeping bags, plucked whimpering from trees, even noiselessly stalked (I didn’t know this happened!) as they sauntered unawares down leafy paths or cooled their feet in mountain streams. People whose one fatal mistake was to smooth their hair with a dab of aromatic gel, or eat juicy meat, or tuck a Snickers in their shirt pocket for later, or have sex, or even, possibly, menstruate, or in some small, inadvertent way pique the olfactory properties of the hungry bear. Or, come to that, whose fatal failing was simply to be very, very unfortunate—to round a bend and find a moody male blocking the path, head rocking appraisingly, or wander unwittingly into the territory of a bear too slowed by age or idleness to chase down fleeter prey.
Now it is important to establish right away that the possibility of a serious bear attack on the Appalachian Trail is remote. To begin with, the really terrifying American bear—the grizzly, Ursus horribilis, as it is so vividly and correctly labeled—doesn’t range east of the Mississippi, which is good news because grizzlies are large, powerful, and ferociously bad-tempered. When Lewis and Clark went into the wilderness, they found that nothing unnerved the native Indians more than the grizzly, and not surprisingly since you could riddle a grizzly with arrows—positively porcupine it—and it would still keep coming. Even Lewis and Clark with their big guns were astounded and unsettled by the ability of the grizzly to absorb volleys of lead with barely a wobble. (...)
If I were to be pawed and chewed—and this seemed to me entirely possible, the more I read—it would be by a black bear, Ursus americanus. There are at least 500,000 black bears in North America, possibly as many as 700,000. They are notably common in the hills along the Appalachian Trail (indeed, they often use the trail, for convenience), and their numbers are growing. Grizzlies, by contrast, number no more than 35,000 in the whole of North America, and just 1,000 in the mainland United States, principally in and around Yellowstone National Park. Of the two species, black bears are generally smaller (though this is a decidedly relative condition; a male black bear can still weigh up to 650 pounds) and unquestionably more retiring.
Black bears rarely attack. But here’s the thing. Sometimes they do. All bears are agile, cunning, and immensely strong, and they are always hungry. If they want to kill you and eat you, they can, and pretty much whenever they want. That doesn’t happen often, but—and here is the absolutely salient point—once would be enough.

by Bill Bryson, LitHub | Read more:
Image: Bess Sadler/Flickr
Larry David and the Game Theory of Anonymous Donations
In a Curb Your Enthusiasm episode from 2007, Larry David and his wife Cheryl and their friends attend a ceremony to celebrate his public donation to the Natural Resources Defense Council, a non-profit environmental advocacy group. Little does he know that the actor Ted Danson, his arch-frenemy, also donated money, but anonymously. “Now it looks like I just did mine for the credit as opposed to Mr. Wonderful Anonymous,” David tells Cheryl. David feels upstaged, as if his public donation has been transformed from a generous gesture to an egotistical one. Cheryl says, about Danson, “Isn’t that great? He donated the whole wing. Didn’t want anybody to know.” “I didn’t need the world to know either!” David says. “Nobody told me I could be ‘anonymous’ and tell people!” He would have done it Danson’s way, he says, but, realizing the contradiction, he fumes, “You can’t have it halfway! You’re either anonymous, or you’re not.” What Danson did, David concludes, is “fake philanthropy and faux anonymity!”
I thought of this scene this week, after reading a new study in Nature Human Behaviour. “People sometimes make their admirable deeds and accomplishments hard to spot, such as by giving anonymously or avoiding bragging,” write the authors—Moshe Hoffman, Christian Hilbe, and Martin A. Nowak, evolutionary biologists from Harvard University and the Institute of Science and Technology Austria. But if “we give to gain reputational benefits, why would we ever wish to hide the fact that we gave?”
The answer to this question may seem less mysterious for anyone who’s seen that Curb episode, “The Anonymous Donor.” We “hide” the fact that we gave precisely for the reputational benefits. For example, at the ceremony, when Danson pops over to David, who’s chatting with then-California Senator Barbara Boxer, she calls Danson a “hero” and stands in awe of the altruism of his “anonymous” donation. Danson playfully shushes her—he’s meant to have only told one or two people but everyone seems to know. David can’t believe it, and later resolves to always donate anonymously for the sake of his reputation.
The episode hits on exactly what Hoffman, Hilbe, and Nowak describe in their paper. “Donations are never fully anonymous,” they write. “These donations are often revealed to the recipient, the inner circle of friends or fellow do-gooders,” and these “few privy observers, in turn, do not only learn that the donor is generous” but are “also likely to infer that the generosity was not motivated by immediate fame or the desire for recognition from the masses…”—exactly what everyone seemed to figure was true of David, to his chagrin.
What’s intriguing about anonymous giving, and other behaviors apparently designed to obscure good traits and acts, like modesty, is that it’s “hard to reconcile with standard evolutionary accounts of pro-social behavior,” the researchers write. Donations fall under a form of cooperation called “indirect reciprocity.” “Direct reciprocity is like a barter economy based on the immediate exchange of goods, while indirect reciprocity resembles the invention of money,” Nowak wrote in his highly cited 2006 paper “Five Rules for the Evolution of Cooperation.” “The money that fuels the engines of indirect reciprocity is reputation.” Donation evolved, in other words, because it granted a good reputation, which helped humans in securing mates and cementing alliances. But if that’s true, how did the practice of anonymous giving arise? The title of the new paper suggests a solution: “The signal-burying game can explain why we obscure positive traits and good deeds.”
The signal-burying game is one of the latest examples of scientists gaining insight into human behavior from game theoretic models and signalling theory. These games, the authors write, make sense of “seemingly counterintuitive behaviors by carefully analyzing which information these behaviors convey in a given context.” Geoffrey Miller, an evolutionary psychologist at the University of New Mexico, said recently on Sam Harris’ podcast, “Waking Up,” “Signalling theory is probably the part of game theory I use most often. The idea there is: How do you credibly demonstrate what kind of organism you are through the signals you give out? And what makes those signals honest, and hard to fake, rather than easily faked, like cheap talk?”
In the signal-burying game, a sender and a receiver pair up randomly, and are rewarded for the kind of match that is made. There are three types of senders (or donors)—low, medium, and high—and two types of receivers—weakly selecting and strongly selecting, or weak and strong for short. A strong receiver corresponds to one of the donor’s close friends or a fellow altruist, and a weak receiver to a member of the general public. The best payoff results from a strong receiver partnering with a high sender, while the worst payoff results from a weak receiver partnering with a high sender.
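To make that structure concrete, here is a minimal sketch in Python of the matching-and-payoff step just described. It is not the authors' model: the article fixes only the ordering of outcomes (high sender with strong receiver is best, high sender with weak receiver is worst), so the payoff numbers below are invented placeholders, and the signalling choices and costs of the full signal-burying game are omitted.

```python
# Toy sketch of random sender/receiver matching with assumed payoffs.
import random

SENDER_TYPES = ["low", "medium", "high"]
RECEIVER_TYPES = ["weak", "strong"]

# Assumed payoffs for a (sender, receiver) match. Only the ordering of the
# "high" rows comes from the article; the other values are placeholders.
PAYOFFS = {
    ("high", "strong"): 3,    # best match: close friend / fellow altruist
    ("medium", "strong"): 1,
    ("low", "strong"): 0,
    ("high", "weak"): -2,     # worst match: generosity read as self-promotion
    ("medium", "weak"): 1,
    ("low", "weak"): 0,
}

def play_round(rng: random.Random) -> int:
    """Pair a random sender with a random receiver; return the match payoff."""
    sender = rng.choice(SENDER_TYPES)
    receiver = rng.choice(RECEIVER_TYPES)
    return PAYOFFS[(sender, receiver)]

rng = random.Random(0)
rounds = 10_000
total = sum(play_round(rng) for _ in range(rounds))
print(f"average payoff per round: {total / rounds:.3f}")
```

Running it simply averages the assumed payoffs over random pairings; the interesting dynamics in the paper come from senders choosing whether, and how visibly, to signal.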

by Brian Gallagher, Nautilus | Read more:
Image: David Hume Kennerly / Getty
Thursday, June 14, 2018
The 2004 U.S. Open and a Sunday to Forget
SOUTHAMPTON, N.Y. — Tom Meeks was on the 11th green at Shinnecock Hills on the morning of June 20, 2004, when he got the call on the radio. It was Mike Davis, now the CEO of the USGA, but then the U.S. Open championship director. “Tom, we need you at 7 right away,” Davis said.
Meeks, hearing the urgency in his colleague’s voice, jumped onto his cart and drove to the short, par-3 seventh hole. He had spent considerable time there that morning searching for, as he put it, “the easiest possible hole location I could find.” He had finally selected a spot on the front right of the green, not wanting to go too far back on a green that sloped front-to-back and had been very difficult for players to hold the previous day during the third round.
When Meeks arrived, Davis was standing on the green with the flag lying on the ground. He had watched from a spot nearby as the first two groups of the morning flailed helplessly at a green that simply wouldn’t allow any putt of more than a foot or two to stop if it didn’t hit the hole dead center.
Those first four players—J.J. Henry, Kevin Stadler, Cliff Kresge and Billy Mayfair—had made three triple-bogeys and a bogey. Ironically, it was Mayfair, who would go on to shoot 89—the highest score of a day when 28 players failed to break 80—who made the miracle bogey.
Meeks had been setting up championship golf courses for the USGA for close to 30 years, and this was the 10th time he had been charged with setting up a U.S. Open course. His reputation was as someone who would squeeze courses to their limits, following the USGA mantra of “fast and firm” at all times.
“You have to understand that the USGA had pushed the notion that it wanted its U.S. Open to be the biggest, baddest, toughest golf tournament in the world,” said David Fay, who was the organization’s executive director from 1989 to 2010. “Heck, even before I started working there [1979] that was the image of the Open. We were in love with fast and firm. That day, we went too far.”
Even as the Open finally returns to Shinnecock 14 years later, that sunny, windy Sunday is talked about in hushed tones in USGA circles. No one broke 70 and only Robert Allenby shot 70—even par—moving him from a tie for 34th to a tie for seventh. Retief Goosen and Phil Mickelson, who finished first and second, each shot 71. Ernie Els, who began the day two shots behind Goosen, shot 80—and still finished tied for ninth.
Meeks, who’s now 77, retired from the USGA in 2005 and lives in Indianapolis. He says one man was responsible for the debacle: him.
“It was my fault,” he said last weekend. “On Saturday, I thought the golf course was perfect—exactly where we wanted it to be. I went out to dinner with Susie [who he’s been married to for almost 53 years] and some friends and I was in a great mood. I loved Shinnecock as much as any Open venue we’d ever played, and I thought we were about to walk away from a perfect week.”
by John Feinstein, Golf Digest | Read more:
Why Wall Street Isn’t Freaking
If there is a single question that I have heard more than any other during the past few months, it is: “How are markets so blasé about the endless threats to financial stability and order from the president?”
Many theories are tossed around. Some resonate more than others. What follows is a short summary of why markets haven't freaked out over the ongoing spectacle of what used to be called market-moving news, but today merely is the state of the world.
1. Just wait: I hate this arrogant, frustrating response. Typically, it comes from people who refuse to admit error after predicting disaster from the Trump presidency. It amounts to saying: The market is wrong, I know better and I will be proven right eventually. Any thesis that requires one to wait years to prove or disprove its validity is inherently annoying.
However, (you knew a however was coming, right?) momentum does sometimes carry things much further than anyone reasonably expects. During the tech bubble in the 1990s, there were overwhelming signs of absurd and inflated valuations; that went on for what seemed like a very long time before its eventual ignominious end. So too did housing excesses continue for a long time during the credit boom during the 2000s. That too, of course, ended in a debacle.
Maybe this will be one of those times, leading to crisis. Recall the maxim uttered by economist Herbert Stein: “When something is unsustainable, it will eventually stop.” I leave it up to you to decide whether whatever is happening right now is unsustainable.
2. Investors have become inured to the noise: The endless stream of blunt presidential tweets; the constant preening and posturing; the drumbeat of ego and bluster. We have a sense of what those alleged sonic attacks on U.S. consulates overseas must be like—an annoying, relentless, headache-inducing weapon of noise and disorientation.
When he was first elected, Trump could move markets and stock prices with a single scolding tweet. Now, not so much. Not only do markets barely respond to his Twitter outbursts, they seem willing to take the other side of the trade; stocks that Trump has disparaged have outperformed the ones he likes.
Why is this? The constant of human adaptability looms large. We eventually get used to just about everything (that doesn’t kill us), both positive and negative.
3. The economy is so big, it doesn’t matter: Every day, you wake up, have a cup of coffee or breakfast, get dressed, send the kids to school, go to work. We forget just how huge this is: it adds up to more than $18 trillion of annual economic activity.
A $50 billion tariff here or a $100 billion trade war there is barely a rounding error in the global economy. The U.S. economy is huge, and the global economy is four times as large. I don't want to suggest tariffs are meaningless, but they matter much less than some of the fevered imaginations around us might have you believe.
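As a back-of-envelope check on the "rounding error" claim, a few lines of Python make the proportions explicit, using only the figures quoted here:

```python
# Back-of-envelope check: ~$18 trillion of annual U.S. economic activity,
# and a global economy roughly four times as large, per the text.
us_economy = 18e12            # dollars per year
global_economy = 4 * us_economy

for tariff in (50e9, 100e9):
    print(f"${tariff / 1e9:.0f}B is {tariff / us_economy:.2%} of U.S. "
          f"activity and {tariff / global_economy:.2%} of global activity")
# -> $50B is 0.28% of U.S. activity and 0.07% of global activity
# -> $100B is 0.56% of U.S. activity and 0.14% of global activity
```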
by Barry Ritholtz, Bloomberg | Read more:
Instagram’s Wannabe-Stars Are Driving Luxury Hotels Crazy
Three years ago, Lisa Linh quit her full-time job to travel the world and document it on Instagram, where she has nearly 100,000 followers; since then, she has stayed in breathtaking hotels everywhere from Mexico to Quebec to the Cook Islands. Often, she stays for free.
Linh is part of an ever-growing class of people who have leveraged their social media clout to travel the world, frequently in luxury. While Linh and other elite influencers are usually personally invited by hotel brands, an onslaught of lesser-known wannabes has left hotels scrambling to deal with a deluge of requests for all-expense-paid vacations in exchange for some social media posts.
Kate Jones, marketing and communications manager at the Dusit Thani, a five-star resort in the Maldives, said that her hotel receives at least six requests from self-described influencers per day, typically through Instagram direct message.
“Everyone with a Facebook these days is an influencer,” she said. “People say, I want to come to the Maldives for 10 days and will do two posts on Instagram to like 2,000 followers. It's people with 600 Facebook friends saying, ‘Hi, I'm an influencer, I want to stay in your hotel for 7 days,’” she said. Others send vague one-line emails, like “I want to collaborate with you,” with no further explanation. “These people are expecting five to seven nights on average, all inclusive. Maldives is not a cheap destination.” She said that only about 10 percent of the requests she receives are worth investigating.
Jack Bedwani, who runs The Projects, a brand consulting agency that works with several top hospitality brands, said that he’s close with the PR manager for a new hotel and day club in Bali. “They get five to 20 direct inquiries a day from self-titled influencers,” he said. “The net is so wide, and the term ‘influencer’ is so loose.”
“You can sort the amateurs from the pros very quickly,” Bedwani said. “The vast majority of cold-call approaches are really badly written. It sounds like when you're texting a friend inviting yourself over for dinner—it's that colloquial. They don't give reasons why anyone should invest in having them as a guest.”
Some hotels report being so overwhelmed by influencer requests that they've simply opted out. In January, a luxury boutique hotel in Ireland made headlines for banning all YouTubers and Instagram stars after a 22-year-old requested a free five-night stay in exchange for exposure. (...)
But to influencers themselves, this is a fundamental misunderstanding of the value exchange. Instagram has ballooned to more than 800 million monthly active users, many of whom come to it for travel ideas, and influencers argue that the promotions they offer allow hotels to directly market to new audiences in an authentic way.
They're not completely wrong. Most hotels acknowledge that there's some benefit to working with influencers, it's just that determining how to work with them—and manage their requests—is a challenge.
by Taylor Lorenz, The Atlantic | Read more:
Image: Shutterstock/Blue Planet Video
Housing & Expenditures: Before, During, and After the Bubble
Housing prices in the U.S. rose sharply from the early to mid-2000s, followed by a sharp drop after 2007. This period of accelerated price increases is often called the “housing bubble” and its decline is known as the “housing bubble burst.”
Concurrent with this housing bubble bursting was a serious economic downturn. According to the National Bureau of Economic Research (NBER), the U.S. economy entered a recession in December 2007, from which it started to recover in June 2009. The period of contraction, popularly known as “the Great Recession,” was particularly serious because, for example, the monthly unemployment rate (seasonally adjusted) peaked at 10.0 percent in October 2009—a rate reached or surpassed only one other time since the unemployment data series started in 1948. The recession behind that earlier peak had ended a quarter of a century before, in November 1982.
It is reasonable to argue that the housing bubble and its bursting contributed to the onset, and exacerbated the consequences, of the Great Recession. For example, consumers purchasing near the peak of the bubble presumably faced larger mortgage payments than longer time homeowners, leading them to cut back on expenditures they might have made for other goods and services. Then, once the recession started, and home prices fell, some owners—especially new ones—presumably at minimum felt less wealthy, and therefore more constrained in spending, while others—especially those who lost their jobs because of recessionary pressures—found themselves unable to afford their homes any longer.
Given these events, details of this period deserve a closer look. For example, the housing bubble and burst is usually discussed as a national phenomenon. Yet, according to the familiar saying in real estate, “three factors matter: location, location, and location.” Therefore, did the housing bubble manifest itself differently in different parts of the country? Did the bubble affect renters as well as homeowners? And, given the importance of basic housing in the typical family’s budget—ranging from about 24 percent to 27 percent of total expenditures for the average consumer unit between 1995 and 2015—how did expenditures for other goods and services change during this period?
This Beyond the Numbers article examines the lead up to, and aftermath of, the housing bubble and burst, including changes in housing prices, housing tenure (homeownership and rental rates), and consumer spending. For these purposes, this analysis uses mainly data from the Bureau of Labor Statistics (BLS) Consumer Expenditure (CE) surveys and some data from the Census Bureau. Based on evidence from these data, the “pre-bubble” period is defined as 1995 to 2001; the “bubble” period is defined as 2002 to 2007; and the “burst” and its aftermath cover 2008 through 2015, coincident with the Great Recession (essentially 2008 to 2009) and the first years of recovery therefrom (2010 through 2015).
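For readers following along with the data, here is a minimal Python sketch of the period bucketing just defined; the year boundaries come from the article, while the function name and example years are illustrative:

```python
# Classify years into the article's three housing-market periods.
def housing_period(year: int) -> str:
    """Return the article's period label for a given year."""
    if 1995 <= year <= 2001:
        return "pre-bubble"
    if 2002 <= year <= 2007:
        return "bubble"
    if 2008 <= year <= 2015:
        return "burst and aftermath"
    raise ValueError(f"{year} is outside the 1995-2015 study window")

for y in (1998, 2005, 2009):
    print(y, "->", housing_period(y))
# 1998 -> pre-bubble, 2005 -> bubble, 2009 -> burst and aftermath
```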
by Geoffrey Paulin, The Big Picture | Read more:
Image: USBLS

Seawater Yields First Grams of Yellowcake Uranium
For the first time, researchers at Pacific Northwest National Laboratory and LCW Supercritical Technologies have created five grams of yellowcake—a powdered form of uranium used to produce fuel for nuclear power production—using acrylic fibers to extract it from seawater.
"This is a significant milestone," said Gary Gill, a researcher at PNNL, a Department of Energy national laboratory, and the only one with a marine research facility, located in Sequim, Wash. "It indicates that this approach can eventually provide commercially attractive nuclear fuel derived from the oceans—the largest source of uranium on earth."
That's where LCW, a Moscow, Idaho, clean energy company, comes in. LCW, with early support from PNNL through DOE's Office of Nuclear Energy, developed an acrylic fiber that attracts and holds on to dissolved uranium naturally present in ocean water.
"We have chemically modified regular, inexpensive yarn, to convert it into an adsorbent which is selective for uranium, efficient and reusable," said Chien Wai, president of LCW Supercritical Technologies. "PNNL's capabilities in evaluating and testing the material, have been invaluable in moving this technology forward."
Wai is a former University of Idaho professor who, along with colleague Horng-Bin Pan, was involved in earlier DOE-funded research to develop materials to increase domestic availability of uranium, most of which is currently imported into the U.S.
Wai founded LCW and, with funding from the Small Business Innovation Research program, worked out a new approach to adsorb the uranium onto a molecule, or ligand, that is chemically bound to the acrylic fiber. The result is a wavy-looking polymer adsorbent that can be deployed in a marine environment and is durable and reusable.
The adsorbent material is inexpensive, according to Wai. In fact, he said, even waste yarn can be used to create the polymer fiber. The adsorbent properties of the material are reversible, and the captured uranium is easily released to be processed into yellowcake. An analysis of the technology suggests that it could be competitive with the cost of uranium produced through land-based mining. (...)
"For each test, we put about two pounds of the fiber into the tank for about one month and pumped the seawater through quickly, to mimic conditions in the open ocean" said Gill. "LCW then extracted the uranium from the adsorbent and, from these first three tests, we got about five grams—about what a nickel weighs. It might not sound like much, but it can really add up."
"This is a significant milestone," said Gary Gill, a researcher at PNNL, a Department of Energy national laboratory, and the only one with a marine research facility, located in Sequim, Wash. "It indicates that this approach can eventually provide commercially attractive nuclear fuel derived from the oceans—the largest source of uranium on earth."

"We have chemically modified regular, inexpensive yarn, to convert it into an adsorbent which is selective for uranium, efficient and reusable," said Chien Wai, president of LCW Supercritical Technologies. "PNNL's capabilities in evaluating and testing the material, have been invaluable in moving this technology forward."
Wai is a former University of Idaho professor who, along with colleague Horng-Bin Pan, was involved in earlier DOE-funded research to develop materials in order to increase domestic availability of uranium, which is mostly imported into the U.S. currently.
Wai founded LCW and, with funding from the Small Business Innovation Research program, worked out a new approach to adsorb the uranium onto a molecule or ligand that is chemically bound to the acrylic fiber. The result is a wavy looking polymer adsorbent that can be deployed in a marine environment, is durable and reusable.
The adsorbent material is inexpensive, according to Wai. In fact, he said, even waste yarn can be used to create the polymer fiber. The adsorbent properties of the material are reversible, and the captured uranium is easily released to be processed into yellowcake. An analysis of the technology suggests that it could be competitive with the cost of uranium produced through land-based mining. (...)
"For each test, we put about two pounds of the fiber into the tank for about one month and pumped the seawater through quickly, to mimic conditions in the open ocean" said Gill. "LCW then extracted the uranium from the adsorbent and, from these first three tests, we got about five grams—about what a nickel weighs. It might not sound like much, but it can really add up."
Wednesday, June 13, 2018
It Can Happen Here
Liberal democracy has enjoyed much better days. Vladimir Putin has entrenched authoritarian rule and is firmly in charge of a resurgent Russia. In global influence, China may have surpassed the United States, and Chinese president Xi Jinping is now empowered to remain in office indefinitely. In light of recent turns toward authoritarianism in Turkey, Poland, Hungary, and the Philippines, there is widespread talk of a “democratic recession.” In the United States, President Donald Trump may not be sufficiently committed to constitutional principles of democratic government.
In such a time, we might be tempted to try to learn something from earlier turns toward authoritarianism, particularly the triumphant rise of the Nazis in Germany in the 1930s. The problem is that Nazism was so horrifying and so barbaric that for many people in nations where authoritarianism is now achieving a foothold, it is hard to see parallels between Hitler’s regime and their own governments. Many accounts of the Nazi period depict a barely imaginable series of events, a nation gone mad. That makes it easy to take comfort in the thought that it can’t happen again.
But some depictions of Hitler’s rise are more intimate and personal. They focus less on well-known leaders, significant events, state propaganda, murders, and war, and more on the details of individual lives. They help explain how people can not only participate in dreadful things but also stand by quietly and live fairly ordinary days in the midst of them. They offer lessons for people who now live with genuine horrors, and also for those to whom horrors may never come but who live in nations where democratic practices and norms are under severe pressure.
Milton Mayer’s 1955 classic They Thought They Were Free, recently republished with an afterword by the Cambridge historian Richard J. Evans, was one of the first accounts of ordinary life under Nazism. Dotted with humor and written with an improbably light touch, it provides a jarring contrast with Sebastian Haffner’s devastating, unfinished 1939 memoir, Defying Hitler, which gives a moment-by-moment, you-are-there feeling to Hitler’s rise. (The manuscript was discovered by Haffner’s son after the author’s death and published in 2000 in Germany, where it became an immediate sensation.)* A much broader perspective comes from Konrad Jarausch’s Broken Lives, an effort to reconstruct the experience of Germans across the entire twentieth century. What distinguishes the three books is their sense of intimacy. They do not focus on historic figures making transformative decisions. They explore how ordinary people attempted to navigate their lives under terrible conditions.
Haffner’s real name was Raimund Pretzel. (He used a pseudonym so as not to endanger his family while in exile in England.) He was a journalist, not a historian or political theorist, but he interrupts his riveting narrative to tackle a broad question: “What is history, and where does it take place?” He objects that most works of history give “the impression that no more than a few dozen people are involved, who happen to be ‘at the helm of the ship of state’ and whose deeds and decisions form what is called history.” In his view, that’s wrong. What matters are “we anonymous others” who are not just “pawns in the chess game,” because the “most powerful dictators, ministers, and generals are powerless against the simultaneous mass decisions taken individually and almost unconsciously by the population at large.” Haffner insists on the importance of investigating “some very peculiar, very revealing, mental processes and experiences,” involving “the private lives, emotions and thoughts of individual Germans.”
Mayer had the same aim. An American journalist of German descent, he tried to meet with Hitler in 1935. He failed, but he did travel widely in Nazi Germany. Stunned to discover a mass movement rather than a tyranny of a diabolical few, he concluded that his real interest was not in Hitler but in people like himself, to whom “something had happened that had not (or at least not yet) happened to me and my fellow-countrymen.” In 1951, he returned to Germany to find out what had made Nazism possible.
In They Thought They Were Free, Mayer decided to focus on ten people, different in many respects but with one characteristic in common: they had all been members of the Nazi Party. Eventually they agreed to talk, accepting his explanation that he hoped to enable the people of his nation to have a better understanding of Germany. Mayer was truthful about that and about nearly everything else. But he did not tell them that he was a Jew.
In the late 1930s—the period that most interested Mayer—his subjects were working as a janitor, a soldier, a cabinetmaker, an office manager, a baker, a bill collector, an inspector, a high school teacher, and a police officer. One had been a high school student. All were male. None of them occupied positions of leadership or influence. All of them referred to themselves as “wir kleine Leute, we little people.” They lived in Marburg, a university town on the river Lahn, not far from Frankfurt.
Mayer talked with them over the course of a year, under informal conditions—coffee, meals, and long, relaxed evenings. He became friends with each (and throughout he refers to them as such). As he put it, with evident surprise, “I liked them. I couldn’t help it.” They could be ironic, funny, and self-deprecating. Most of them enjoyed a joke that originated in Nazi Germany: “What is an Aryan? An Aryan is a man who is tall like Hitler, blond like Goebbels, and lithe like Göring.” They also could be wise. Speaking of the views of ordinary people under Hitler, one of them asked:
Opposition? How would anybody know? How would anybody know what somebody else opposes or doesn’t oppose? That a man says he opposes or doesn’t oppose depends upon the circumstances, where, and when, and to whom, and just how he says it. And then you must still guess why he says what he says.
When Mayer returned home, he was afraid for his own country. He felt “that it was not German Man that I had met, but Man,” and that under the right conditions, he could well have turned out as his German friends did. He learned that Nazism took over Germany not “by subversion from within, but with a whoop and a holler.” Many Germans “wanted it; they got it; and they liked it.”
Mayer’s most stunning conclusion is that with one partial exception (the teacher), none of his subjects “saw Nazism as we—you and I—saw it in any respect.” Where most of us understand Nazism as a form of tyranny, Mayer’s subjects “did not know before 1933 that Nazism was evil. They did not know between 1933 and 1945 that it was evil. And they do not know it now.” Seven years after the war, they looked back on the period from 1933 to 1939 as the best time of their lives. (...)
The killing of six million Jews? Fake news. Four of Mayer’s subjects insisted that the only Jews taken to concentration camps were traitors to Germany, and that the rest were permitted to leave with their property or its fair market value. The bill collector agreed that the killing of the Jews “was wrong, unless they committed treason in wartime. And of course they did.” He added that “some say it happened and some say it didn’t,” and that you “can show me pictures of skulls…but that doesn’t prove it.” In any case, “Hitler had nothing to do with it.” The tailor spoke similarly: “If it happened, it was wrong. But I don’t believe it happened.”
With evident fatigue, the baker reported, “One had no time to think. There was so much going on.” His account was similar to that of one of Mayer’s colleagues, a German philologist in the country at the time, who emphasized the devastatingly incremental nature of the descent into tyranny and said that “we had no time to think about these dreadful things that were growing, little by little, all around us.” The philologist pointed to a regime bent on diverting its people through endless dramas (often involving real or imagined enemies), and “the gradual habituation of the people, little by little, to being governed by surprise.” In his account, “each step was so small, so inconsequential, so well explained or, on occasion, ‘regretted,’” that people could no more see it “developing from day to day than a farmer in his field sees the corn growing. One day it is over his head.”
In such a time, we might be tempted to try to learn something from earlier turns toward authoritarianism, particularly the triumphant rise of the Nazis in Germany in the 1930s. The problem is that Nazism was so horrifying and so barbaric that for many people in nations where authoritarianism is now achieving a foothold, it is hard to see parallels between Hitler’s regime and their own governments. Many accounts of the Nazi period depict a barely imaginable series of events, a nation gone mad. That makes it easy to take comfort in the thought that it can’t happen again.

Milton Mayer’s 1955 classic They Thought They Were Free, recently republished with an afterword by the Cambridge historian Richard J. Evans, was one of the first accounts of ordinary life under Nazism. Dotted with humor and written with an improbably light touch, it provides a jarring contrast with Sebastian Haffner’s devastating, unfinished 1939 memoir, Defying Hitler, which gives a moment-by-moment, you-are-there feeling to Hitler’s rise. (The manuscript was discovered by Haffner’s son after the author’s death and published in 2000 in Germany, where it became an immediate sensation.)* A much broader perspective comes from Konrad Jarausch’s Broken Lives, an effort to reconstruct the experience of Germans across the entire twentieth century. What distinguishes the three books is their sense of intimacy. They do not focus on historic figures making transformative decisions. They explore how ordinary people attempted to navigate their lives under terrible conditions.
Haffner’s real name was Raimund Pretzel. (He used a pseudonym so as not to endanger his family while in exile in England.) He was a journalist, not a historian or political theorist, but he interrupts his riveting narrative to tackle a broad question: “What is history, and where does it take place?” He objects that most works of history give “the impression that no more than a few dozen people are involved, who happen to be ‘at the helm of the ship of state’ and whose deeds and decisions form what is called history.” In his view, that’s wrong. What matters are “we anonymous others” who are not just “pawns in the chess game,” because the “most powerful dictators, ministers, and generals are powerless against the simultaneous mass decisions taken individually and almost unconsciously by the population at large.” Haffner insists on the importance of investigating “some very peculiar, very revealing, mental processes and experiences,” involving “the private lives, emotions and thoughts of individual Germans.”
Mayer had the same aim. An American journalist of German descent, he tried to meet with Hitler in 1935. He failed, but he did travel widely in Nazi Germany. Stunned to discover a mass movement rather than a tyranny of a diabolical few, he concluded that his real interest was not in Hitler but in people like himself, to whom “something had happened that had not (or at least not yet) happened to me and my fellow-countrymen.” In 1951, he returned to Germany to find out what had made Nazism possible.
In They Thought They Were Free, Mayer decided to focus on ten people, different in many respects but with one characteristic in common: they had all been members of the Nazi Party. Eventually they agreed to talk, accepting his explanation that he hoped to enable the people of his nation to have a better understanding of Germany. Mayer was truthful about that and about nearly everything else. But he did not tell them that he was a Jew.
In the late 1930s—the period that most interested Mayer—his subjects were working as a janitor, a soldier, a cabinetmaker, an office manager, a baker, a bill collector, an inspector, a high school teacher, and a police officer. One had been a high school student. All were male. None of them occupied positions of leadership or influence. All of them referred to themselves as “wir kleine Leute, we little people.” They lived in Marburg, a university town on the river Lahn, not far from Frankfurt.
Mayer talked with them over the course of a year, under informal conditions—coffee, meals, and long, relaxed evenings. He became friends with each (and throughout he refers to them as such). As he put it, with evident surprise, “I liked them. I couldn’t help it.” They could be ironic, funny, and self-deprecating. Most of them enjoyed a joke that originated in Nazi Germany: “What is an Aryan? An Aryan is a man who is tall like Hitler, blond like Goebbels, and lithe like Göring.” They also could be wise. Speaking of the views of ordinary people under Hitler, one of them asked:
Opposition? How would anybody know? How would anybody know what somebody else opposes or doesn’t oppose? That a man says he opposes or doesn’t oppose depends upon the circumstances, where, and when, and to whom, and just how he says it. And then you must still guess why he says what he says.
When Mayer returned home, he was afraid for his own country. He felt “that it was not German Man that I had met, but Man,” and that under the right conditions, he could well have turned out as his German friends did. He learned that Nazism took over Germany not “by subversion from within, but with a whoop and a holler.” Many Germans “wanted it; they got it; and they liked it.”
Mayer’s most stunning conclusion is that with one partial exception (the teacher), none of his subjects “saw Nazism as we—you and I—saw it in any respect.” Where most of us understand Nazism as a form of tyranny, Mayer’s subjects “did not know before 1933 that Nazism was evil. They did not know between 1933 and 1945 that it was evil. And they do not know it now.” Seven years after the war, they looked back on the period from 1933 to 1939 as the best time of their lives. (...)
The killing of six million Jews? Fake news. Four of Mayer’s subjects insisted that the only Jews taken to concentration camps were traitors to Germany, and that the rest were permitted to leave with their property or its fair market value. The bill collector agreed that the killing of the Jews “was wrong, unless they committed treason in wartime. And of course they did.” He added that “some say it happened and some say it didn’t,” and that you “can show me pictures of skulls…but that doesn’t prove it.” In any case, “Hitler had nothing to do with it.” The tailor spoke similarly: “If it happened, it was wrong. But I don’t believe it happened.”
With evident fatigue, the baker reported, “One had no time to think. There was so much going on.” His account was similar to that of one of Mayer’s colleagues, a German philologist in the country at the time, who emphasized the devastatingly incremental nature of the descent into tyranny and said that “we had no time to think about these dreadful things that were growing, little by little, all around us.” The philologist pointed to a regime bent on diverting its people through endless dramas (often involving real or imagined enemies), and “the gradual habituation of the people, little by little, to being governed by surprise.” In his account, “each step was so small, so inconsequential, so well explained or, on occasion, ‘regretted,’” that people could no more see it “developing from day to day than a farmer in his field sees the corn growing. One day it is over his head.”
by Cass R. Sunstein, NY Review of Books | Read more:
Image: August Sander
Tuesday, June 12, 2018
Keep Moving
A map of everywhere Anthony Bourdain visited on No Reservations, Parts Unknown, and The Layover.
via: Reddit
[ed. From the comments:
Q: I wonder what his trick was for not getting food poisoning.
A: I listened to his Fresh Air interview, and the answer seems to be "Take one for the team, eat it anyway, get the food poisoning. Eventually you grow tolerant." Apparently the whole production crew played it this way, and getting your system acclimated to dodgy food is just part of the deal. See also: Anthony Bourdain and the Hope for Better Men. Lots of good links, including Waffle House.]
Monday, June 11, 2018
The Hemp Revival: Why Marijuana's Cousin Could Soon be Big Business
Long associated with the hoariest hippie stereotypes, hemp is now chic.
The crop – which is a cannabis plant very similar to marijuana, but lacking its best-known property: getting you high – is a versatile raw material, and like its more notorious relative, it could once again become very lucrative.
The spread of marijuana legalization has sparked renewed interest in hemp – not because of hemp-wallet fanatics, but largely because of demand for CBD, a chemical that both hemp and marijuana produce and in which some see potential as a pharmaceutical and nutritional supplement.
For decades, US anti-marijuana laws have made it very difficult to experiment on and develop new uses for hemp, even though, according to the US government, hemp contains less than 0.3% THC, the plant’s primary psychoactive ingredient.
But the climate is changing. In recent weeks, the US Senate majority leader, Mitch McConnell – a conservative Republican who still opposes marijuana legalization – has called for hemp to be legalized, a move that would benefit farmers.
And it’s not just the farmers in McConnell’s home state of Kentucky who would be pleased to see a resurgence of the crop. Early this month, for Hemp History Week, the US Senate unanimously passed a non-binding resolution acknowledging hemp’s economic value and “historical relevance”.
China, though wary of marijuana, has emerged as a hemp “superpower”, according to a fascinating 2017 story in the South China Morning Post. While hemp is indigenous to China, Chinese research advanced only in the 1970s, when the military used the plant for uniforms that would be more comfortable in the Vietnamese jungle. Today, the Morning Post notes, China holds more than half of the world’s more than 600 hemp-related patents.
Hemp’s revival is only the latest chapter in a long history.
In China, hemp has been used to make fabric and rope for more than 3,000 years. A Chinese eunuch named Cai Lun, who is credited with inventing paper during the early Christian era, used hemp as one of his source materials.
Between the 16th and 18th centuries, hemp rope, sails and rigging were so vital to the British Royal Navy that the supply was considered a national security issue. Both Henry VIII and Elizabeth I encouraged growing the crop.
In the US, some of the founding fathers grew hemp. Stoner lore has it that the constitution and declaration of independence were written on hemp. It’s not true – they were written on parchment – but drafts of the documents probably were.
But interest in hemp waned as cotton, a finer fabric dependent on American slave labor, ascended. It was last a relevant crop in the US during the second world war, when the Department of Agriculture made a Hemp for Victory film and encouraged production.
Today’s hemp advocates, a passionate cohort indeed, claim paper, cloth and biofuel made from hemp are all environmentally and economically attractive relative to current production methods. If hemp became a major source for any one of these staples, it would be an immense opportunity, akin to legal marijuana.
And there are more uses for hemp still. The seeds are a good source of protein, popular with vegans. Hemp can also be made into a building material known as hempcrete, which is currently easier to access in parts of Europe than in the US. A bridge in sixth-century Gaul was built from hempcrete.
Bryan DeHaven’s Colorado-based clothing company, Chiefton Supply Co, makes T-shirts from a hemp/cotton blend. The component organic hemp is grown, processed into fabric and then stitched into apparel all at one facility in China’s Shandong province.
DeHaven said hemp’s benefits included that it needed substantially less water than cotton to grow. The resultant cloth, he said, also had anti-bacterial properties; Chiefton is working with sportswear companies on breathable hemp clothing for athletes. “Those guys are really putting our garments to the test,” he said.
Eventually, DeHaven said, he would like to see Chiefton hemp goods produced in the US “from seed to seam”. It will be difficult. Hemp is not completely illegal in the US but the current rules are wildly convoluted, probably even more so than those for marijuana.
by Alex Halperin, The Guardian | Read more:
Image: George Wylesol
Sociopathic Tendencies
There have always been spectacular stories of lies and deceit in Silicon Valley—tales that span decades, of founders telling half-truths about how their companies were founded, or who founded them; of C.E.O.s exaggerating their latest products to fool the press or induce new funding. In the tech world, these falsehoods are so pedestrian that they have received the moniker “vaporware”: empty vessels that are promoted as complete products despite the knowledge that they will never see the light of day. Over time, the exhalations of these tech C.E.O.s became less about the actual lie, and more about who could deliver it with the utmost persuasion. I remember getting a call from Steve Jobs in the beginning of my career at The New York Times, in which the mythological chief of Apple somehow convinced me not to write a story about a software-related privacy problem. After 45 minutes on the phone with Jobs, I walked over to my editor and convinced him to kill the story. Yet a week later, I realized I’d been duped by Jobs. When I told a seasoned colleague at the Times, he simply laughed and explained, “Welcome to the Steve Jobs Reality-Distortion Field.” Jobs’s chicanery helped birth a whole new strain of tech nerd who believed that, in order to be as successful as King Jobs, you had to be the best used-car salesman in the parking lot. Some C.E.O.s told taradiddles, exaggerating the number of users on their platforms (ahem, Twitter); some in Congress say Mark Zuckerberg lied when he told Congress that people on Facebook have “complete control” over their personal data. (They don’t.) But all of these, all these made-up numbers, concocted valuations, and apocryphal stories of how a company was realized in a garage, are nothing—nothing!—compared to the audacious lies of Elizabeth Holmes, the founder and C.E.O. of Theranos.
Ahh, the story of Holmes, the dedicated Stanford dropout who was set to save the world, one pinprick of blood at a time, by inventing, at 19 years old, a blood-testing start-up which was once valued at almost $10 billion. For years, Holmes was on top of the tech world, gracing the cover of T: The New York Times Style Magazine, Forbes, Fortune, and Inc., always wearing a black turtleneck and often sitting next to the title: “The Next Steve Jobs.” She was written about in Glamour and The New Yorker. She spoke at the TechCrunch Disrupt conference in 2014, and appeared on Vanity Fair’s New Establishment List in 2015. But as The Wall Street Journal’s John Carreyrou details in his new book, Bad Blood: Secrets and Lies in a Silicon Valley Startup, almost every word coming out of Holmes’s mouth as she built and ran her company was either grossly embellished or, in most instances, outright deceptive.
As Carreyrou writes, the company she built was just a pile of one deceit atop another. When Holmes courted Walgreens, she created completely false test results from their blood tests. When the company’s chief financial officer found out, Holmes fired him on the spot. Holmes told other investors that Theranos was going to make $100 million in revenue in 2014, but in reality the company was only on track to make $100,000 that year. She told the press that her blood-testing machine was capable of making over 1,000 tests, when in reality, it could only do one single type of test. She lied about a contract Theranos had with the Department of Defense, when she said her technology was being used in the battlefield, even though it was not. She repeatedly made up complete stories to the press about everything from her schooling to profits to the number of people whose lives would be saved from her bogus technology. And she did it all, day in and day out, while ensuring that no one inside or outside her company could publicly challenge the truthfulness of her claims.
While people like Jobs, Zuckerberg, Elon Musk, and other titans might stretch the truth and create reality-distortion fields, at the end of the day, they’re doing so to catapult their business—and to protect it. But when it came to Holmes, it seems there was no business to begin with. The entire house of cards was just that, a figment, nothing real. So what was she trying to get out of all these stories? On this week’s Inside the Hive podcast, I sat down with Carreyrou to try to understand how Holmes acted with such deceit, knowing full well that the technology she was selling, technology that was used to perform more than 8 million blood tests, according to Carreyrou, was putting people’s lives in danger. The obvious question, when you see someone act that way, with such utter disregard for how her actions would destroy other people’s lives, is: is she a sociopath?
“At the end of my book, I say that a sociopath is described as someone with no conscience. I think she absolutely has sociopathic tendencies. One of those tendencies is pathological lying. I believe this is a woman who started telling small lies soon after she dropped out of Stanford, when she founded her company, and the lies became bigger and bigger,” Carreyrou said. “I think she’s someone that got used to telling lies so often, and the lies got so much bigger, that eventually the line between the lies and reality blurred for her.”
When I asked if she feels guilty for all the people’s lives who were affected by those lies, including the investors who lost money, the nearly 1,000 employees who lost their jobs, and the patients who were given completely inaccurate blood results, Carreyrou’s response surprised—shocked?—me. “She has shown zero sign of feeling bad, or expressing sorrow, or admitting wrongdoing, or saying sorry to the patients whose lives she endangered,” he said. He explained that in her mind, according to numerous former Theranos employees he has spoken to, Holmes believes that her entourage of employees led her astray and that the bad guy is actually John Carreyrou. “One person in particular, who left the company recently, says that she has a deeply engrained sense of martyrdom. She sees herself as sort of a Joan of Arc who is being persecuted,” he said.
Believe it or not, that’s not the most astonishing thing in the Elizabeth Holmes story. According to Carreyrou, Holmes is currently waltzing around Silicon Valley, meeting with investors, hoping to raise money for an entirely new start-up idea. (My mouth dropped when I heard that, too.) As the dust settles in the Theranos saga, it’s clear that the original investors in Theranos were gullible enough to hand over almost a billion dollars in funding, partially because, when it comes to Silicon Valley, there’s always a sucker hoping to get rich quick. (...)
The Theranos story isn’t over just yet. Holmes recently settled with the S.E.C. over charges of “massive fraud”; as part of the agreement, she is not required to admit wrongdoing, but she has been forced to surrender voting control of Theranos and accept a 10-year ban on serving as a director or officer of any public company (Theranos, ironically, wasn’t public). Holmes also agreed to return 18.9 million shares of stock, once worth almost $5 billion and now worth nothing, and to pay a small $500,000 penalty. Of course, there is still a major criminal investigation underway by the F.B.I., one that could end with Holmes behind bars. But not to worry: Holmes has plenty of quotes she can borrow from Joan of Arc if she stands trial. “I am not afraid . . . I was born to do this.”
by Nick Bilton, Vanity Fair | Read more:
Image: Jeff Chiu/AP
How LinkedIn Turned This “Failmom” Into a Socialist
One warm spring evening, after my teenage daughter and I had spent two hours browsing job boards, the two of us sat on a bench along the lakefront path on Lake Michigan, watching sweating commuters bike, jog or walk past us. Most of them wore headphones or earbuds.
“Let’s try to guess what they’re listening to,” I suggested.
A shirtless young man ran by.
“Bruno Mars,” my daughter said.
A similar looking guy rode by on a bike.
“Maroon Five,” I said.
A red-faced woman who looked sort of like me, with loose flesh on her upper arms and an expanding menopausal waist, marched past.
“Chapo Trap House,” said my daughter.
I laughed.
A huffing middle-aged woman isn’t the first person who comes to mind when one thinks of Chapo Trap House, a raunchy politics and comedy podcast that lambasts the Trump administration, conservative media, liberal media and most of the rest of American culture.
Early in Chapo Trap House’s existence, the New Yorker profiled the show and the voices behind it. Will Menaker, one of the five hosts, described their typical listener as a “failson.” Co-host Felix Biederman went on to define a failson as the guy that “goes downstairs at Thanksgiving, briefly mumbles, ‘Hi,’ everyone asks him how community college is going, he mumbles something about a 2.0 average, goes back upstairs with a loaf of bread and peanut butter.” His definition went on to mention gaming and masturbating.
The first part of that description could almost be me, an unemployed fiftysomething — or what, in Chapo parlance, might be called a “failmom.”
Of course, I don’t sit in my room eating peanut butter on Thanksgiving, but most other days I can. When my family goes off to work or school in the mornings, I spend a little time sending out résumés that disappear into an ether that has no use for middle-age women. Sometimes I look for gig economy work: walking dogs, when I can get the work, through Rover.com, or the occasional tutoring gig on Wyzant. And then I’m free to mumble and eat peanut butter.
In the same New Yorker profile, Matt Christman, my favorite Chapo host, saw the show and their audience as constituting a population of young people, mostly men, who are “nonessential human beings, who do not fit into the market as consumers or producers or as laborers.”
Yet it’s not just young people who are nonessential.
When I was employed as a copy editor, I thought more about comma placement and modifier placement than I did about economic and political displacement. I had faith in the establishment. Then I got laid off. Twice. There’s nothing like a years-long job search to make a person feel nonessential.
Applying for a job now is different than it used to be, when I could send email directly to the hiring manager or HR person. It’s hard to circumvent online applications, which can take an hour or more to fill out, including addresses of businesses (this requires searching for print publications that have moved as they’ve downsized) and names of supervisors, even though those supervisors have moved on, either into retirement or to more prestigious positions. The forms demand text in fields, whether there’s an answer or not. For some older job searchers, drop-down menus don’t include their years of employment or graduation. Illinois Attorney General Lisa Madigan’s office investigated recruitment sites Monster.com, Indeed.com, CareerBuilder and other aggregators after finding that dates didn’t go back far enough for older applicants. ProPublica and the New York Times, while investigating Facebook ads and their effects on the 2016 election, discovered that employers like Amazon, Goldman Sachs, Target and Facebook targeted recruitment ads to users under a specific age. The age varied by employer, but generally stopped short of 45 or 50. As part of their research they also placed ads on LinkedIn and Google that excluded audiences over 40, and the ads were approved instantly.
What makes this even more insidious is the fact that one in five Americans can’t afford to retire. The Washington Post got a lot of traction last year with a profile of the growing population of formerly middle-class older adults who travel in RVs for seasonal jobs. (...)
The May jobs report showed an unemployment rate at an 18-year low, exciting economists and (employed) news consumers. This looks excellent on paper. But off the page are my failpeople, who have given up on searching for work or are earning what they can by cleaning or running errands via sites like TaskRabbit; delivering food via sites like Postmates; selling stuff on Craigslist or eBay; renting out rooms on Airbnb; or driving strangers around town. The Federal Reserve’s Report on the Economic Well-Being of U.S. Households, from May 2018, found that 31 percent of working adults work in the gig economy, and two-fifths of those people are doing said gig-work to supplement income from their paid jobs.
Economist Theresa Ghilarducci tracks unemployment among older adults, particularly women. In an interview with PBS Newshour in 2016 after the publication of her book “How to Retire With Enough Money,” Ghilarducci explained that women’s lives are often punctuated by time outside of the labor market because they care for family members — not just children, but aging parents as well. She describes a typical male hiring manager who sees an older female applicant. “He’s thinking about his partner, who he probably loves very much, but whose work he probably devalues, and he’s thinking about this job applicant that doesn’t have the experience he can recognize. And we all live, including this employer, in a patriarchal society, and the very definition of patriarchy is that women’s lives, women’s skills, what women are offering up, their potential economic value, is all devalued.” (...)
When I listen to Chapo Trap House, I know there’s a mirror image of me — a cranky person over 50 nodding in agreement at a polemic on-air voice — on the opposite side of the political spectrum. This feeling of being devalued or nonessential is what drove those opposites to vote for Trump. I might have done the same, because he talked so much about jobs and the forgotten people, but I couldn’t stomach his repeated statements about Mexicans and Muslims being criminals and terrorists. I couldn’t stomach the way he cheered on violence at his rallies, his disdain for environmental science, women, the mentally ill, and the fact-checking newsrooms where my still-employed friends work.
It’s no longer surprising for my husband to come home from work and find me standing in the kitchen, chopping vegetables, nodding as the Chapo Trap House hosts talk about the failings of capitalism and the government it buys. He told a mutual friend that I’d become radicalized. My views have changed enough that I went on a February afternoon to a Communist Manifesto class held by the Party for Socialism and Liberation. I sat with about 30 people of all ages at long conference tables in a meeting room in a building that also houses the Mexico–U.S. Solidarity Network.
We took turns reading passages. We didn’t make it through the whole thing because there were so many pauses for discussion. What I learned is that communism is a practice; it’s not static.
To add to my radical bona fides, my son made a collage of my face, added to the famous image of Lenin, Engels, Marx, Mao and Stalin, for Mother’s Day.
Around the same time as this class, Chapo Trap House interviewed the economist Richard Wolff. They started out examining the film “Boss Baby” and its political leanings. This, of course, led to a discussion about America’s economy, and the fact that polls show that millennials prefer socialism to capitalism, perhaps because the market crash of 2008 happened during their formative years. When I hear or read about millennials and their love for socialism, I wonder why more people my age don’t embrace it. We lived through the Great Recession too. We were the ones laid off in 2008.
Wolff predicts a shift to socialism in America, because young people will push for it. A lot of mistakes were made in the name of socialism and communism, he says. We have to learn from history.
“I’d like to remind people the transition from feudalism to capitalism didn’t happen in some smooth way,” he says. “Capitalism came into the world after lots of fits and starts and trials and errors. Why do we imagine it will be any different going from capitalism to socialism?”
by Lori Barrett, Salon | Read more:
Image: Getty/PeopleImages
Net Neutrality is Officially Dead Today. Now What?
The Obama-era net neutrality rules, passed in 2015, are defunct. This time it's for real.
Though some minor elements of the proposal by the Republican-led FCC to roll back those net neutrality rules went into effect last month, most aspects still required approval from the Office of Management and Budget. That's now been taken care of, with the Federal Communications Commission declaring June 11 as the date the proposal takes effect.
While many people agree with the basic principles of net neutrality, the specific rules enforcing the idea have been a lightning rod for controversy. That's because, to get the rules to hold up in court, an earlier, Democrat-led FCC had reclassified broadband networks so that they fell under the same strict regulations that govern telephone networks.
FCC Chairman Ajit Pai has called the Obama-era rules "heavy-handed" and "a mistake," and he's argued that they deterred innovation and depressed investment in building and expanding broadband networks. (Read his op-ed on CNET here.) To set things right, he says, he's taking the FCC back to a "light touch" approach to regulation, a move that Republicans and internet service providers have applauded.
But supporters of net neutrality -- such as big tech companies like Google and Facebook, as well as consumer groups and pioneers of the internet like World Wide Web creator Tim Berners-Lee -- say the internet as we know it may not exist without these protections.
"We need a referee on the field who can throw a flag," former FCC Chairman and Obama appointee Tom Wheeler said at MIT during a panel discussion in support of rules like those he championed. Wheeler was chairman when the rules passed three years ago.
If you still don't feel like you understand what all the hubbub is about, have no fear. We've assembled this FAQ to put everything in plain English.
What's net neutrality again?
Net neutrality is the principle that all traffic on the internet should be treated equally, regardless of whether you're checking Facebook, posting pictures to Instagram or streaming movies from Netflix or Amazon. It also means companies like AT&T, which is trying to buy Time Warner, or Comcast, which owns NBC Universal, can't favor their own content over a competitor's.
So what's happening?
The FCC, led by Ajit Pai, voted on Dec. 14 to repeal the 2015 net neutrality regulations, which prohibited broadband providers from blocking or slowing down traffic and banned them from offering so-called fast lanes to companies willing to pay extra to reach consumers more quickly than competitors.
Under the 2015 rules, the FCC had reclassified broadband as a utility, which gave it the authority to regulate broadband infrastructure much as it did the old telephone network.
The most significant change resulting from the proposal is the stripping away of the FCC's authority to regulate broadband and the shifting of that responsibility to the Federal Trade Commission.
Does this mean no one will be policing the internet?
The FTC will be the new cop on the beat. It can take action against companies that violate contracts with consumers or that participate in anticompetitive and fraudulent activity.
So what's the big deal? Is the FTC equipped to make sure broadband companies don't harm consumers?
The FTC already oversees consumer protection and competition for the whole economy. But this also means the agency is swamped. And because the FTC isn't focused exclusively on the telecommunications sector, it's unlikely the agency can deliver the same kind of scrutiny the FCC would.
More importantly, the FTC also lacks the FCC's rule-making authority. This means FTC enforcement extends only to companies' voluntary public commitments or to violations of antitrust law. Unless broadband and wireless carriers commit in writing to basic net neutrality principles, the FTC can only enforce antitrust issues, which must meet a high legal standard.
Also, any action the FTC takes happens after the fact. And investigations of wrongdoing can take years. (...)
What's it all mean for me?
This is a huge change in policy at the FCC and it could affect how you experience the internet. Keep in mind, your experience isn't likely to change right away.
But over time, it could change significantly. Whether you think that change will be for the better or the worse depends on whom you believe.
Pai and many other Republicans say freeing up broadband providers from onerous and outdated regulation will let them invest more in their networks. They're hopeful this will lead to more expansion in rural and hard-to-service areas of the country, as well as higher-speed service throughout the US. The agency's argument for repealing the rules is that investment started to decline in 2015 after the rules were adopted.
But Democrats like Sen. Ed Markey of Massachusetts, consumer advocacy groups, civil rights organizations and technology companies like Google and Mozilla say that repealing the 2015 rules and stripping the FCC of its authority will lead to broadband companies controlling more of your internet experience.
As companies like AT&T, Verizon and Comcast acquire more online content like video, they could give their own services priority on their networks, squeezing out competitors and limiting what you can access. This might mean fewer startups get a shot at becoming the next Facebook, Netflix or YouTube. Ultimately, it could lead to your internet experience looking more like cable TV, where all the content is curated by your provider.
Some critics also fear this control could lead to higher prices. And groups such as the American Civil Liberties Union say it could affect your First Amendment right to free speech as big companies control more of what you experience online.
"Internet rights are civil rights," said Jay Stanley, an ACLU senior policy analyst. "Gutting net neutrality will have a devastating effect on free speech online. Without it, gateway corporations like Comcast, Verizon and AT&T will have too much power to mess with the free flow of information."
by Marguerite Reardon, CNET | Read more:
Image: via