Tuesday, May 15, 2018
Cosmic Crisp: Washington’s New Apple Could Be a Game-Changer
Bruce Barritt, Ph.D., is running around the apple orchard with his camera snapping pictures. He resembles a kind of horticultural Pete Carroll, coach of the Seattle Seahawks. He’s energetic, articulate and has a bounce for a guy in his 70s.
It is spring planting season and what’s taking place here in the hills east of Wenatchee is the elaborate choreography of putting in a new orchard. Multiple tractors are going back and forth opening rows of soil while workers drop small, twig-like trees into the furrows. Other workers follow behind covering the rootstock and trimming each tree as it’s planted. Hundreds of trees are planted in minutes. Watering systems and trellises follow.
It’s not uncommon to have a camera when an important birth is taking place, and make no mistake, this planting season is part of an elaborate gestation of a new apple variety that is designed to change the industry and consumer tastes. Barritt, emeritus professor of apple breeding at Washington State University, is the proud papa.
He has been working on this new apple for more than 20 years — since the mid-1990s — and now his dream is literally coming to fruition. “My kids don’t like me to say this but these are like my kids,” he says, gesturing at the trees.
The patented name of the new apple is WA 38, but you will know it as the Cosmic Crisp. It is part of a huge bet the Washington apple industry is making to create a new variety that will supplant many of the old familiars, like the iconic Red Delicious. The Northwest, led by Washington, provides about two-thirds of America’s fresh apples and nearly 75 percent of all U.S. apples, including those used for juice. The state’s apples also sell around the world. With funding from state growers and led by Barritt, WSU researchers have invented a new variety that, they believe, will change the face of the industry and win enthusiasm among the public with a combination of taste, texture and usability.
Just over 600,000 Cosmic Crisp trees were in the ground in 2017, with some 7 million more being planted this year and another 6 million next year — a pace faster than expected. The new apple will be available to consumers in the fall of 2019 — it takes about two years for a new tree to bear fruit.
Barritt says growers will invest some $500 million planting the Cosmic Crisp over the next few years. Kathryn Grandy, director of marketing and operations for Proprietary Variety Management (PVM), a Yakima company tasked with introducing the apple to consumers, tells me it’s “the largest launch of a produce item” ever in the U.S.
According to PVM, Cosmic Crisp will begin to replace Galas, Fujis, Cameos, Braeburns and other varieties, including the Red and Golden Delicious.
Casey Corr, who just retired as managing editor of the industry publication Good Fruit Grower, says the apple has to be an instant success when it hits the supermarket. “It’s gotta be like the new iPhone,” Corr says.
The Cosmic Crisp is big, mostly red and very juicy. Barritt says from the beginning the breeding program was designed with the consumer in mind. The apple market has changed over the years. Once-staple varieties like the Red and Golden Delicious were problematic — short shelf lives, bland flavors — and they’ve lost some popularity (sales peaked in 1994). Those varieties still sell, and in some overseas markets like Japan, where tastes run to the familiar, the Red Delicious is still regarded as the ideal of what an apple should be, mostly due to its iconic shape and deep red color. Personally, though, I have never liked it. Other varieties have more flavor, better texture and are easier to grow. For those reasons some believe the Delicious is “obsolete.” (...)
The Cosmic Crisp has a number of advantages. It is slow to turn brown when cut. I had half of one in the car for six hours and it hadn’t even started to turn brown when I got it home. It keeps longer after harvest. Picked in September, the Cosmic Crisp can last a year in cold storage, extending its lifespan and reducing waste. It’s a 365-day-a-year apple designed to thrive in Eastern Washington’s apple-friendly soils and climate, unlike varieties brought from overseas or the East Coast.
Barritt says that while benefits for growers are important, it’s taste that will make or break the variety. To that end, I visited WSU’s Tree Fruit Research and Extension Center in Wenatchee where I had a chance to discuss the Cosmic Crisp with Barritt and his successor overseeing research, Kate Evans, Ph.D.
We went down into a basement lab where vials filled with fluid from various Cosmic Crisps were being tested for acidity, which “provides the character of the apple,” says Barritt. Acidity plays a key role in how any apple tastes, and learning to get the proper balance under differing growing conditions is important. Evans continues to conduct research on test trees in order to compile a grower’s manual for how to produce the optimum Cosmic Crisps.
The researchers take a batch of Cosmic Crisp apples out of the box. Barritt and Evans give some instruction on how to taste an apple. “Taste,” it turns out, is not just on the tongue. How does an apple sound when you bite into it? Does it crunch? Does the bite snap off in your mouth? What’s the texture like — smooth or mealy? Is the skin too thick? Is it juicy or dry? Taste involves all the senses before you even get to sweet or sour, the blend of flavors that make up an apple.
The WA 38 designation means it was the WSU team’s 38th attempt to get a new variety. Coming up with the perfect apple takes time. I was fully prepared to be disappointed — the industry hype and catering to mass tastes made me a little suspicious. While it’s not a GMO apple like the Arctic, you’re still talking about something created by scientists and commercial growers who are planting cloned trees.
But the Cosmic Crisp ticked every box: good looking, with a nice crunch and powerful snap, a beautiful sweet-tart balance, tons of juice trickling down the chin. I wasn’t overwhelmed by, say, hints of blueberry or a floral nose — the kinds of complexities wine tasters go on about. But it was one of the best apples I’ve ever eaten. In fact, my sample was the essence of apple.
by Knute Berger & Eric Keto, Crosscut | Read more:
Image: Karen Ducey
Has Wine Gone Bad?
But among wine critics, there is a deep suspicion that the natural wine movement is intent on tearing down the norms and hierarchies that they have dedicated their lives to upholding. The haziness of what actually counts as natural wine is particularly maddening to such traditionalists. “There is no legal definition of natural wine,” Michel Bettane, one of France’s most influential wine critics, told me. “It exists because it proclaims itself so. It is a fantasy of marginal producers.” Robert Parker, perhaps the world’s most powerful wine critic, has called natural wine an “undefined scam”.
For natural wine enthusiasts, though, the lack of strict rules is part of its appeal. At a recent natural wine fair in London, I encountered winemakers who farmed by the phases of the moon and didn’t own computers; one man foraged his grapes from wild vines in the mountains of Georgia; there was a couple who were reviving an old Spanish technique of placing the wine in great clear glass demijohns outside to capture sunlight; others were ageing their wines in handmade clay pots, buried underground to keep them cool as their predecessors did in the days of ancient Rome. (...)
At first glance, the idea that wine should be more natural seems absurd. Wine’s own iconography, right down to the labels, suggests a placid world of rolling green hills, village harvests and vintners shuffling down to the cellar to check in on the mysterious process of fermentation. The grapes arrive in your glass transformed, but relatively unmolested.
Yet, as natural wine advocates point out, the way most wine is produced today looks nothing like this picture-postcard vision. Vineyards are soaked with pesticide and fertiliser to protect the grapes, which are a notoriously fragile crop. In 2000, a French government report noted that vineyards used 3% of all agricultural land, but 20% of the total pesticides. In 2013, a study found traces of pesticides in 90% of wines available at French supermarkets.
In response to this, a small but growing number of vineyards have introduced organic farming. But what happens once the grapes have been harvested is less scrutinised, and, to natural wine enthusiasts, scarcely less horrifying. The modern winemaker has access to a vast armamentarium of interventions, from supercharged lab-grown yeast, to antimicrobials, antioxidants, acidity regulators and filtering gelatins, all the way up to industrial machines. Wine is regularly passed through electrical fields to prevent calcium and potassium crystals from forming, injected with various gases to aerate or protect it, or split into its constituent liquids by reverse osmosis and reconstituted with a more pleasing alcohol to juice ratio.
Natural winemakers believe that none of this is necessary. The basics of winemaking are, in fact, almost stupefyingly simple: all it involves is crushing together some ripe grapes. When the yeasts that live on the skin of the grape come into contact with the sweet juice inside, they begin gorging themselves on the sugars, releasing bubbles of carbon dioxide into the air and secreting alcohol into the mixture. This continues either until there is no more sugar, or the yeasts make the surrounding environment so alcoholic that even they cannot live in it. At this point, strictly speaking, you have wine. In the millennia since humans first undertook this process, winemaking has become a highly technical art, but the fundamental alchemy is unchanged. Fermentation is the indivisible step. Whatever precedes it is grape juice, and whatever follows it is wine.
“The yeasts are the key between the vines and the people,” Pacalet told me, in a reverent tone. “You use the living system to express the information in the soil. If you use industrial techniques, even if it’s a small operation, you’re making an industrial product.” Viewed in this quasi-spiritual way, the winemaker’s job is to grow healthy grapes, tend to the fermentation, and intervene as little as possible.
In practice, this means going without the methods that have given modern winemakers so much control over their product. Even more radically, it means jettisoning the expectations of mainstream wine culture, which dictates that wine from a certain place should always taste a certain way, and that a winemaker works like a conductor, intervening to turn up or tamp down the various elements of the wine until it plays the tune the audience expects. “It is important a sancerre tastes like a sancerre, then we can start to determine levels of quality,” says Ronan Sayburn, the head of wine at the private wine club and bar 67 Pall Mall.
In France, which remains the cultural and commercial centre of the wine world, the acceptable styles of winemaking aren’t just a matter of history and convention; they are codified into law. For a wine to be labelled as from a particular region, it must adhere to strict guidelines about which grapes and production techniques can be used, and how the resulting wine should taste. This system of certification – the appellation d’origine contrôlée (AOC), or “protected designation of origin” – is enforced by inspectors and blind-tasting panels. Wines that fail to conform to these standards are labelled “vin de France”, a generic designation that suggests low quality and makes them less attractive to buyers.
Some natural winemakers have rebelled against this legislation, which they believe only reinforces the dominant styles and methods that are ruining wine altogether. In 2003, the natural winemaker Olivier Cousin opted out of his local AOC, complaining in a letter that meeting their standards meant that “one must beat the grapes with machines, add sulphites, enzymes and yeast, sterilise and filter”. When he refused to stop describing his wine as being from Anjou, he was actually prosecuted for labelling violations. In response, Cousin put on a good show, riding his draft horse up to the courtroom steps and bringing a barrel of his offending wine to share with passers-by. But he ended up changing the labels.
“The AOC are liars,” Olivier’s son Baptiste, who has taken over several of his father’s vineyards, told me. “The local designations were created to protect small producers, but now they just enforce poor quality.”
The expectations of how a wine from a certain region should taste go back hundreds of years, but the global industry that has been built atop them is largely a product of the past century. If natural wine is a backlash against anything, it is the idea that it is possible to square traditional methods of winemaking with the scale and demands of that market. There is a sense that alongside economic success, globalisation has slowly forced the wine world toward a dull, crowd-pleasing conformity.
France has long been the centre of the wine world, but until the mid-20th century most vineyards were small and worked mainly by hand. In the eyes of natural winemakers, the rot began in the decades after the second world war, as French vineyards modernised and the industry grew into a global economic behemoth. To these disillusioned observers, what seems like a story of technical and economic triumph is really the tragic tale of how wine lost its way.
by Stephen Buranyi, The Guardian | Read more:
Image: uncredited
North Korean Tunnels
North Korea has long depended on tunnel technology. Tunnels hide some of the country’s biggest secrets. Between 1974 and 1990, four tunnels were discovered running from the North under the D.M.Z. deep into South Korea. There may be dozens more still undetected, South Korean officials told me. Pyongyang dug the tunnels through bedrock and later equipped them with lights and ventilation, to infiltrate troops into the South in the event of war. I visited the so-called Third Tunnel of Aggression earlier this month. It came within thirty miles of Seoul. It was large enough for thirty thousand troops to pass through in an hour. It was detected, on a tip from a North Korean defector, in 1978.
Visitors can now tour the tunnel after clearing a South Korean military checkpoint into the D.M.Z. You put on a hard hat and take a little tram down a steep slope, two hundred and forty feet into the earth. A list of instructions advises, in English and Korean, “Do not enter the tunnel drunk” and “People with respiratory and heart problems should not participate in this tour.” Claustrophobia, too. Access to the tunnel ends at a concrete slab installed by the South to demarcate the border. (...)
The challenge will be what North Korea actually surrenders. Kim will have to confess the location and details of hundreds, maybe even thousands, of tunnels on the other side of the D.M.Z. which hide his military treasures. “Some think North Korea has built ten thousand underground facilities since the nineteen-sixties,” the retired lieutenant-general In-Bum Chun, a former director of operational planning for the South Korean Joint Chiefs of Staff, told me. The facilities reportedly include troop bunkers along the D.M.Z.; facilities for up to five thousand metric tons of chemical weapons, one of the world’s largest stockpiles; underground hangars; and three underground runways to allow tunnel takeoffs by military aircraft.
[ed. Underground runways! See also: North Korea expands threat to cancel Trump-Kim summit, saying it won’t be pushed to abandon its nukes. They hate Bolton (for good reason).]
by Robin Wright, New Yorker | Read more:
Image: Olivier Mirguet / Agence VU / Redux
Outrageous Medical Bills? Get Them Itemized First
If you happened to be listening to NPR this morning, you might have heard this story on outrageous medical bills sandwiched between reports of the new U.S. embassy in Israel and dozens of dead Palestinians (welcome to Infrastructure Week!). For months, NPR has been working with Kaiser Health News to collect stories of medical bills gone wrong, and today, they brought us the tale of Sherry Young, a retired librarian who underwent two relatively minor surgeries in one day, one for a shoulder injury and one for a bone spur in her foot. In total, she was in the hospital for three days.
Her bill? Over $115,000.
Young had insurance, but because her hospital stay wasn't pre-approved, she was on the hook for the whole thing, which was more than her home was worth and over five times her annual income. Young did the smart thing: She asked for an itemized bill, and what she found was shocking. According to the bill, Young was being charged $15,076 for four tiny screws made by a company called Arthrex that were placed in her foot.
"Unless the metal [was] mined on an asteroid, I do not know why it should cost that amount," Young told NPR. She tried to get to the bottom of it herself, but the University of Oklahoma, where she'd had her surgery, refused to tell her how much the screws and other things had cost the hospital, so Young did another smart thing: She got in touch with NPR, and NPR was able to get more answers than the patient herself.
John Schmieding, senior vice president and general counsel for Arthrex, declined to tell reporters exactly how much his company charges hospitals for those screws, but he did say that screws generally range from $300 per screw to $1,000 per screw. Young's bill works out to $3,769 per screw ($15,076 divided by four), which means the hospital marked them up anywhere from roughly 275 to 1,150 percent. Those screws, according to an expert NPR interviewed, likely cost around $30 to manufacture.
And that was just the beginning of it. Young was also charged $4,265 for a drill bit, $5,047 for a tool that removes and cauterizes tissue, and $619 for a saw blade. And those tools are (or at least should be) reusable.
Young got lucky. When reporters started looking into her story, BlueCross BlueShield of Oklahoma said the whole thing was a mistake. But if she hadn't been insured—and if she hadn't contacted the media—she would have been responsible for those $115,000 in hospital fees, the sort of money that makes people go broke. And she's hardly alone: There are endless accounts of outrageous hospital fees, from $3,000 for a 15-minute consult and no treatment to $1,420 for two hours of babysitting to $441 for one liter of salt water. In general, insurance companies negotiate with hospitals over prices, and patients don't ever see the true costs of their care. But of course, without insurance and with no negotiating power, the uninsured are often out of luck. Those $15,000 screws are coming out of your pocket.
NPR and Kaiser are going to continue to report on these issues, so, if you have an outrageous medical bill, upload it here.
by Katie Herzog, The Stranger | Read more:
Image: Getty
[ed. Stories like this are so common these days it's easy to get desensitized (except for the people experiencing them!). Do note the link at the end of this article: (Share Your Medical Bill With Us). Sometimes public shaming is the only effective method for dealing with the hospital/insurance industrial complex.]
'Tax Amazon': Seattle Passes Plan for Corporate Wealth Tax to Fund Housing
A parade of hardhat union workers and threats from hometown-behemoth Amazon did not stop Seattle leaders from passing on Monday a “head tax” meant to fund housing projects and homeless services.
A watered-down version of the tax, which will charge the city’s largest employers $275 per worker annually, is now expected to be enacted by Seattle’s mayor, Jenny Durkan. The tax is projected to generate about $48m a year to address a housing crisis spurred on by Amazon’s rapid growth.
A broader tax proposal prompted the tech company to halt construction on one Seattle office tower and put off a lease of another tower. Union construction workers marched on city hall to protest the tax, which also drew opposition from business interests.
Socialists and self-styled members of the “Seattle silent majority” squared off prior to Monday’s vote. Neither side supported the compromise, and most speakers blamed city leaders for an escalating homelessness crisis that has seen city sidewalks, parks and roadsides packed with tents and shacks.
About 60% of the tax revenue will go to new housing projects for low- and middle-income Seattle residents. The remainder will go to homeless services, including shelter beds, camps and overnight parking.
On Friday, city council members approved a proposal to charge large employers in the city $500 per employee. Following a veto threat from Durkan, the council decreased the total charge and included a five-year sunset provision over the objections of supporters of the original legislation.
“Do not capitulate to [Amazon CEO Jeff] Bezos’ bullying,” Emily McArthur, an organizer with Socialist Alternative, demanded of the council. “Tax Amazon. Be leaders.”
Amazon has driven Seattle’s economy in recent years, drawing thousands of well-paid workers to the region. The “Bezos Boom” has proved a mixed blessing, though, as middle-income residents have been priced out of Seattle. The city council president, Bruce Harrell, spoke to a growing “fear of what this city is becoming”.
The move by Amazon to create HQ2 – a second headquarters elsewhere – has stoked fears that Seattle’s liberal politics will turn off the company. Threats from Amazon that it will halt growth in Seattle in favor of other offices lend credence to those concerns.
In a statement issued Monday, Amazon vice president Drew Herdener said the company would resume construction on the downtown tower but was considering whether Seattle is the place for it to grow.
“We remain very apprehensive about the future created by the council’s hostile approach and rhetoric toward larger businesses, which forces us to question our growth here,” Herdener said.
by Levi Pulkkinen, The Guardian | Read more:
Image: Elaine Thompson/AP
[ed. Too Big to Tax? Apparently not, and that's a good thing (although the current Seattle City Council does seem a little nuts in other ways). See also: Amazon threatens to move jobs out of Seattle over new tax and Seattle returns to Wells Fargo because no other bank wants city’s business]
The Kandy-Kolored Tangerine-Flake Streamline Baby
Introduction:
I don't mean for this to sound like "I had a vision" or anything, but there was a specific starting point for practically all of these stories. I wrote them in a fifteen-month period, and the whole thing started with the afternoon I went to a Hot Rod & Custom Car show at the Coliseum in New York. Strange afternoon! I was sent up there to cover the Hot Rod & Custom Car show by the New York Herald Tribune, and I brought back exactly the kind of story any of the somnambulistic totem newspapers in America would have come up with. A totem newspaper is the kind people don't really buy to read but just to have, physically, because they know it supports their own outlook on life. They're just like the buffalo tongues the Omaha Indians used to carry around or the dog ears the Mahili clan carried around in Bengal. There are two kinds of totem newspapers in the country. One is the symbol of the frightened chair-arm-doilie Vicks Vapo-Rub Weltanschauung that lies there in the solar plexus of all good gray burghers. All those nice stories on the first page of the second section about eighty-seven-year-old ladies on Gramercy Park who have one-hundred-and-two-year-old turtles or about the colorful street vendors of Havana. Mommy! This fellow Castro is in there, and revolutions may come and go, but the picturesque poor will endure, padding around in the streets selling their chestnuts and salt pretzels the world over, even in Havana, Cuba, assuring a paradise, after all, full of respect and obeisance, for all us Vicks Vapo-Rub chair-arm-doilie burghers. After all. Or another totem group buys the kind of paper they can put under their arms and have the totem for the tough-but-wholesome outlook, the Mom's Pie view of life. Everybody can go off to the bar and drink a few "brews" and retail some cynical remarks about Zora Folley and how the fight game is these days and round it off, though, with how George Chuvalo has "a lot of heart," which he got, one understands, by eating mom's pie. Anyway, I went to the Hot Rod & Custom Car show and wrote a story that would have suited any of the totem newspapers. All the totem newspapers would regard one of these shows as a sideshow, a panopticon, for creeps and kooks; not even wealthy, eccentric creeps and kooks, which would be all right, but lower class creeps and nutballs with dermatitic skin and ratty hair. The totem story usually makes what is known as "gentle fun" of this, which is a way of saying, don't worry, these people are nothing.
So I wrote a story about a kid who had built a golden motorcycle, which he called "The Golden Alligator." The seat was made of some kind of gold-painted leather that kept going back, on and on, as long as an alligator's tail, and had scales embossed on it, like an alligator's. The kid had made a whole golden suit for himself, like a space suit, that also looked as if it were covered with scales and he would lie down on his stomach on this long seat, stretched out full length, so that he appeared to be made into the motorcycle or something, and roar around Greenwich Village on Saturday nights, down Macdougal Street, down there in Nut Heaven, looking like a golden alligator on wheels. Nutty! He seemed like a Gentle Nut when I got through. It was a shame I wrote that sort of story, the usual totem story, because I was working for the Herald Tribune, and the Herald Tribune was the only experimental paper in town, breaking out of the totem formula. The thing was, I knew I had another story all the time, a bona fide story, the real story of the Hot Rod & Custom Car show, but I didn't know what to do with it. It was outside the system of ideas I was used to working with, even though I had been through the whole Ph.D. route at Yale, in American Studies and everything.
Here were all these . . . weird . . . nutty-looking, crazy baroque custom cars, sitting in little nests of pink angora angel's hair for the purpose of "glamorous" display—but then I got to talking to one of the men who make them, a fellow named Dale Alexander. He was a very serious and soft-spoken man, about thirty, completely serious about the whole thing, in fact, and pretty soon it became clear, as I talked to this man for a while, that he had been living like the complete artist for years. He had starved, suffered—the whole thing—so he could sit inside a garage and create these cars which more than 99 per cent of the American people would consider ridiculous, vulgar and lower-class-awful beyond comment almost. He had started off with a garage that fixed banged-up cars and everything, to pay the rent, but gradually he couldn't stand it any more. Creativity—his own custom car art—became an obsession with him. So he became the complete custom car artist. And he said he wasn't the only one. All the great custom car designers had gone through it. It was the only way. Holy beasts! Starving artists! Inspiration! Only instead of garrets, they had these garages.
So I went over to Esquire magazine after a while and talked to them about this phenomenon, and they sent me out to California to take a look at the custom car world. Dale Alexander was from Detroit or some place, but the real center of the thing was in California, around Los Angeles. I started talking to a lot of these people, like George Barris and Ed Roth, and seeing what they were doing, and—well, eventually it became the story from which the title of this book was taken, "The Kandy-Kolored Tangerine-Flake Streamline Baby." But at first I couldn't even write the story. I came back to New York and just sat around worrying over the thing. I had a lot of trouble analyzing exactly what I had on my hands. By this time Esquire practically had a gun at my head because they had a two-page-wide color picture for the story locked into the printing presses and no story. Finally, I told Byron Dobell, the managing editor at Esquire, that I couldn't pull the thing together. O.K., he tells me, just type out my notes and send them over and he will get somebody else to write it. So about 8 o'clock that night I started typing the notes out in the form of a memorandum that began, "Dear Byron." I started typing away, starting right with the first time I saw any custom cars in California. I just started recording it all, and inside of a couple of hours, typing along like a madman, I could tell that something was beginning to happen. By midnight this memorandum to Byron was twenty pages long and I was still typing like a maniac. About 2 A.M. or something like that I turned on WABC, a radio station that plays rock and roll music all night long, and got a little more manic. I wrapped up the memorandum about 6:15 A.M., and by this time it was 49 pages long. I took it over to Esquire as soon as they opened up, about 9:30 A.M. About 4 P.M. I got a call from Byron Dobell. He told me they were striking out the "Dear Byron" at the top of the memorandum and running the rest of it in the magazine. That was the story, "The Kandy-Kolored Tangerine-Flake Streamline Baby."
by Tom Wolfe, TomWolfe.com | Read more:
Monday, May 14, 2018
I Don’t Know How to Waste Time on the Internet Anymore
The other day, I found myself looking at a blinking cursor in a blank address bar in a new tab of my web browser. I was bored. I didn’t really feel like doing work, but I felt some distant compulsion to sit at my computer in a kind of work-simulacrum, so that at least at the end of the day I would feel gross and tired in the manner of someone who had worked. What I really wanted to do was waste some time.
But … I didn’t know how. I did not know what to type into the address bar of my browser. I stared at the cursor. Eventually, I typed “nytimes.com” and hit enter. Like a freaking dad. The entire world of the internet, one that used to boast so many ways to waste time, and here I was, reading the news. It was even worse than working.
In high school, I took a computer class. I have no idea what I was supposed to be learning. Instead I browsed Fark (user-submitted links from around the web, sort of a proto-Reddit) and eBaum’s World (a mix of early memes, stolen content, and ads for hard-core porn), and printed guitar tabs that would turn out to be wildly incorrect. In college, I hung out on forums like Something Awful, a gigantic repository of jokes (some good), advice (mostly bad), and aimless chatter among thousands of also bored teens, experimenting and working within the staccato confines of the Bulletin Board System. There were writers, too; I read Seanbaby and Old Man Murray and other anarchic internet writers, posting irregularly and with zero professionalism on garish websites. Red text on black backgrounds, broken navigations. I wrote a LiveJournal, badly, and read the LiveJournals of my friends and friends of friends. Everyone said too much and said it poorly. It was incredibly entertaining.
Facebook came in my first year of college. Just as eBaum’s World’s videos gave me a welcome excuse to ignore my computer class, albums of strangely similar photos taken on digicams in dimly lit house parties became my preferred time waster. There’s Steve, from high school, in a spectacularly unflattering shot in someone’s dirty living room in a college town in Virginia, lit by a nuclear flash from someone’s Nikon Coolpix. Sick. Hey, what happened to that girl Steve dated? (She’s also in someone’s dirty living room, her eyes neon red, drinking right out of an $8 bottle of wine.)
This world — of blogs and forums and weird personal sites and early, college-era Facebook — was made for dicking around. After college, when I had a real job, with health insurance and a Keurig machine, I would read blogs, funny people talking about nothing in particular with no goal besides being entertaining for a three- to eight-minute block. These were evolutions of the Seanbaby type of writers. Their websites were comparatively elegant, set up for ease of reading. Gawker, Videogum, the Awl, the A.V. Club, Wonkette, various blogs even less commercial than those. There was one that just made fun of Saved by the Bell episodes. I never even watched Saved by the Bell, but I loved that one.
I started a Twitter account, and fell into a world of good, dumb, weird jokes, links to new sites and interesting ideas. It was such an excellent place to waste time that I almost didn’t notice that the blogs and link-sharing sites I’d once spent hours on had become less and less viable. Where once we’d had a rich ecosystem of extremely stupid and funny sites on which we might procrastinate, we now had only Twitter and Facebook.
And then, one day, I think in 2013, Twitter and Facebook were not really very fun anymore. And worse, the fun things they had supplanted were never coming back. Forums were depopulated; blogs were shut down. Twitter, one agent of their death, became completely worthless: a water-drop-torture feed of performative outrage, self-promotion, and discussion of Twitter itself. Facebook had become, well … you’ve been on Facebook.
In the decade since I took that computer class, the web browser has taken over the entire computing experience. There is nothing to “learn” about computers, really, except how to use a browser; everything you might want to do is done from that stupid empty address bar. Today, through that web browser, there are movies and TV shows and every song ever recorded; it’s where I do my writing and chatting and messaging; it’s where my notes and calendars and social networks live. It’s everything except fun.
There is an argument that this is my fault. I followed the wrong people; I am too nostalgic about bad blogs; I am in my 30s and what I used to think was fun time-killing is now deadly. But I don’t think so. What happened is that the internet stopped being something you went to in order to separate from the real world — from your job and your work and your obligations and responsibilities. It’s not the place you seek to waste time, but the place you go to so that you’ll someday have time to waste. The internet is a utility world for me now. It is efficient and all-encompassing. It is not very much fun.
by Dan Nosowitz, Select All | Read more:
Image: uncredited
The Burnout Crisis in American Medicine
During a recent evening on call in the hospital, I was asked to see an elderly woman with a failing kidney. She’d come in feeling weak and short of breath and had been admitted to the cardiology service because it seemed her heart wasn’t working right. Among other tests, she had been scheduled for a heart-imaging procedure the following morning; her doctors were worried that the vessels in her heart might be dangerously narrowed. But then they discovered that one of her kidneys wasn’t working, either. The ureter, a tube that drains urine from the kidney to the bladder, was blocked, and relieving the blockage would require minor surgery. This presented a dilemma. Her planned heart-imaging test would require contrast dye, which could only be given if her kidney function was restored—but surgery with a damaged heart was risky.
I went to the patient’s room, where I found her sitting alone in a reclining chair by the window, hands folded in her lap under a blanket. She smiled faintly when I walked in, but the creasing of her face was the only movement I detected. She didn’t look like someone who could bounce back from even a small misstep in care. The risks of surgery, and by extension the timing of it, would need to be considered carefully.
I called the anesthesiologist in charge of the operating room schedule to ask about availability. If the cardiology department cleared her for surgery, he said, he could fit her in the following morning. I then called the on-call cardiologist to ask whether it would be safe to proceed. He hesitated. “I’m just covering,” he said. “I don’t know her well enough to say one way or the other.” He offered to pass on the question to her regular cardiologist.
A while later, he called back: The regular cardiologist had given her blessing. After some more calls, the preparations were made. My work was done, I thought. But then the phone rang: It was the anesthesiologist, apologetic. “The computer system,” he said. “It’s not letting me book the surgery.” Her appointment for heart imaging, which had been made before her kidney problems were discovered, was still slated for the following morning; the system wouldn’t allow another procedure at the same time. So I called the cardiologist yet again, this time asking him to reschedule the heart study. But doctors weren’t allowed to change the schedule, he told me, and the administrators with access to it wouldn’t be reachable until morning.
I felt deflated. For hours, my attention had been consumed by challenges of coordination rather than actual patient care. And still the patient was at risk of experiencing delays for both of the things she needed—not for any medical reason, but simply because of an inflexible computer system and a poor workflow.
Situations like this are not rare, and they are vexing in part because they expose the widening gap between the ideal and reality of medicine. Doctors become doctors because they want to take care of patients. Their decade-long training focuses almost entirely on the substance of medicine—on diagnosing and treating illness. In practice, though, many of their challenges relate to the operations of medicine—managing a growing number of patients, coordinating care across multiple providers, documenting it all. Regulations governing the use of electronic medical records (EMRs), first introduced in the Health Information Technology for Economic and Clinical Health (HITECH) Act in 2009, have gotten more and more demanding, while expanded insurance coverage from the Affordable Care Act may have contributed to an uptrend in patient volume at many health centers. These changes are taking a toll on physicians: There’s some evidence that the administrative burden of medicine—and with it, the proportion of burned-out doctors—is on the rise. A study published last year in Health Affairs reported that from 2011 to 2014, physicians spent progressively more time on “desktop medicine” and less on face-to-face patient care. Another study found that the percentage of physicians reporting burnout increased over the same period; by 2014, more than half said they were affected.
To understand how burnout arises, imagine a young chef. At the restaurant where she works, Bistro Med, older chefs are retiring faster than new ones can be trained, and the customer base is growing, which means she has to cook more food in less time without compromising quality. This tall order is made taller by various ancillary tasks on her plate: bussing tables, washing dishes, coordinating with other chefs so orders aren’t missed, even calling the credit-card company when cards get declined.
Then the owners announce that to get paid for her work, this chef must document everything she cooks in an electronic record. The requirement sounds reasonable at first but proves to be a hassle of bewildering proportions. She can practically make eggs Benedict in her sleep, but enter “egg” into the computer system? Good luck. There are separate entries for white and brown eggs; egg whites, yolks, or both; cage-free and non-cage-free; small, medium, large, and jumbo. To log every ingredient, she ends up spending more time documenting her preparation than actually preparing the dish. And all the while, the owners are pressuring her to produce more and produce faster.
It wouldn’t be surprising if, at some point, the chef decided to quit. Or maybe she doesn’t quit—after all, she spent all those years in training—but her declining morale inevitably affects the quality of her work.
In medicine, burned-out doctors are more likely to make medical errors, work less efficiently, and refer their patients to other providers, increasing the overall complexity (and with it, the cost) of care. They’re also at high risk of attrition: A survey of nearly 7,000 U.S. physicians, published last year in the Mayo Clinic Proceedings, reported that one in 50 planned to leave medicine altogether in the next two years, while one in five planned to reduce clinical hours over the next year. Physicians who self-identified as burned out were more likely to follow through on their plans to quit.
What makes the burnout crisis especially serious is that it is hitting us right as the gap between the supply and demand for health care is widening: A quarter of U.S. physicians are expected to retire over the next decade, while the number of older Americans, who tend to need more health care, is expected to double by 2040. While it might be tempting to point to the historically competitive rates of medical-school admissions as proof that the talent pipeline for physicians won’t run dry, there is no guarantee. Last year, for the first time in at least a decade, the volume of medical school applications dropped—by nearly 14,000, according to data from the Association of American Medical Colleges. By the association’s projections, we may be short 100,000 physicians or more by 2030.

by Rena Xu, The Atlantic | Read more:
Image: Dola Sun
Starbucks: No Purchase Needed To Use The Restroom
Starbucks Executive Chairman Howard Schultz said Thursday that Starbucks' bathrooms will now be open to everyone, whether paying customers or not.
"We don't want to become a public bathroom, but we're going to make the right decision 100 percent of the time and give people the key," Schultz said at the Atlantic Council in Washington, D.C. "Because we don't want anyone at Starbucks to feel as if we are not giving access to you to the bathroom because you are 'less than.' We want you to be 'more than.' "
Two black men, business partners Donte Robinson and Rashon Nelson, both 23, were arrested on April 12 as they sat in a Philadelphia Starbucks, having asked to use the restroom without buying anything.
The store manager called the police after asking them to leave — a "terrible decision," Schultz said.
Video of their arrest sparked outrage on social media and accusations of racial bias. Protesters stood outside and inside the Philadelphia Starbucks store where the arrest occurred.
"The company, the management and me personally — not the store manager — are culpable and responsible. And we're the ones to blame," Schultz said Thursday.
"We were absolutely wrong in every way. The policy and the decision she made, but it's the company that's responsible," he added.
Schultz said the company had a "loose policy" around letting only paying customers use the bathroom, though it was up to the discretion of individual store managers.
by James Doubek, NPR | Read more:
Image: via
[ed. You still gotta get a key though (and here's where I applaud McDonalds for their no questions asked policy). It might help if government actually did its fucking job and provided basic public services - public bathrooms being one of the most basic.]
Why Buying a House Today is Much Harder Than in 1950
To understand just how unaffordable owning a home can be in American cities today, look at the case of a teacher in San Francisco seeking his or her first house.
Educators in the City by the Bay earn a median salary of $72,340. But, according to a new Trulia report, they can afford less than one percent of the homes currently on the market.
Despite making roughly $18,000 more than their peers in other states, many California teachers—like legions of other public servants, middle-class workers, and medical staff—need to resign themselves to finding roommates or enduring lengthy commutes. Some school districts, facing a brain drain due to rising real estate prices, are even developing affordable teacher housing so they can retain talent.
This housing math is brutal. With the average cost of a home in San Francisco hovering at $1.61 million, a typical 30-year mortgage—with a 20 percent down payment at today’s 4.55 percent interest rate—would require a monthly payment of $7,900 (more than double the $3,333 median monthly rent for a one-bedroom apartment last year).
Over the course of a year, that’s $94,800 in mortgage payments alone, clearly impossible on the aforementioned single teacher’s salary, even if you somehow put away enough for a down payment (that would be $322,000, if you’re aiming for 20 percent).
The figures become more frustrating when you compare them with the housing situation a previous generation faced in the late ’50s. The path an average Bay Area teacher might have taken to buy a home in the middle of the 20th century was, judging by the available data and some rough approximations, much smoother.
According to a rough calculation using federal data, the average teacher’s salary in 1959 in the Pacific region was more than $5,200 annually (just shy of the national average of $5,306). At that time, the average home in California cost $12,788. At the then-standard 5.7 percent interest rate, the mortgage would cost $59 a month, with a $2,557 down payment. If your monthly pay was $433 before taxes, $59 a month wasn’t just doable, it was also within the widely accepted definition of sustainable, defined as paying a third of your monthly income for housing. Adjusted for today’s dollars, that’s a $109,419 home paid for with a salary of $44,493.
And that’s on just a single salary.
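[ed. The loan math above is the standard fixed-rate amortization formula: monthly payment = P × r × (1 + r)^n / ((1 + r)^n - 1), where P is the principal after the down payment, r the monthly interest rate, and n the number of monthly payments. Below is a minimal sketch in Python for checking the figures, with one assumption flagged: the article doesn't state the term of the 1959 loan, so a 30-year term is assumed. That assumption reproduces the $59 payment almost exactly; the modern loan's bare principal-and-interest payment comes out closer to $6,600, which suggests the article's $7,900 figure includes costs beyond the mortgage itself.]

```python
# Sanity check of the article's mortgage arithmetic using the standard
# fixed-rate amortization formula. Dollar figures and rates come from the
# article; the 30-year term on the 1959 loan is an assumption.

def monthly_payment(price, down_fraction, annual_rate, years=30):
    """Monthly principal-and-interest payment on a fixed-rate mortgage."""
    principal = price * (1 - down_fraction)
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

# San Francisco today: $1.61M home, 20 percent down, 4.55 percent interest.
print(f"${monthly_payment(1_610_000, 0.20, 0.0455):,.0f}/month")  # ~$6,565 in P&I alone

# California, 1959: $12,788 home, 20 percent ($2,557) down, 5.7 percent interest.
print(f"${monthly_payment(12_788, 0.20, 0.057):,.0f}/month")      # ~$59, matching the article
```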
A dream of homeownership placed out of reach
That midcentury scenario seems like a financial fantasia to young adults hoping to buy homes today. Finding enough money for a down payment in the face of rising rents and stagnant wages, qualifying for loans in a difficult regulatory environment, then finding an affordable home in expensive metro markets can seem like impossible tasks.
In 2016, millennials made up 32 percent of the homebuying market, the lowest percentage of young adults to achieve that milestone since 1987. Nearly two-thirds of renters say they can’t afford a home.
Even worse, the market is only getting more challenging: The S&P CoreLogic Case-Shiller National Home Price Index rose 6.3 percent last year, according to an article in the Wall Street Journal. This is almost twice the rate of income growth and three times the rate of inflation. Realtor.com found that the supply of starter homes shrinks 17 percent every year.
It’s not news that the homebuying market, and the economy, were very different 60 years ago. But it’s important to emphasize how the factors that created the homeownership boom in the ’50s—widespread government intervention that tipped the scales for single-family homes, more open land for development and starter-home construction, and racist housing laws and discriminatory practices that damaged neighborhoods and perpetuated poverty—have led to many of our current housing issues.
From the front lines to the home front
The postwar boom wasn’t just the result of a demographic shift, or simply the flowering of an economy primed by new consumer spending. It was deliberately, and successfully, engineered by government policies that helped multiply homeownership rates from roughly 40 percent at the end of the war to 60 percent during the second half of the 20th century.
The pent-up demand before the suburban boom was immense: Years of government-mandated material shortages due to the war effort, and the mass mobilization of millions of Americans during wartime, meant homebuilding had become stagnant. In 1947, six million families were doubling up with relatives, and half a million were living in mobile homes, barns, or garages, according to Leigh Gallagher’s book The End of the Suburbs.
The government responded with intervention on a massive scale. According to Harvard professor and urban planning historian Alexander von Hoffman, a combination of two government initiatives—the establishment of the Federal Housing Administration (FHA) and the Veterans Administration (VA) home loan program—served as runways for first-time homebuyers.
Initially created during the ’30s, the FHA guaranteed loans as long as new homes met a series of standards and, according to von Hoffman, created the modern mortgage market.
“When the Roosevelt administration put the FHA in place in the ’30s, it allowed lenders who hadn’t been in the housing market, such as insurance companies and banks, to start lending money,” he says.
The VA programs did the same thing, but focused on the millions of returning soldiers and sailors. The popular GI Bill, which provided tuition-free college education for returning servicemen and -women, was an engine of upward mobility: debt-free educational advancement paired with easy access to finance and capital for a new home.
It’s hard to comprehend just how large an impact the GI Bill had on the Greatest Generation, not just in the immediate aftermath of the war, but also in the financial future of former servicemen. In 1948, spending as part of the GI Bill consumed 15 percent of the federal budget.
The program helped nearly 70 percent of men who turned 21 between 1940 and 1955 access a free college education. In the years immediately after WWII, veterans’ mortgages accounted for more than 40 percent of home loans.
An analysis of housing and mortgage data from 1960 by Leo Grebler, a renowned professor of urban land economics at UCLA, demonstrates the pronounced impact of these programs. In 1950, FHA and VA loans accounted for 51 percent of the 1.35 million home starts across the nation. These federal programs would account for anywhere between 30 and 51 percent of housing starts between 1951 and 1957, according to Grebler’s analysis.
Between 1953 and 1957, 2.4 million units were started under these programs, using $3.6 billion in loans. This investment dwarfs the amount of money spent on public infrastructure during that period. (...)
Skewed perspectives
Many of the pressing urban planning issues we face today—sprawl and excessive traffic, sustainability, housing affordability, racial discrimination, and the persistence of poverty—can be traced back to this boom. There’s nothing wrong with the government promoting homeownership, as long as the opportunities it presents are open and accessible to all.
As President Franklin Roosevelt said, “A nation of homeowners, of people who won a real share in their own land, is unconquerable.”
That vision, however, has become distorted, due to many of the market incentives encouraged by the ’50s housing boom. In wealthy states, especially California, where Prop 13 locked in property tax payments despite rising property values, the incumbent advantage to owning homes is immense.
In Seattle, the equity a homeowner gained just by holding on to their investment, $119,000, was more than an average Amazon engineer made last year ($104,000).
In many regions, we may have “reached the limits of suburbanization,” since buyers and commuters can’t stomach supercommutes. NIMBYism and local zoning battles have become the norm when any developers try to add much-needed housing density to expensive urban areas.
In many ways, to paraphrase Roosevelt, we’re seeing a “class” of homeowners become unconquerable. The cost of construction; a shortage of cheap, developable land near urban centers (gobbled up by earlier waves of suburbanization); and other factors have made homes increasingly expensive.
In other words, it’s a great time to own a home—and a terrible time to aspire to buy one.
by Patrick Sisson, Curbed | Read more:
Image: Getty
Michael Pollan: How to Change Your Mind
Before Timothy Leary came along, psychedelic drugs were respectable. The American public’s introduction to these substances was gradual, considered, and enthusiastic. Psilocybin appeared in an article, “Seeking the Magic Mushroom,” in a 1957 issue of Life magazine. The author of this first-person account of consuming mind-altering fungi at a traditional ritual in a remote Mexican village, R. Gordon Wasson, was a banker, a vice president at J.P. Morgan. The founder and editor in chief of Time-Life, Henry Robinson Luce, took LSD with his wife under a doctor’s supervision, and he liked to see his magazines cover the possible therapeutic uses of psychedelics. Perhaps most famously, Cary Grant underwent more than 60 sessions of LSD-facilitated psychotherapy in the late 1950s, telling Good Housekeeping that the treatment made him less lonely and “a happy man.” This wasn’t a Hollywood star’s foray into the counterculture, but part of an experimental protocol used by a group of Los Angeles psychiatrists who were convinced they had found a tool that could make talk therapy transformative. And they had the science—or at least the beginnings of it—to back up their claims.
Then came Leary and his Harvard Psilocybin Project. In his new book, How to Change Your Mind: What the New Science of Psychedelics Teaches Us About Consciousness, Dying, Addiction, Depression, and Transcendence, Michael Pollan recounts how nascent but promising research into the therapeutic uses of psychedelic drugs in the 1950s and early 1960s went off the rails. Leary, with Richard Alpert (who would later rename himself Ram Dass), conducted research in a methodologically haphazard and messianic manner, eventually alienating the university’s administration, who fired them. Leary then went on to become a guru (his term) for the hippie movement, urging America’s youth to “turn on, tune in, and drop out.” LSD came to be associated with the anti-war movement, free love, and a general rejection of Middle American mores, and the authorities no longer looked kindly upon it. By 1970, with the Controlled Substances Act, LSD, psilocybin, peyote, DMT, and mescaline were classified as Schedule I drugs—defined as substances with a high potential for abuse, with no currently accepted medical value in the U.S., and unsafe to use even under medical supervision. For four decades, psychedelics were associated with burnt-out cases shambling around college towns like Berkeley and Cambridge, chromosome damage, and the suicide of the daughter of TV personality Art Linkletter.
Pollan is far from the first person to point out that none of the above characterizations are truthful representations of the most familiar psychedelic drugs. As Purdue’s David E. Nichols wrote in 2016 for the peer-reviewed medical journal Pharmacological Reviews, these drugs are “generally considered physiologically safe and do not lead to dependence or addiction.” There have been no known cases of overdose from LSD, psilocybin, or mescaline. The chromosome scare story turned out to be bogus, Diane Linkletter had a history of depression that predated her drug use, and while it’s probably a bad idea for any mentally disturbed person to take a powerful psychoactive drug recreationally, there’s no evidence that psychedelics cause mental illness in otherwise healthy people. To the contrary: After 40 years in the wilderness, psychedelics are once more the subject of serious scientific study, with early results suggesting that the drugs, when used under a therapist’s supervision, can help patients suffering from anxiety, depression, post-traumatic stress disorder, obsessive-compulsive disorder, and both alcohol and nicotine addiction. (...)
How to Change Your Mind includes an account of how various psychedelic drugs found their way into American laboratories and homes, the great hiatus of research into their potential uses after they were outlawed in the ’60s and ’70s, and the “renaissance” of scientific interest in the drugs, beginning in the late 1990s and culminating in several government-funded studies today. Pollan himself was no psychonaut when he became interested in that resurgence. He’d tried psilocybin mushrooms twice in his 20s, then let the remaining stash of fungi molder in a jar in the back of a cabinet; the experience was “interesting” but not something he felt moved to repeat. What drew his attention to the subject later in life were two studies and a dinner party, where a 60ish “prominent psychologist,” married to a software engineer, explained that she and her husband found the “occasional use of LSD both intellectually stimulating and of value to their work.” One of the experiments, conducted at Johns Hopkins, UCLA, and NYU, found that large doses of psilocybin, when administered once or twice to patients who had received terminal cancer diagnoses, helped significantly reduce their fear and depression. The other study, conducted by some of the same researchers, observed the similarities between the results of high doses of psilocybin administered by teams of two therapists and what are commonly described as mystical experiences. The latter are episodes characterized by “the dissolution of one’s ego followed by a sense of merging with nature or the universe.” As Pollan notes, this hardly sounds like news to people accustomed to taking psychedelic drugs, but it marked the first validation of the idea in a rigorous scientific setting.
Further research using fMRI scanners has confirmed the similarity in brain activity between people meditating and people having certain kinds of psychedelic trips. But not all trips are the same, as anyone who has dropped acid can attest. Leary’s one great contribution to the understanding of psychedelics was his emphasis on what has become a mantra for contemporary researchers: set and setting. Set refers to what the person taking the drug expects or is told to expect from the experience, and setting refers to the circumstances of the trip itself: whether the subject is alone or with others, outside or inside, listening to particular kinds of music, wearing an eye mask, receiving guidance from someone they trust, being encouraged to explore ideas and feelings by a therapist, and so on.
Pollan took a couple of research trips himself in the course of writing How to Change Your Mind, with results that are interesting only to the extent that they help him make sense of other people’s accounts of their own journeys. The meat of the book is its chapters on the neuroscience of the drugs and their evident ability to suppress activity in a brain system known as the “default mode network.” The DMN acts as our cerebral executive, coordinating and organizing competing signals from other systems. It is, as Pollan sees it, the “autobiographical brain,” and the site of our ego. The long history of people reporting the sensation of their egos dissolving while under the influence of psychedelics meshes with this interpretation. It’s an experience with the potential to both terrify and, paradoxically, comfort those who undergo it.
Why should this effect prove so helpful to the depressed, addicted, and anxious? As Pollan explains it, these disorders are the result of mental and emotional “grooves” in our thinking that have become, as the DMN’s name suggests, default. We are how we think. The right psychedelic experience can level out the grooves, enabling a person to make new cerebral connections and briefly escape from “a rigidity in our thinking that is psychologically destructive.” The aerial perspective this escape offers doesn’t immediately evaporate either. The terminal cancer patients in the psilocybin study felt lasting relief as a result of the glimpse the drugs gave them of a vista beyond the limitations of their own egos—even the ones who didn’t believe in God or other supernatural forces. (...)
If How to Change Your Mind furthers the popular acceptance of psychedelics as much as I suspect it will, it will be by capsizing the long association, dating from Leary’s time, between the drugs and young people. Pollan observes that the young have had less time to establish the cognitive patterns that psychedelics temporarily overturn. But “by middle age,” he writes, “the sway of habitual thinking over the operations of the mind is nearly absolute.” What he sought in his own trips was not communion with a higher consciousness so much as the opportunity to “renovate my everyday mental life.” He felt that the experience made him more emotionally open and appreciative of his relationships.
by Laura Miller, Slate | Read more:
Image: Jeannette Montgomery Barron
[ed. See also: Hallucinogenic Drugs as Therapy? I Tried It (Michael Pollan - NY Times)]