Saturday, September 22, 2018

Three Hawaiian-Shirt-Clad Boomers on Millennials Stealing Their Look

By several accounts, 2018 was the Summer of Sleaze, headlined by Justin Bieber and his arsenal of garish aloha shirts (or, as mainlanders know them, “Hawaiian” shirts) that made him look like your best friend’s semi-flaky South Beach cocaine dealer, circa 1982. Bieber’s fondness for a particularly, er, loud palette of shirts may put him on the extreme end of the aesthetic, but since celeb style tends to inform consumer trends, Bieber’s fits (or something like them) can now be seen on a male mannequin near you, whether it be via the dirt-cheap fast-fashion retailer H&M or timeless designer favorite Louis Vuitton.

In other words: Aloha shirts are hip for real, whether you’re going lowbrow or highbrow. And millennials in particular can’t seem to get enough of them.

This is an odd but not unexpected turn of events for David Bailey, who has worn an aloha shirt every single day for the last 30 years. Actually, scratch that: “I’ve missed maybe three days,” he says, before cracking a laugh. Bailey, 73, is perhaps the greatest hunter-gatherer of aloha shirts in America, having amassed an estimated 15,000 in his cocoon of a shop, Bailey’s Antiques and Aloha Shirts, which is a 10-minute drive from tourist-thronged Waikiki Beach in Honolulu.

Dig into the endless shelves and spirals of clothing and you might stumble across rarities like a hand-painted vintage rayon shirt from the 1960s, or perhaps a modern reproduction of a classic design worn by Frank Sinatra in From Here to Eternity. Some shirts go for under $5, while many others have appreciated in value since they first came off the sewing line. Anthony Bourdain dropped $3,000 here in 2008 while taping his travel show No Reservations. King of Margaritaville Jimmy Buffett got one for $5,000. Meanwhile, Nic Cage, never one to be outdone, spent $10,000 on a vintage shirt.

“The thought of wearing just a plain white or blue collared shirt is totally boring,” Bailey explains. “What we’re seeing is people who get into aloha shirts by having one, then five, then 50. Manufacturers I talk to are seeing increased sales everywhere, especially on the East Coast and in Europe. We’ve got Hawaii Five-0 and now Magnum P.I. back on TV, showing off the aloha shirt look. The vibe is ripe.”

How the shirt came to be is a matter of academic debate, but there’s agreement around the idea that it was influenced by the colorful Japanese fabrics that were being imported to Hawaii in the early 20th century, and widely used by Japanese families to sew their own clothing. The brightest, most eye-popping colors and patterns were initially reserved for children and young women, says DeSoto Brown, historian at the Bishop Museum in Honolulu. Chinese businessman Ellery Chun had a major influence in the 1930s when he began producing patterned shirts en masse, and Brown credits teens and other young people for fueling the fad of the aloha shirt as a form of local streetwear, during a time when most Americans were dressing conservatively.

“In the late 1930s, there was a brief moment of Hawaii being very fashionable for very wealthy young mainland people, who were part of what was called ‘cafĂ© society.’ After Doris Duke built her house here, which got a ton of national publicity, other rich young people started vacationing here, or even buying their own houses,” Brown continues. “They wore alohawear in publicity or news photos, which not only helped spread awareness of it, but also made it desirable for other people to copy.”

That blew up in the 1950s and early 1960s, when Elvis Presley and Sinatra donned aloha shirts in films, and the Polynesian tiki-culture craze swept across the U.S., starting on the West Coast. “By the 1970s, old aloha shirts first got recycled by hippies who were trying to look oddball, to break with the established expectations. By the 1990s, the clichĂ© of the ‘Fabulous ‘50s’ was also in place in American culture, and a small part of that was the old-fashioned-style aloha shirt,” Brown explains.

For many boomer men like Bailey, however, the aloha shirt isn’t a summer fad — it’s a long-lasting love built on an earnest appreciation for the practicality and style of a unique clothing tradition. The shirt represents the ultimate compromise between buttoned-up and laid-back — an alternative to the prim look of the midcentury white-collar man, and later, the Gordon Gekko pretension of the 1980s. And so, while millennials are certainly buying into the beachy-cool vibe of the aloha shirt, we found three men who have fallen in love with the garment exactly because it tries not to be hip. Here’s what they have to say…

by Eddie Kim, MEL |  Read more:
Image: uncredited

Friday, September 21, 2018

Why the Kavanaugh Allegation Is So Important

The sexual assault allegation against Brett Kavanaugh is both credible and disturbing. But some people have questioned its relevance to his confirmation: After all, it is alleged to have happened when Kavanaugh was 17 years old, and Kavanaugh is now 53. The question of how to treat juvenile crimes can be complicated; with criminal convictions, we seal juveniles’ records. While Anita Hill’s allegations against Clarence Thomas concerned behavior during his professional life, Christine Ford’s allegation against Kavanaugh is from his high school years.

I think there is a very strong argument that the facts of the allegation, assuming they are true, are important in considering whether someone should be a Supreme Court Justice. But before we get to that, there’s another reason why this doesn’t just matter, but matters a lot: Kavanaugh has denied that he did it. That means that the truth or falsity of Ford’s allegation is not just important for assessing what Kavanaugh did in high school. It’s important for assessing what he is doing right now. If Ford’s allegation is true, then Kavanaugh has lied to the public. He didn’t just assault a woman in the 1980s, but he is gaslighting a woman in 2018 and trying to mislead the public and the United States Senate about a crime he committed. We can’t wave this away as being about the distant past, because it’s a question about the present. If the allegation is true, Kavanaugh is not fit to be a justice, not just because of his past actions but because of his shameless public lying.

Here is what Kavanaugh has said about the allegation: “I categorically and unequivocally deny this allegation. I did not do this back in high school or at any time.” That’s pretty clear: Christine Ford is lying or delusional. According to Kavanaugh, she has invented a false story about him. It’s a story she confessed to her husband years ago, and a story that passed a polygraph test. But Kavanaugh says it didn’t happen and that Ford is smearing him. If Ford is telling the truth, then what Kavanaugh is doing right now is unconscionable. Not only did he attack her in high school, but he would be publicly trying to falsely brand her delusional or a liar. It would literally be “insult to injury.” No person with a shred of conscience could vote to confirm such a man to the Supreme Court.

Kavanaugh’s denial means that assessing the truth or falsity of the allegation is now critical to his confirmation. If Kavanaugh had admitted the truth of the allegation and apologized sincerely, talking about what a pig he had been in high school, then the “well, this was a long time ago” crowd would be able to argue that we should focus far more on his hideous record as a jurist than his high school sexual assault. Now, however, there is no choice at all: The question of whether Brett Kavanaugh did this is all-important, because it’s also the question of whether Brett Kavanaugh is trying to slander an assault victim and mislead the Senate.

To reiterate: This is not just about the crime, and it is not just about the past. Never mind “this was a long time ago”: If there is credible evidence that he did this, then he does not belong on the court or in any position of power whatsoever, because he is lying. We don’t need to resolve the question of how important juvenile crimes are in order to conclude that. I’m sorry to repeat myself here, but I think this is very important and the point isn’t being made clearly enough. If you think the allegation is true, then Kavanaugh is unfit for office, regardless of how long ago this was.

by Nathan J. Robinson, Current Affairs |  Read more:
Image: uncredited
[ed. See also: After the Kavanaugh Allegations, Republicans Offer a Shocking Defense: Sexual Assault Isn’t a Big Deal]

Liberty, Security, and Iron Cages

I knew my school would put up prison bars sooner or later. It had been too free to last. I haven’t been back in ten years, but I wasn’t shocked when a friend showed me photos of what it looks like now: all around our once wide-open campus, there are bars and metal security gates. In my day, you could walk on and off as you pleased (and we did). No locked doors could keep you out; it was like a college campus. Now, you have to pass through one of the few official entrances, no doubt manned by security personnel.

I doubt there was any public opposition to putting iron bars around the campus. In fact, I’m sure it was applauded: in my hometown of Sarasota, the school board has frantically been spending millions on “long-overdue” security measures at all of its schools, making sure nobody can just walk into a school and every visitor is tracked and registered. This just makes sense, post-Parkland. At Sarasota High School, there have been long debates about how to close off the last remaining way that the campus can be freely accessed from the street: it’s an intolerable risk, everyone seems to think, not to have a barricade.

I don’t share this instinct. I am horrified by it, actually. The fact that everyone thinks it perfectly reasonable to build a gigantic cage around the school, that they think this is so sensible and necessary that nobody could possibly see anything wrong with it, suggests to me a world gone completely insane. I look at the black barriers around my own Pine View School and I see what is quite obviously a dystopia. We were free, and now the students are behind bars. We used to wander wherever we chose. Now you need to get permission. How can anyone look at that and not be disturbed by it? How could anyone watch them building the barriers and not scream “Stop! What are you doing? This is a school, not a fortress!”

Of course, to the extent anyone was disturbed to watch the gates go up initially, their feelings will subside soon enough. Human beings can get used to pretty much anything, which is one of the reasons terrible things happen and nobody notices they’re abnormal. And soon there will be generations of students who don’t even remember when it wasn’t caged. They won’t even be able to imagine such a thing. A world without the security doors would be inconceivable. In fact, the above photos probably don’t look nearly as disturbing to you as they do to me. You can’t see what I see, because you never saw it differently. I see a path that used to be traveled freely, a place I spent nearly ten years wandering around doing whatever I felt like, hideously deformed with bars and gates. You just see a fence, like any other fence. You might even think I sound demented.

I don’t know. It’s very hard to convey what this means to me without sounding crazy, and that’s what worries me. Increased security measures are so rational that they seem inevitable. School shootings are so awful that we’d be crazy not to put in metal detectors and hire an army of guards and give the teachers guns and build a giant wall. And yet to me it feels so, so wrong. I can’t easily argue against it, but I feel it just mustn’t be allowed to happen. It’s partly because I experienced incredible freedom when I was young, and I know there is nothing like it, and I can’t accept that future students won’t get to have it, because that will mean the world is getting worse, and we have to stop the world from getting worse. Surely the gates are just temporary. Surely we’re all committed to tearing them down eventually, at least. But I know they’re not. Once those fences go up, they never, ever come down. Security measures only ever heighten. They do not get relaxed.

I should mention that I do also think this is a stupid response to the problem of shootings. We are all very familiar with mass shootings, because they are so horrifying and seem to occur so often. As a factual matter, however, students in public schools are not really at much risk. In the 20 years since Columbine, there have been half a dozen shootings with multiple fatalities at elementary, middle, and high schools in the United States. That’s half a dozen too many, obviously. But they are not a problem that justifies turning every school in the country into an armed compound. In fact, many of them could have been prevented if we were committed to sensible gun policy and had a school system that was capable of detecting and dealing with troubled students. The Parkland shooter had been reported to police dozens and dozens of times, and had openly made threats to shoot up the school. That was where the failure was, and that’s what needs to be fixed. The Sandy Hook shooter should never have had access to heavy weaponry. By the time these people get to the schoolhouse, we’ve already failed.

In fact, many security measures on campus are futile. The only way short of an outright total surveillance state to solve these problems is through prevention. The security gates on Pine View’s campus, for example, are utterly useless. Handguns fit in backpacks, and many of these shootings are conducted by students. Unless you commit to putting every student’s possessions through a metal detector every morning (as some schools do), a Parkland situation is still perfectly plausible. (Also, Pine View’s fence might make students less secure. It’s easy for a determined perpetrator to shimmy over, but having the whole place surrounded by a cage, with only a few points of exit, makes it far more difficult for students to flee in an emergency.)

I honestly don’t think the building of gates comes out of a sober assessment of how to actually prevent school shootings. It’s purely reactive: something horrible happened, thus we must have new security. But another thing that disturbs me is that in the “cost-benefit” calculus, “freedom” appears essentially nowhere. The old phrase is that whoever gives up a lot of liberty for a bit of security deserves neither. But it so often feels like people will give up all of their liberty for even the appearance of security. Hardly anyone cares about the little unquantifiable things, like how it felt to be a student at a school that respected its pupils enough to give them complete liberty.

The sociologist Max Weber is associated with the idea of the “iron cage” of rationality: there is a kind of logic that imprisons us and determines our thinking. Capitalism creates an ideological iron cage: if something cannot justify itself economically, it does not have value. Security is similar: there’s no way to argue against locking down the schools; it seems somehow like it must occur because it’s just perfectly rational. In a newspaper article about Sarasota High School’s lock-down measures, I saw an administrator say something like “Well, a school has to be able to keep track of everyone who enters and exits campus.” The fact that that statement isn’t controversial, that it seems obvious: that’s the iron cage. Does a school have to be able to keep track of everyone? Couldn’t everyone just come and go? A teacher has to take attendance, of course. But it’s strange how many people can accept as an unquestionable assumption something that was never true until recently: that everyone must be kept track of, that nothing must be unknown. All of this seems to happen without anybody really willing it: reason travels a path that leads inescapably to the security state.

by Nathan J. Robinson, Current Affairs |  Read more:
Image: uncredited

Thursday, September 20, 2018


Adrian Tomine, Fourth Wall
via:

St. Vincent

Senators Unveil Legislation To Protect Patients Against Surprise Medical Bills

With frustration growing among Americans who are being charged exorbitant prices for medical treatment, a bipartisan group of senators Tuesday unveiled a plan to protect patients from surprise bills and high charges from hospitals or doctors who are not in their insurance networks.

The draft legislation, which sponsors said is designed to prevent medical bankruptcies, targets three key consumer concerns:
  • Treatment for an emergency by a doctor who is not part of the patient’s insurance network at a hospital that is also outside that network. The patients would be required to pay out-of-pocket the amount required by their insurance plan. The hospital or doctor could not bill the patient for the remainder of the bill, a practice known as “balance billing.” The hospital and doctor could seek additional payments from the patient’s insurer under state regulations or through a formula established in the legislation.
  • Treatment by an out-of-network doctor or other provider at a hospital that is in the patient’s insurance network. Patients would pay only what is required by their plans. Again, the doctors could seek more payments from the plans based on formulas set up by state rules or through the federal formula.
  • Mandated notification to emergency patients, once they are stabilized, that they could run up excess charges if they are in an out-of-network hospital. The patients would be required to sign a statement acknowledging that they had been told their insurance might not cover their expenses, and they could seek treatment elsewhere.

“Our proposal protects patients in those emergency situations where current law does not, so that they don’t receive a surprise bill that is basically uncapped by anything but a sense of shame,” Sen. Bill Cassidy (R-La.) said in his announcement about the legislation.

Kevin Lucia, a senior research professor at Georgetown University’s Center on Health Insurance Reforms who had not yet read the draft legislation, said the measure was aimed at a big problem.

“Balance billing is ripe for a federal solution,” he said. States regulate only some health plans and that “leaves open a vast number of people that aren’t covered by those laws.”

Federal law regulates health plans offered by many larger companies and unions that are “self-funded.” Sixty-one percent of privately insured employees get their insurance this way. Those plans pay claims out of their own funds, rather than buying an insurance policy. Federal law does not prohibit balance billing in these plans.

Cassidy’s office said, however, that this legislation would plug that gap.

In addition to Cassidy, the legislation is being offered by Sens. Michael Bennet (D-Colo.), Chuck Grassley (R-Iowa), Tom Carper (D-Del.), Todd Young (R-Ind.) and Claire McCaskill (D-Mo.).

Cassidy’s announcement cited two recent articles from Kaiser Health News and NPR’s “Bill of the Month” series, including a $17,850 urine test and a $109,000 bill after a heart attack.

In a statement to Kaiser Health News, Bennet said, “In Colorado, we hear from patients facing unexpected bills with astronomical costs even when they’ve received a service from an in-network provider. That’s why Senator Cassidy and I are leading a bipartisan group of senators to address this all-too-common byproduct of limited price transparency.”

Emergency rooms and out-of-network hospitals aren’t the only sources of balance bills, Lucia said. He mentioned that both ground and air ambulances can leave patients responsible for surprisingly high costs as well.

by Rachel Bluth, Kaiser Health News | Read more:
Image: via
[ed. Congress? Advancing legislation that would actually help normal, non-rich people (vs. armies of healthcare industry lobbyists)? This makes too much sense not to have a catch. We'll just have to wait and see. It could be nothing more than pre-election virtue signalling, but maybe someone is actually serious. I'm inclined to think the former.] 

Life in the Spanish City That Banned Cars

People don’t shout in Pontevedra – or they shout less. With all but the most essential traffic banished, there are no revving engines or honking horns, no metallic snarl of motorbikes or the roar of people trying to make themselves heard above the din – none of the usual soundtrack of a Spanish city.

What you hear in the street instead are the tweeting of birds in the camellias, the tinkle of coffee spoons and the sound of human voices. Teachers herd crocodiles of small children across town without the constant fear that one of them will stray into traffic.

“Listen,” says the mayor, opening the windows of his office. From the street below rises the sound of human voices. “Before I became mayor 14,000 cars passed along this street every day. More cars passed through the city in a day than there are people living here.”

Miguel Anxo Fernández Lores has been mayor of the Galician city since 1999. His philosophy is simple: owning a car doesn’t give you the right to occupy the public space.

“How can it be that the elderly or children aren’t able to use the street because of cars?” asks CĂ©sar Mosquera, the city’s head of infrastructures. “How can it be that private property – the car – occupies the public space?”

Lores became mayor after 12 years in opposition, and within a month had pedestrianised all 300,000 sq m of the medieval centre, paving the streets with granite flagstones.

“The historical centre was dead,” he says. “There were a lot of drugs, it was full of cars – it was a marginal zone. It was a city in decline, polluted, and there were a lot of traffic accidents. It was stagnant. Most people who had a chance to leave did so. At first we thought of improving traffic conditions but couldn’t come up with a workable plan. Instead we decided to take back the public space for the residents and to do this we decided to get rid of cars.”

They stopped cars crossing the city and got rid of street parking, as people looking for a place to park is what causes the most congestion. They closed all surface car parks in the city centre and opened underground ones and others on the periphery, with 1,686 free places. They got rid of traffic lights in favour of roundabouts, extended the car-free zone from the old city to the 18th-century area, and used traffic calming in the outer zones to bring the speed limit down to 30km/h.

The benefits are numerous. On the same streets where 30 people died in traffic accidents from 1996 to 2006, only three died in the subsequent 10 years, and none since 2009. CO2 emissions are down 70%, nearly three-quarters of what were car journeys are now made on foot or by bicycle, and, while other towns in the region are shrinking, central Pontevedra has gained 12,000 new inhabitants. Also, withholding planning permission for big shopping centres has meant that small businesses – which elsewhere have been unable to withstand Spain’s prolonged economic crisis – have managed to stay afloat.

Lores, a member of the leftwing Galician Nationalist Bloc, is a rarity in the solidly conservative northwestern region. Pontevedra, population 80,000, is the birthplace of Mariano Rajoy, the former Spanish prime minister and leader of the rightwing People’s party. However, the mayor says Rajoy has never shown any interest in an urban scheme that has earned his native city numerous awards.

Naturally, it hasn’t all gone off without a hitch. People don’t like being told they can’t drive wherever they want, but Lores says that while people claim it as a right, in fact what they want are privileges.

“If someone wants to get married in the car-free zone, the bride and groom can come in a car, but everyone else walks,” he says. “Same with funerals.”

by Stephen Burgen, The Guardian |  Read more:
Image: Luis Pereiro Gomez

Wednesday, September 19, 2018

Steely Dan: More Than Just a Band


The NFL’s Very Profitable Existential Crisis

Consider the curious case of the National Football League: It’s the largest single entertainment property in the U.S., a $14 billion per year attention-sucking machine with a steady hold on the lives of tens of millions. And its future is now in widespread doubt.

Ratings for regular-season games fell 17 percent over the past two years, according to Nielsen, and after one week of play in the new season, viewership has been flat. February marked the third-straight year of audience decline for the Super Bowl and the smallest audience since 2009. Youth participation in tackle football, meanwhile, has declined by nearly 22 percent since 2012 in the face of an emerging scientific consensus that the game destroys the brains of its players. Once a straightforward Sunday diversion, the NFL has become a daily exercise in cognitive dissonance for fans and a hotly contested front in a culture war that no longer leaves space for non-combatants.

To many outside observers, this looks like the end of an era. “The NFL probably peaked two years ago,” says Andrew Zimbalist, a professor of economics at Smith College who specializes in the business of sports. “It’s basically treading water.”

Yet even a middling franchise, the Carolina Panthers, sold in May for a league record $2.3 billion. Advertisers spent a record $4.6 billion for spots during NFL games last season, as well as an all-time high $5.24 million per 30 seconds of Super Bowl time. The reason is clear: In 2017, 37 of the top 50 broadcasts on U.S. television were NFL games, including four of the top five.

The Green Bay Packers, the only NFL team that shares financial statements with the public, has posted revenue increases for 15 straight seasons. Leaguewide revenue has grown more than 47 percent since 2012. Commissioner Roger Goodell’s official target is $25 billion in revenue by 2027, or roughly 6 percent annual growth.

“The business of the NFL is very strong and continues to get stronger,” says Marc Ganis, president of the consulting firm Sportscorp Ltd., and an unofficial surrogate for league owners. “It’s a great time to own an NFL franchise,” says Atlanta Falcons owner and Home Depot co-founder Arthur Blank.

The dominant sport in America has become Schrödinger’s league, both doomed and doing better than ever at the same time. This is a guide to how the NFL reached its remarkable moment of contradiction.

Early in August, during an otherwise unremarkable day of training camp for the Minnesota Vikings, a safety for the team put on a black baseball cap with a message across the front: “Make football violent again.” Andrew Sendejo, who plays one of the game’s most violent positions with exceptional violence, was protesting a new NFL rule that bans players from initiating contact with their helmets. When asked what he thought of the new rule, Sendejo replied, “I don’t.”

Until two years ago, the NFL officially denied any link between football and increased risk of degenerative brain disease. That changed when Jeff Miller, the league’s senior vice president for health and safety, told members of Congress that there is “certainly” a link between the sport and diseases such as chronic traumatic encephalopathy, which has been found in the brains of more than 100 former NFL players and is linked to mood swings, depression, impulsiveness, memory loss, and in a handful of cases, suicide. “I think the broader point, and the one that your question gets to, is what that necessarily means—and where do we go from here with that information,” Miller said in response to a question from a congresswoman.

The question now is whether football can be played safely and still be football. In the short run, the NFL has to worry about ruining the fun for the group of people, including Trump, who see football as a vital tool in forging American manhood. As far as they’re concerned, any effort to subtract violence from the game and improve safety is a threat to the country.

“If we lose football, we lose a lot in America. I don’t know if America can survive,” David Baker, president of the Pro Football Hall of Fame, said in January. A few months later, North Carolina’s head football coach Larry Fedora echoed his sentiments: “I fear that the game will be pushed so far from what we know that we won’t recognize it 10 years from now. And if it does, our country will go down, too.”

In the long run, though, the NFL also has to worry that the widespread, lasting damage to players will alienate fans. “The CTE issue is the biggest challenge facing the NFL,” says Chris Nowinski, a former Harvard University football player and professional wrestler who started the Concussion Legacy Foundation. “If they don’t change—and change soon—their legends will keep being diagnosed with the disease and it will turn people off.”

At the moment, CTE can only be diagnosed post-mortem, by slicing into brain tissue. Researchers at Boston University, working with brains donated by families, have found that at least 10 percent of deceased NFL players suffered from the disease. Once scientists find a way to diagnose CTE in the living, which researchers expect to have in fewer than five years, Nowinski believes that this number is bound to double or triple: “If some day you knew that half the players you are watching on the field already have this disease, would you be comfortable watching?”

This year the Concussion Legacy Foundation launched a campaign called “Flag Football Under 14,” based on the research that shows one of the biggest predictors of CTE is the number of years spent playing tackle football. Parents, by the looks of it, were already getting the message. Since 2012, according to annual data compiled by the Sports & Fitness Industry Association (SFIA), the number of children aged 6 to 17 playing tackle football dropped 22 percent, to just above 3 million. In a study published in JAMA Pediatrics this year, researchers found that the fall in participation coincides closely with the rise of media coverage of football’s links to traumatic brain injuries.

The attention to brain injuries risks turning football into a regional pursuit. In New England, according to SFIA data, the number of players has decreased by 61 percent in the past decade.

Bob Broderick, co-founder of football pad company Xtech, says he has spoken to nearly 2,000 high schools in the past few years and the appetite for youth football remains undiminished in Texas and the rest of the Southeast. “Whether you want to call it a religion, culture, or way of life, that’s the way it is down there,” he says. His most common problem is parents who want pads in smaller sizes for younger kids. “I bet you, in the last month, I’ve turned away 300 kids because we don’t make a product that’s small enough.”

It’s not clear that youth football’s shrinking footprint matters much for the health of the NFL. “The vast majority of people who watch the NFL have never played tackle football in their lives,” says Ganis. As long as elite players keep coming through the college ranks, he says, the league will be fine. And if the next generation’s Tom Brady opts to play baseball, who’s going to notice?

“The reality is that football is such a fun game for fans and a good game for TV,” says Nowinski, the anti-concussion activist, “that even if the quality was slightly worse, it would still be a massively popular enterprise.”

Jerry Richardson, the 82-year-old fast-food magnate who had owned the Carolina Panthers for a quarter century, was forced to sell the team earlier this year following revelations that he had sexually harassed team employees. Richardson, who had been one of the NFL’s most powerful owners, was a prime example of the old boys’ club that runs the league. The ownership ranks include the CEO of a truck-stop chain that has been accused by federal prosecutors of cheating customers out of fuel rebates, the scion of a heating and air conditioning fortune with a DUI on his record, and several heirs to oil money. They are not necessarily the group one would choose to steer an enterprise into the chaotic future of sports and entertainment in America.

But there’s no shortage of new economy billionaires lining up to replace them, just as hedge fund chief David Tepper did with his $2.3 billion takeover of the Panthers. The fury that now surrounds these men, and they are mostly men, is both a test of their power and a testament to it. As much as they might long for the days before CTE was a household term, Kaepernick was a civil rights hero, and Trump was president, they’re happy to be in the middle of the conversation. It’s proof that they still matter.

by Ira Boudway and Eben Novy-Williams, Bloomberg | Read more:
Image: Getty

[ed. Welcome back to school.]
via:

How Asia Got Crazy Rich

True to its title, Crazy Rich Asians features two hours of Asian people doing crazy and rich things. They purchase million-dollar-plus earrings; they fly helicopters to a bachelor party hosted on a floating container ship; and they host a wedding in an interior botanical garden, in which the bride walks down the aisle knee-deep in an artificial creek. Based upon Singaporean-American novelist Kevin Kwan’s 2013 novel, the film centers on a middle-class Chinese American economics professor, Rachel Chu, who travels back to her boyfriend Nick Young’s childhood home in Singapore and is introduced to his friends and their unfathomably opulent lifestyles. Its central tension pits Rachel’s American-bred individuality against the traditional, familial piety of Nick’s mother, Eleanor, who insists upon keeping the largest real estate and financial empire in the southeast Asian city-state within the families of the Singaporean elite.

The film has enjoyed substantial critical approval and been rewarded by box office numbers. For its champions, it succeeds in widening the Hollywood universe to include an underrepresented American minority group, portraying it in exceedingly optimistic terms. Many have echoed the director’s claim that “it’s not a movie, it’s a movement.” For its critics, the film is a disappointing foray into representation, obeying romantic-comedy formulae at the expense of saying something edgier about Asian-American life.

What is shared between these views is the choice to judge this film solely upon the basis of its portrayal of Asia, Asians, and Asian Americans, without a history or even acknowledgment of how they became so “crazy rich” in the first place. Without dismissing the film’s significance for so many, it should be recognized that the “Crazy Rich” and “Asian” in its title are performing different roles in the story. On the one hand, “Asian” provides political cover to “Crazy Rich,” as the film markets itself as a celebration of diversity rather than a celebration of the elite in an age of historic inequality, including within Asia and for Asian Americans themselves. On the other hand, neither is the “Crazy Rich” incidental, for to be wealthy is what marks the Asian characters as modern and relatable, even endearing.

This comes out clearly when Kwan’s story is contrasted against Amy Tan’s The Joy Luck Club. That older film drew upon stories from the life of Tan’s mother, spent in Republican-era Shanghai (1911–1949), and it featured stock imagery from turn-of-the-century China: opium dens, concubinage and rape, arranged marriages, and foot binding. I can recall such scenes because they have been seared into my brain since I was 9 years old, dragged to the theater by my Taiwan-raised yet pro-China parents (an important distinction these days), and made slightly nauseous imagining the world my grandparents had left behind. The Joy Luck Club suggests that strong family bonds were what helped Chinese women weather and ultimately escape an oppressive, traditional society. Crazy Rich Asians turns that idea on its head. The conflict between Rachel and Eleanor conveys that strong family bonds are obstacles to empowerment for a new cosmopolitan Chinese diaspora that values individualism and romance. There is an implied historical process here, then, from old Asia as the antithesis of western individualism transformed dramatically into a new Asia embodying the future of capitalism.

The film has also come under criticism for presenting only a narrow slice of the Asian experience. Despite casting ethnic Japanese, Korean, Malay, and Filipino actors, it is ultimately rooted in the international history of the Chinese diaspora and its particular brands of capitalism. It also focuses exclusively upon the diaspora’s most elite segments.

But Crazy Rich Asians was written as something loosely inspired by Kwan’s own lived experiences, and the result is a story that has more nuance than most English-language works about the Chinese diaspora. Rather than chide him for not writing a more inclusive story, it seems more useful to ask why Kwan’s tale, based upon his idiosyncratic childhood as the scion of a Singaporean banking family, has resonated so strongly with a wider audience. What has it meant in the past, and what does it mean today, to celebrate Asian wealth? (...)

After independence in 1959, Singapore briefly attempted to unify with Malaysia to pursue a leftist strategy of national development via import substitution industrialization. But in 1965, Singapore separated again and joined a handful of small capitalist Asian countries in projects of export-led growth, inviting foreign investment, and promoting labor-intensive light industries to move up the global value chain. They were eventually dubbed the “four tiger” or “little dragon” economies: Taiwanese televisions, South Korean cars, Hong Kong wigs, and Singaporean semiconductors.

The “four tigers” era was deemed an economic miracle, marked by relatively egalitarian development and low unemployment. By the late ’70s and ’80s, they were facing diminishing returns. Rather than follow Japan, South Korea, and Taiwan into high-tech manufacturing, Singapore pivoted into invisible exports, offering those other economies the services of accounting, legal work, and management. The government also encouraged Singaporean capital to look abroad and invest in poorer Asian countries such as Indonesia, Vietnam, Malaysia, and China, while it opened the doors for migrant workers from South Asia and other low-wage regions. It has since become a hub for international finance, but new growth has come at the cost of widening inequality.

In this sense, Singapore is not a new type of society. A century before Asian industrialization, similar patterns of inequality and patrimonial capitalism animated the celebrated novels about the European bourgeoisie, like Mansfield Park and Buddenbrooks. What those dense family dramas demonstrated was that capitalism is not just a static marketplace but also entails long processes of wealth accumulation marked by different phases and logics. A charitable reading for Crazy Rich Asians is that it is doing for the late 20th-century Chinese diaspora what those novels did for the bourgeoisie of Western Europe.

The most prominent family in Kwan’s story are the Youngs, whose original fortune dates back to Nick’s Chinese-born great-grandmother, presumably at the turn of the 20th century. The Youngs got in on the ground floor of an older, Victorian-era wealth, viewed by its caretakers as sociologically distinct from the newer elites found across the Asia-Pacific. The unstated irony is that owning lots of land in Singapore—and Malaysia and China, not to mention London and Hawaii—made the Young family this fabulously wealthy only because the rest of Asia, along with its nouveau riche, made the region so economically productive in recent decades. These tensions across geography and generation appear at the margins of the romantic plot. Nick’s cousin explains to Rachel that in Asia’s richest circles, you will find Hong Kongers, “Taiwan Tycoons,” and “Beijing Billionaires.” These families are not equals. In the novel, Eleanor initially mistakes Rachel for the heiress to a Taiwan plastics company, which Eleanor calculates as “very new money, made in the seventies and eighties, most likely.” A more palpable clash emerges from the story of Nick’s fabulously wealthy cousin, a real-estate investor, and her rocky marriage to a middle-class software engineer who frequently takes business trips to Shenzhen, China—Shenzhen, of course, a symbol of China’s own movement up the global value chain since the 1980s, having absorbed light industry and electronics manufacturing from the “four tigers.”

The film’s producers allegedly sought to minimize the book’s details of specific stereotypes between Asian groups, wary of alienating unfamiliar audience members. But the distinctions are inescapable throughout the story, and the story in fact would make little sense without them. (...)

All this is to say that for most observers in America by now, “Asia” has shed much of its earlier connotation as land of opium and concubinage, instead symbolizing the latest elite to ascend onto the world stage. For many American audiences, depictions of luxurious Singaporean parties will appear less as shocking revelation than as confirmation of a vague sense that the global economy is in transition. As satisfying as the Calthorpe hotel scene was, it is difficult to ignore just how much it mirrored “Yellow Peril” discourses by reductively portraying Chinese diasporic capitalists as a powerful and international economic force. It also points to the need to go beyond the very American, very management-inspired idea of “diversity” that would equate this film with “ethnic” movies centered on Black or Latinx American life. If modern racial categories have historically functioned as a way to make social inequality in market societies appear rooted in nature, then it follows that each of these groups has been typologized in different ways, owing to their different histories. The historic racist narrative of Black Americans was that they were lazy and undeserving of social mobility. The current narrative of Asian Americans is that they are too mobile, drilled in math and piano at an early age, hence unfair competition. This contrast in forms of racism should have been made clear, for instance, once journalists began openly to pit Black against Asian students in education policy debates. In this context, one wonders how the film will be received by the anti-globalization left or right. There is already a creeping sentiment of “Yellow Peril” in the US today, shared by all sides, suspicious of Chinese capital, labor, and college enrollments. The film borrows many of the same tropes but casts them in an innocent and humorous light. It is walking a fine line. Perhaps this is why Rachel must resolve the film’s encounters with the Singaporean capitalist sublime by insisting upon her individual desire, threatening to walk away from Nick’s family in the name of love, reassuring the audience that she may be Chinese by heritage but at heart remains unmistakably American.

The result is a certain ambivalence about Crazy Rich Asians and its reception. The film embodies an effort by the Asian diaspora to assert greater power in Hollywood, but many of them are already powerful economically, something that made both the story and its commercial success possible. It is fully understandable why the Asian diaspora is pushing for a formal equality with the European and American bourgeoisie before them; why the suggestion that Asians cannot also have the good life is a type of double standard or just textbook racism. But the substance of that equality takes the form of a highly destructive social behavior: endless wealth accumulation for its own sake, embodied in finance and real estate. So while the “four tigers” epoch successfully redistributed global wealth in a relatively egalitarian manner—as did other state-driven development projects across Asia, Africa, and the Americas—one fears that the future destiny of the new Asian bourgeoisie is to follow a by-now very old playbook of dynamic growth calcifying into a myopic old guard.

by Andrew Liu, n+1 |  Read more:
Image: Crazy Rich Asians

Tuesday, September 18, 2018

An Avalanche of Japanese Shave Ice

Before Norie Uematsu became a pastry chef, she waited all year for shave-ice season at home in Japan. Now, she decides when that season begins and ends.

At Cha-an Teahouse, in the East Village of New York, Ms. Uematsu serves refreshing bowls of kakigori — the Japanese shave ice — as soon as the subway stations are hot and sticky. She turns the handle of her vintage shave-ice machine through the end of September, or until she runs out of ripe white peaches, whichever comes first.

All kakigori starts with a block of plain ice. A machine locks the ice in place and spins it against a blade, shaving off soft, sheer flakes. As the ice piles up, kakigori makers add syrups, purées and other sweet toppings. The dessert is endlessly adaptable, which is one reason so many pastry chefs in the United States are not only adding kakigori to their menus but also extending its season.

When prepared with skill, kakigori is a feat of texture — a tall structure of uniformly light, airy and almost creamy crystals that never crunch, but deliver flavor as they dissolve on the tongue.

“To get it really fluffy, you adjust the angle of the blade,” said Ms. Uematsu, turning an iron knob on her machine. “But the finer it is, the harder it is to work with.” As the ice melts, or is worn down, the machine must be adjusted to keep the shavings downy.

In August, at a cafe in Yamanashi, Japan, I ordered a bowl of kakigori made from a block of natural ice. Someone had delivered it from the Yatsugatake Mountains, a volcanic range to the north. It seemed over the top — all that labor for a piece of ice? — but it also testified to the history of kakigori.

Before the development of freezers, shave ice was an extravagant dessert reserved only for those who could pay for the luxury of ice carved from frozen lakes and mountains and transported at great cost.

As Ms. Uematsu pointed out, kakigori has come a long way from its elite roots in the Heian period (from the end of the eighth through the 12th century). “When I was a kid, every house in Japan had a cheap kakigori machine, usually with a cute character on it, like Hello Kitty,” said Ms. Uematsu, who was born in 1980 in Numazu, Shizuoka Prefecture. “And you could buy commercial syrups for flavoring them.”

But kakigori masters at cafes in Japan can still be fiercely competitive. Many shops have lines out the door, and attentive hosts to manage those lines. Atelier Sekka, a small, serene dessert shop in the Sugamo neighborhood of Tokyo, buys enormous glassy blocks of natural ice from Mount Fuji to use as the base for its pristine mounds of kakigori. On a recent weekday morning, there was an hourlong wait for a seat.

A vintage shave-ice machine sits at the center of the stylish Tokyo tearoom Higashiya Ginza, where servers layer the shavings with plums poached in honey. At Himitsudo, where you can order while standing in line on the street, cooks turn out bowls overflowing with puréed mango and other fruits.

I found my favorite kakigori of the summer at a cafe called Kuriya Kashi Kurogi, on the grounds of the University of Tokyo. The ice was beautifully shaved with an electric machine and saturated with fresh soy milk and sweetened condensed milk, layered with whipped cheese and finally crowned with a thick, sweet and salty purée of fresh edamame. Every now and then, digging around, I hit a ridge of red bean paste.

Yoojin Chung, the general manager of Stonemill Matcha in San Francisco, added kakigori to the menu in June, about a month after the cafe opened. Though elaborately built kakigori are in style, Ms. Chung remembers tasting a particularly simple version at a cafe in Kyoto, with no toppings or creams at all, just matcha syrup.

“It was this ginormous green spectacle that came on a tray, at least 12 inches tall, and it was very intense,” Ms. Chung recalled. “I was shocked how it kept its shape despite having all this syrup.”

She compared the texture of perfect kakigori to flower petals — not quite powder and not quite grain — making it distinct from other kinds of shave ice. “It’s a simple thing that’s really hard to execute,” Ms. Chung said.

by Tejal Rao, NY Times | Read more:
Image: An Rong Xu

Le Japon Artistique
via:

The Miracle of the Mundane

On a good day, all of humanity’s accomplishments feel personal: the soaring violins of the second allegretto movement of Beethoven’s Symphony no. 7, the intractable painted stare of Frida Kahlo, the enormous curving spans of the Golden Gate Bridge, the high wail of PJ Harvey’s voice on “Victory,” the last melancholy pages of Wallace Stegner’s Angle of Repose. These works remind us that we’re connected to the past and our lives have limitless potential. We were built to touch the divine.

On a bad day, all of humanity’s failures feel unbearably personal: coyotes wandering city streets due to encroaching wildfires, American citizens in Puerto Rico enduring another day without electricity or potable water in the wake of Hurricane Maria, neo-Nazis spouting hatred in American towns, world leaders testing missiles that would bring the deaths of millions of innocent people. We encounter bad news in the intimate glow of our cell phone screens, and then project our worries onto the flawed artifacts of our broken world: the for lease sign on the upper level of the strip mall, the crow picking at a hamburger wrapper in the gutter, the pink stucco walls of the McMansion flanked by enormous square hedges, the blaring TVs on the walls of the local restaurant. On bad days, each moment is haunted by a palpable but private sense of dread. We feel irrelevant at best, damned at worst. Our only hope is to numb and distract ourselves as well as we can on our long, slow march to the grave.

On a good day, humankind’s creations make us feel like we’re here for a reason. Our belief sounds like the fourth molto allegro movement of Mozart’s Symphony no. 41, Jupiter: Our hearts seem to sing along to Mozart’s climbing strings, telling us that if we’re patient, if we work hard, if we believe, if we stay focused, we will continue to feel joy, to do meaningful work, to show up for each other, to grow closer to some sacred ground. We are thrillingly alive and connected to every other living thing, in perfect, effortless accord with the natural world.

But it’s hard to sustain that feeling, even on the best of days — to keep the faith, to stay focused on what matters most—because the world continues to besiege us with messages that we are failing. You’re feeding your baby a bottle and a voice on the TV tells you that your hair should be shinier. You’re reading a book but someone on Twitter wants you to know about a hateful thing a politician said earlier this morning. You are bedraggled and inadequate and running late for something and it’s always this way. You are busy and distracted. You are not here.

It’s even worse on a bad day, when humankind’s creations fill us with the sense that we are failing as a people, as a planet, and nothing can be done about it. The chafing smooth jazz piped into the immaculate coffee joint, the fake cracks painted on the wall at the Cheesecake Factory, the smoke from fires burning thousands of acres of dry tinder, blotting out the sun — they remind us that even though our planet is in peril, we are still being teased and flattered into buying stuff that we don’t need, or coaxed into forgetting the truth about our darkening reality. As the crowd around us watches a fountain dance to Frank Sinatra’s “Somewhere Beyond the Sea” at the outdoor mall, we peek at our phones and discover the bellowed warnings of an erratic foreign leader, threatening to destroy us from thousands of miles away. Everything cheerful seems to have an ominous shadow looming behind it now. The smallest images and bits of news can feel so invasive, so frightening. They erode our belief in what the world can and should be.

As the first total solar eclipse in America in thirty-nine years reveals itself, an email lands in my inbox from ABC that says The Great American Eclipse at the top. People are tweeting and retweeting the same eclipse jokes all morning. As the day grows dimmer, I remember that Bonnie Tyler is going to sing her 1983 hit “Total Eclipse of the Heart” on an eclipse-themed cruise off the coast of Florida soon.

Even natural wonders aren’t what they used to be, because nothing can be experienced without commentary. In the 1950s, we worried about how TV would affect our culture. Now our entire lives are a terrible talk show that we can’t turn off. It often feels like we’re struggling to find ourselves and each other in a crowded, noisy room. We are plagued, around the clock, by the shouting and confusion and fake intimacy of the global community, mid–nervous breakdown.

Sometimes it feels like our shared breakdown is making us less generous and less focused. On a bad day, the world seems to be filled with bad books and bad buildings and bad songs and bad choices. Worthwhile creations and ego-driven, sloppy works are treated to the same hype and praise; soon it starts to feel as if everything we encounter was designed merely to make some carefully branded human a fortune. Why aren’t we reaching for more than this? Isn’t art supposed to inspire or provoke or make people feel emotions that they don’t necessarily want to feel? Can’t the moon block out the sun without a 1980s pop accompaniment? So much of what is created today seems engineered to numb or distract us, keeping us dependent on empty fixes indefinitely.

Such creations feel less like an attempt to capture the divine than a precocious student’s term paper. If any generous spirit shines through, it’s manufactured in the hopes of a signal boost, so that some leisure class end point can be achieved. Our world is glutted with products that exist to help someone seize control of their own life while the rest of the globe falls to ruin. Work (and guidance, and leadership) that comes from such a greedy, uncertain place has more in common with that fountain at the outdoor mall, playing the same songs over and over, every note an imitation of a note played years before.

But human beings are not stupid. We can detect muddled and self-serving intentions in the artifacts we encounter. Even so, such works slowly infect us with their lopsided values. Eventually, we can’t help but imagine that this is the only way to proceed: by peddling your own wares at the expense of the wider world. Can’t we do better than this, reach for more, insist on more? Why does our culture make us feel crazy for trying?

by Heather Havrilesky, Longreads |  Read more:
Image: What If This Were Enough?
[ed. What a remarkable essay. The antithesis is here: Instagram is Supposed to Be Friendly. So Why is it Making People so Miserable?]

Burning Man: The First Time

A Premature Attempt at the 21st Century Canon

A panel of critics tells us what belongs on a list of the 100 most important books of the 2000s … so far.

Why Now?

Okay, assessing a century’s literary legacy after only 18 and a half years is kind of a bizarre thing to do.

Actually, constructing a canon of any kind is a little weird at the moment, when so much of how we measure cultural value is in flux. Born of the ancient battle over which stories belonged in the “canon” of the Bible, the modern literary canon took root in universities and became defined as the static product of consensus — a set of leather-bound volumes you could shoot into space to make a good first impression with the aliens. Its supposed permanence became the subject of more recent battles, back in the 20th century, between those who defended it as the foundation of Western civilization and those who attacked it as exclusive or even racist.

But what if you could start a canon from scratch? We thought it might be fun to speculate (very prematurely) on what a canon of the 21st century might look like right now. A couple of months ago, we reached out to dozens of critics and authors — well-established voices (Michiko Kakutani, Luc Sante), more radical thinkers (Eileen Myles), younger reviewers for outlets like n+1, and some of our best-read contributors, too. We asked each of them to name several books that belong among the most important 100 works of fiction, memoir, poetry, and essays since 2000 and tallied the results. The purpose was not to build a fixed library but to take a blurry selfie of a cultural moment.

Any project like this is arbitrary, and ours is no exception. But the time frame is not quite as random as it may seem. The aughts and teens represent a fairly coherent cultural period, stretching from the eerie decadence of pre-9/11 America to the presidency of Donald Trump. This mini-era packed in the political, social, and cultural shifts of the average century, while following the arc of an epic narrative (perhaps a tragedy, though we pray for a happier sequel). Jonathan Franzen’s The Corrections, one of our panel’s favorite books, came out ten days before the World Trade Center fell; subsequent novels reflected that cataclysm’s destabilizing effects, the waves of hope and despair that accompanied wars, economic collapse, permanent-seeming victories for the once excluded, and the vicious backlash under which we currently shudder. They also reflected the fragmentation of culture brought about by social media. The novels of the Trump era await their shot at the canon of the future; because of the time it takes to write a book, we haven’t really seen them yet.

You never know exactly what you’ll discover when sending out a survey like this, the results of which owe something to chance and a lot to personal predilections. But given the sheer volume of stuff published each year, it is remarkable that a survey like this would yield any kind of consensus—which this one did. Almost 40 books got more than one endorsement, and 13 had between three and seven apiece. We have separately listed the single most popular book; the dozen “classics” with several votes; the “high canon” of 26 books with two votes each; and the rest of the still-excellent but somewhat more contingent canon-in-utero. (To better reflect that contingency, we’ve included a handful of critics’ “dissents,” arguing for alternate books by the canonized authors.)

Unlike the old canons, ours is roughly half-female, less diverse than it should be but generally preoccupied with difference, and so fully saturated with what we once called “genre fiction” that we hardly even think of Cormac McCarthy’s post-apocalyptic The Road, Colson Whitehead’s zombie comedy Zone One, Helen Oyeyemi’s subversive fairy tales, or even the Harry Potter novels as deserving any other designation than “literature.” And a whole lot of them are, predictably, about instability, the hallmark of the era after the “end of history” that we call now.

At least one distinctive new style has dominated over the past decade. Call it autofiction if you like, but it’s really a collapsing of categories. (Perhaps not coincidentally, such lumping is better suited to “People Who Liked” algorithms than brick-and-mortar shelving systems.) This new style encompasses Elena Ferrante’s Neapolitan novels; Sheila Heti’s self-questing How Should a Person Be?; Karl Ove Knausgaard’s just-completed 3,600-page experiment in radical mundanity; the essay-poems of Claudia Rankine on race and the collage-like reflections of Maggie Nelson on gender. It’s not really a genre at all. It’s a way of examining the self and letting the world in all at once. Whether it changes the world is, as always with books, not really the point. It helps us see more clearly.

Our dozen “classics” do represent some consensus; their genius seems settled-on. Among them are Kazuo Ishiguro’s scary portrait of replicant loneliness in Never Let Me Go; Roberto Bolaño’s epic and powerfully confrontational 2666; Joan Didion’s stark self-dissection of grief in The Year of Magical Thinking. They aren’t too surprising, because they are (arguably as always, but still) great.

And then there’s The Last Samurai, Helen DeWitt’s debut: published at the start of the century, relegated to obscurity (and overshadowed by a bad and unrelated Tom Cruise movie of the same name), and now celebrated by more members of our panel than any other book. That’s still only seven out of 31, which gives you a sense of just how fragile this consensus is. Better not launch this canon into space just yet.

by Boris Kachka/Editors, Vulture | Read more:
Image: Tim McDonagh

Monday, September 17, 2018

Wanna Get Really High?

Dabbing, consuming a cannabis concentrate using a vaporizing device, has moved into the mainstream as companies produce high-THC concentrates

Concentrates, a rapidly growing segment of the legal marijuana market, reduce the plant to its chemical essence. The point is to get as high as possible. And it works.

Manufacturing concentrates involves using solvents like alcohol, carbon dioxide and other chemicals to strip away the plant’s leaves and then processing the potent remains. The final products can resemble cookie crumbles, wax and translucent cola spills.

A standard method of concentrate consumption, known as dabbing, uses vaporizing devices called rigs that resemble bongs, but instead of a bowl to hold the weed, there’s a nail made from titanium, quartz or a similarly sturdy material. The dabber heats the nail with a blowtorch and then uses a metal tool to vaporize a dab of concentrate on the nail.

Common sense suggests a dabbing habit could be more harmful than an ordinary marijuana habit, but the research is limited. Visually, the process is sometimes compared to smoking strongly stigmatized drugs like crack and crystal methamphetamine.

For years, dabbing has been considered an outcast subculture within the misfit world of cannabis. With so many companies angling to associate themselves with moderate use for functional adults, many want nothing to do with dabbing.

But as cannabis consumption has moved into the mainstream, dabbing has followed. Today a number of portable devices aim to deliver the intense high of dabbing concentrates in a more user-friendly way. At cannabis industry parties, there’s often a “dab bar” where attendants fire up the rigs, and wipe off the mouthpieces after each use. Machines called e-nails allow users to set a rig’s exact temperature to maximize vapor and flavor. On YouTube, there’s a lively competition among brain surgeons and rocket scientists to see who can inhale the heftiest dab.

Strong west coast weed can approach 30% THC. Concentrates, which dispensaries sell by the gram, range between 60% and 80%, but they can be even stronger. One form called crystalline is reportedly 99% THC. (The oil in increasingly ubiquitous vape pens can also be 70% or higher THC but it’s vaporized in smaller doses.)

Concentrates aren’t a new concept; hash or hashish, the compacted resin of the cannabis plant, has been used in central and south Asia for more than 1,000 years. But legalization in North America has laid the groundwork for innovation in the craft. As with most things cannabis, concentrate fanatics can argue endlessly about their preferences – solvent-free, whole-plant, resin, live resin, shatter – and the uninitiated struggle to discern much difference in the effect.

by Alex Halperin, The Guardian |  Read more:
Image: George Wylesol
[ed. This at the local pot shop: The Most Exotic Hash on the Market.]

What Dying Traditions Should Be Preserved?


Castles, postcards, drive-in theaters, artistic matchboxes, home economics classes, sitting on porches, neon signs, two martini lunches, manual transmissions, cursive writing, handwritten letters, handkerchiefs, canning and preserving, Viking funerals, manners/politeness, taxidermy, knitting, listening to full albums, bridge (card game), stamp/coin collecting, whistling, corn cob holders, rolling joints, learning new languages, shoe repair, friendship bracelets, civilized debate, and more...

And the apparent winner (judging by the number of posts): a nice sit-down dinner with good conversation, home-cooked food, no hats, and no distractions (phones, tv, games, etc.).

[ed. From the Reddit post: What is a dying tradition you believe should be preserved?]
Image: via