Thursday, July 25, 2019

Living Intimately With Thoughts of Death

Since my cancer diagnosis, I have lived intimately with thoughts of death. Cancer patients of all ages and stages, as well as people with other ruinous conditions, often experience “a double frame of mind,” as the polemicist Christopher Hitchens once put it. Laboring to survive in the present, we simultaneously imagine our future demise. Of course, feelings and beliefs about mortality range widely. But a number of thinkers have set out to help those who suspect that introspection about this state of mind may be the most important work we can undertake.

If you want to evaluate your own perspective on death, try filling in the Death Attitude Profile — Revised questionnaire developed by the psychologists Paul T.P. Wong, Gary T. Reker and Gina Gesser. A series of 32 propositions, the survey measures death anxiety: worries about self-loss, missed opportunities, stolen moments, the prospect of your or your survivors’ suffering, the unknown. It also gauges death acceptance: satisfaction at having led a good life, at acknowledging a natural ending, at escaping physical pain or gaining a desirable afterlife or merging with the cosmos.

When my husband and I compared our responses to this test, what struck me was how complicated we all are. The prospect of my own death arouses more fright in me than his does in him, but he is more convinced than I that death is a grim experience. What, then, do the psychologists really tell us? After taking the quiz, the palliative care nurse Sallie Tisdale found her score was “all over the place, internally contradictory.”

To encourage people to ponder their own extinction, Ms. Tisdale, the author of “Advice for Future Corpses (and Those Who Love Them),” recommends the Japanese film “After Life.” In a posthumous state, the dead in this movie pick a single memory in which to live forever. With delicacy, its director, Hirokazu Kore-eda, implicitly asks, what memory would you choose?

This is a difficult assignment for me. Should I choose a joyous holiday get-together with the extended family? Well, those events often disintegrated into mayhem. Maybe I should select the occasion of a professional success. Unfortunately, those moments were often fraught with tension. Besides, do I believe in an afterlife?

Yet watching the movie and considering its premise — or reading Ms. Tisdale’s book or taking the questionnaire — equips the mind with the doubled frames through which many imperiled patients view the world. Not unlike the double consciousness W.E.B. Du Bois ascribed to African-Americans, the double consciousness that I experience can devolve into debilitating self-division. However, it can also evolve into an intoxicating clarification of the human condition.

The drawbacks of living in the present with corrosive dread about a diminished or canceled future seem abundantly clear. When the substance of the everyday is drained of reality, leached by visions of impending debilitation and disappearance, double consciousness leads to depression.

As fatigue or nausea takes over, I am torn asunder by the morbid conviction that this might very well be the last time I travel, that soon I won’t have the strength to prepare the meal that I am cooking or rise from the bed I am making. Will I survive long enough to finish the next project? Worse, why start the next project, if it cannot be completed? Riven by contrary impulses, I want to live today, but trepidations about tomorrow render the present flimsy or vacuous. Intimations of mortality rob us of confidence in our autonomy before the dying process finishes that job.

by Susan Gubar, NY Times |  Read more:
Image: Jaime Jacob
[ed. I've been trying to remember the name of that Japanese film: After Life. I imagine for some it's not so much the fear of death but the process itself. Like Woody Allen said, I just don't want to be there when it happens. Strange isn't it? All our lives we're taught to be in control of our physical and emotional impulses, but when we want to avoid end-of-life suffering, control is taken away (another issue).]

Wednesday, July 24, 2019

Ultra-Low Mortgage Rates No Relief for Home Sales

The relentless decline in home sales is starting to baffle a real estate industry that had expected plunging mortgage rates to fire up sales: Across the US, sales of “existing homes” (previously owned single-family houses, townhouses, condos, and co-ops) in June dropped 2.2% from June last year, to a seasonally adjusted annual rate of 5.27 million homes, according to the National Association of Realtors. It was the 16th month in a row of year-over-year declines (data via YCharts):


“Home sales are running at a pace similar to 2015 levels – even with exceptionally low mortgage rates, a record number of jobs and a record high net worth in the country,” lamented NAR’s report.

And the plunge in mortgage rates from the November high has been spectacular. The Fed hiked rates one more time in December and so far has not cut them. But yields across the curve have been dropping in anticipation of a veritable Niagara Falls of rate cuts and whatnot.

In June, the Freddie Mac average commitment rate for a 30-year, conventional, fixed-rate mortgage fell to 3.80%. This is over a full percentage point lower than the average rate in November of 4.87%:


Sales of existing homes in June, at a 5.27 million seasonally adjusted annual rate, are now back in the range where they’d been in 2015. The chart below shows how home sales topped out in late 2017 and early 2018 at a pace above 5.5 million, as mortgage rates were already rising. When mortgage rates began ascending at a steeper slope, sales fell sharply, as you would expect, hitting the low point in December and January for deals signed in November and December.

But given the plunge in mortgage rates since then, expectations were that sales would resurge. While sales have ticked up from those lows, the move was, for the industry, confusingly feeble (data via YCharts):


The industry’s astonishment at falling home sales despite ultra-low mortgage rates comes through in the report’s comment, as it grapples with potential answers:

Either a strong pent-up demand will show in the upcoming months, or there is a lack of confidence that is keeping buyers from this major expenditure. It’s too soon to know how much of a pullback is related to the reduction in the homeowner tax incentive.

By home category: Sales of single-family houses in June fell 1.7% year-over-year to a rate of 4.76 million, and sales of condos and co-ops fell 6.5% year-over-year to a rate of 580,000.

Sales by region in June show the steepest year-over-year declines in the West and the Northeast:

Northeast: -4.2%, to an annual rate of 680,000
Midwest: -1.6%, to an annual rate of 1.25 million
South: -0.4%, to an annual rate of 2.25 million
West: -5.2%, to an annual rate of 1.09 million.

Inventory for sale in June was about flat compared to June last year. Given slower sales, supply at the current rate of sales ticked up to 4.3 months (from 4.0 months a year ago). This is plenty of supply. But it’s the wrong supply.

After years of price increases, home prices have moved up the ladder across the board: the lower end is now priced where the mid-range used to be a few years ago, there is no more “low end” in many markets, and the new low end has moved out of range for many buyers. High prices kill demand. And low mortgage rates, after years of low mortgage rates, are having only a limited effect on sales volume.

But the median price of existing homes sold in June across the US – median means half sold for more and half sold for less – rose 4.3% from a year ago to a record $285,700.

So here is the visual definition of a “demand killer”: Since June 2012, so in seven years, the median price has surged 52%. And mortgage rates in 2012 were in about the same range as now. No one in the industry should be surprised that sales are slow:
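To put rough numbers on that, here is a quick back-of-the-envelope sketch (in Python) of what the quoted rates mean for the monthly payment on a median-priced home, using the standard fixed-rate amortization formula. The 20% down payment is an assumption for illustration, not a figure from the report.

def monthly_payment(principal, annual_rate, years=30):
    # Standard fixed-rate amortization: M = P * r(1+r)^n / ((1+r)^n - 1)
    r = annual_rate / 12   # monthly rate
    n = years * 12         # number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

median_price = 285_700     # June median from the NAR report
loan = median_price * 0.80 # assumes a 20% down payment (illustrative)

for label, rate in [("Nov 2018 @ 4.87%", 0.0487), ("June 2019 @ 3.80%", 0.0380)]:
    print(f"{label}: ${monthly_payment(loan, rate):,.0f}/month")

On those assumptions the payment falls from roughly $1,210 to about $1,065 a month, a saving of around 12%. That is the kind of relief that was expected to revive sales, and it is easily swamped by a 52% run-up in prices.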

by Wolf Richter, Wolf Street |  Read more:
Images: Wolfstreet

The Case of Al Franken

Last month, in Minneapolis, I climbed the stairs of a row house to find Al Franken, Minnesota’s disgraced former senator, wandering around in jeans and stocking feet. It was a sunny day, but the shades were mostly drawn. Takeout containers of hummus and carrot sticks were set out on the kitchen table. His wife, Franni Bryson, was stuck in their apartment in Washington, D.C., with a cold, and he had evidently done the best he could to be hospitable. But the place felt like the kind of man cave where someone hides out from the world, which is more or less what Franken has been doing since he resigned, in December, 2017, amid accusations of sexual impropriety.

There had been occasional sightings of him: in Washington, people mentioned having glimpsed him riding the Metro or browsing alone in a bookstore; there was gossip that he had fallen into a depression, and had been seen in a fetal position on a friend’s couch. But Franken had experienced one of the most abrupt downfalls in recent political memory. He had been perhaps the most recognizable figure in the Senate, in part because he’d entered it as a celebrity: a best-selling author and a former writer and performer on “Saturday Night Live.” Now Franken was just one more face in a gallery of previously powerful men who had been brought down by the #MeToo movement, and whom no one wanted to hear from again. America had ghosted him.

Only two years ago, Franken was being talked up as a possible challenger to President Donald Trump in 2020. In Senate hearings, Franken had proved himself to be one of the most effective critics of the Trump Administration. His tough questioning of Jeff Sessions, Trump’s nominee for Attorney General, had led Sessions to recuse himself from the investigation into Russian influence in the 2016 election, and prompted the appointment of Robert Mueller as special counsel.

As it turns out, Franken’s only role in the 2020 Presidential campaign has been as a figure of controversy. On June 4th, Pete Buttigieg was widely criticized on social media for saying that he would not have pressured Franken to resign—as had virtually all his Democratic rivals who were then in the Senate—without first learning more about the alleged incidents. At the same time, the Presidential candidacy of Senator Kirsten Gillibrand has been plagued by questions about her role as the first of three dozen Democratic senators to demand Franken’s resignation. Gillibrand has cast herself as a feminist champion of “zero tolerance” toward sexual impropriety, but Democratic donors sympathetic to Franken have stunted her fund-raising and, Gillibrand says, tried to “intimidate” her “into silence.”

Franken’s fall was stunningly swift: he resigned only three weeks after Leeann Tweeden, a conservative talk-radio host, accused him of having forced an unwanted kiss on her during a 2006 U.S.O. tour. Seven more women followed with accusations against Franken; all of them centered on inappropriate touches or kisses. Half the accusers’ names have still not become public. Although both Franken and Tweeden called for an independent investigation into her charges, none took place. This reticence reflects the cultural moment: in an era when women’s accusations of sexual discrimination and harassment are finally being taken seriously, after years of belittlement and dismissal, some see it as offensive to subject accusers to scrutiny. “Believe Women” has become a credo of the #MeToo movement.

At his house, Franken said he understood that, in such an atmosphere, the public might not be eager to hear his grievances. Holding his head in his hands, he said, “I don’t think people who have been sexually assaulted, and those kinds of things, want to hear from people who have been #MeToo’d that they’re victims.” Yet, he added, being on the losing side of the #MeToo movement, which he fervently supports, has led him to spend time thinking about such matters as due process, proportionality of punishment, and the consequences of Internet-fuelled outrage. He told me that his therapist had likened his experience to “what happens when primates are shunned and humiliated by the rest of the other primates.” Their reaction, Franken said, with a mirthless laugh, “is ‘I’m going to die alone in the jungle.’”

Now sixty-eight, Franken is short and sturdily built, with bristly gray hair, tortoiseshell glasses, and a wide, froglike mouth from which he tends to talk out of one corner. Despite his current isolation, Franken is recognized nearly everywhere he goes, and he often gets stopped on the street. “I can’t go anywhere without people reminding me of this, usually with some version of ‘You shouldn’t have resigned,’ ” Franken said. He appreciates the support, but such comments torment him about his departure from the Senate. He tends to respond curtly, “Yup.”

When I asked him if he truly regretted his decision to resign, he said, “Oh, yeah. Absolutely.” He wishes that he had appeared before a Senate Ethics Committee hearing, as he had requested, allowing him to marshal facts that countered the narrative aired in the press. It is extremely rare for a senator to resign under pressure. No senator has been expelled since the Civil War, and in modern times only three have resigned under the threat of expulsion: Harrison Williams, in 1982, Bob Packwood, in 1995, and John Ensign, in 2011. Williams resigned after he was convicted of bribery and conspiracy; Packwood faced numerous sexual-assault accusations; Ensign was accused of making illegal payoffs to hide an affair.

A remarkable number of Franken’s Senate colleagues have regrets about their own roles in his fall. Seven current and former U.S. senators who demanded Franken’s resignation in 2017 told me that they’d been wrong to do so. Such admissions are unusual in an institution whose members rarely concede mistakes. Patrick Leahy, the veteran Democrat from Vermont, said that his decision to seek Franken’s resignation without first getting all the facts was “one of the biggest mistakes I’ve made” in forty-five years in the Senate. Heidi Heitkamp, the former senator from North Dakota, told me, “If there’s one decision I’ve made that I would take back, it’s the decision to call for his resignation. It was made in the heat of the moment, without concern for exactly what this was.” Tammy Duckworth, the junior Democratic senator from Illinois, told me that the Senate Ethics Committee “should have been allowed to move forward.” She said it was important to acknowledge the trauma that Franken’s accusers had gone through, but added, “We needed more facts. That due process didn’t happen is not good for our democracy.” Angus King, the Independent senator from Maine, said that he’d “regretted it ever since” he joined the call for Franken’s resignation. “There’s no excuse for sexual assault,” he said. “But Al deserved more of a process. I don’t denigrate the allegations, but this was the political equivalent of capital punishment.” Senator Jeff Merkley, of Oregon, told me, “This was a rush to judgment that didn’t allow any of us to fully explore what this was about. I took the judgment of my peers rather than independently examining the circumstances. In my heart, I’ve not felt right about it.” Bill Nelson, the former Florida senator, said, “I realized almost right away I’d made a mistake. I felt terrible. I should have stood up for due process to render what it’s supposed to—the truth.” Tom Udall, the senior Democratic senator from New Mexico, said, “I made a mistake. I started having second thoughts shortly after he stepped down. He had the right to be heard by an independent investigative body. I’ve heard from people around my state, and around the country, saying that they think he got railroaded. It doesn’t seem fair. I’m a lawyer. I really believe in due process.”

Former Senate Minority Leader Harry Reid, who watched the drama unfold from retirement, told me, “It’s terrible what happened to him. It was unfair. It took the legs out from under him. He was a very fine senator.” Many voters have also protested Franken’s decision. A Change.org petition urging Franken to retract his resignation received more than seventy-five thousand signatures. It declared, “There’s a difference between abuse and a mistake.”

by Jane Mayer, New Yorker |  Read more:
Image: Geordie Wood
[ed. I have zero patience for people who use the term "who could have known?" after making stupid, irresponsible decisions, despite ample arguments against them at the time (Iraq?). See also: The reason there’s no #MeToo for domestic violence (Penelope Trunk).]

A Decade of Low Interest Rates Is Changing Everything

It’s hard to wrap your head around just how low U.S. interest and bond yields are—still are—a decade after the Great Recession ended. Year after year, prognosticators said that rates were bound to go back up soon: Just be ready. That exercise has proved to be like waiting for Godot.

In 2018, Jamie Dimon, chief executive officer of JPMorgan Chase & Co., put Americans on alert to the likelihood of higher interest rates. He said the global benchmark for longer-term rates, the yield on a 10-year Treasury bond, could go above 5%. Right now it’s just a hair above 2%. Thirty-year mortgage rates are a fraction of long-run averages, and companies too are paying very little to borrow. All that cheap money has been helping the economy along. On the other side of the ledger, bank depositors are getting paid only a fraction of 1% on their savings.

The longevity of low rates has upended long-standing assumptions about money and reshaped a generation of investors, traders, savers, and policymakers. The Federal Reserve has tried to push the U.S. into a higher-rate regime, raising rates nine times since 2015, when the key short-term rate was near zero. But now the central bank appears ready to reverse course and start cutting again when it meets at the end of July. “This is the new abnormal,” says David Kelly, chief global strategist at JPMorgan Asset Management, which oversees $1.8 trillion. “Normally when you are in this phase of an expansion, you have a rising inflation problem, a Federal Reserve overtightening to slow the economy, and businesses that can’t afford to borrow. None of that is true right now.”

Investors are betting that a quarter-percentage-point rate cut is all but certain, according to prices in the futures market. Fed Chair Jerome Powell reinforced those views with remarks to Congress on July 10 and 11. He cited rising global risks, low inflation, and weakening business investment and manufacturing. Depressed U.S. rates come as other central banks, including the European Central Bank, have turned more dovish—even with their rates already set below zero. (...)

For banks, the squeeze in long-term rates isn’t ideal. That’s because they tend to fund long-term investments with short-term debt, so they prosper when long-run rates are significantly higher than short ones. In the U.S., banks have still been able to profit, with the top five firms cracking $30 billion in quarterly earnings for the first time. But some big commercial banks have warned that lower interest rates are weighing on their outlooks for revenues from lending.

Individuals have had to get used to earning paltry rates. The national average rate on savings accounts is 0.10%, little changed from four years ago and down from 0.30% in 2009, according to data from Bankrate.com. In 2000, well before the financial crisis, the rate was 1.73%. “We never got to the would-be promised land with respect to higher rates,” says Mark Hamrick, senior economic analyst at Bankrate.com. “This has been the difference for savers between having more money and not.”

The problem is the same for institutions that manage savings on behalf of others. Pension funds, overseeing trillions in retirees’ future cash, have been ratcheting down return expectations. The 30-year Treasury bond, a favored debt security, yields about 2.5%—compared with an average 6.5% since the 1970s. Even a record rise in stock prices hasn’t solved the low-return problem for pension funds, because many of them cut their allocations to equities after the financial crisis. Ben Meng, chief investment officer of the California Public Employees’ Retirement System, said in June that the expected return for his pension portfolio over the next 10 years would be 6.1%, down from a previous target of 7%.

Where low rates really bite isn’t in current returns but in the future gains investors can reasonably expect. Interest rates set a kind of baseline for the return on all assets. As they fall, bond values rise and stocks often do, too. But once rates have settled at or near rock bottom, there’s less room for that kind of price appreciation.
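The bond side of that relationship is simple present-value arithmetic: when the market yield falls, the fixed coupons on an existing bond are worth more today. Here is a minimal sketch with made-up numbers, a hypothetical 10-year bond with a 3% annual coupon, not any security mentioned above.

def bond_price(face, coupon_rate, market_yield, years):
    # Present value of the annual coupons plus the face value, discounted at the market yield
    coupons = sum(face * coupon_rate / (1 + market_yield) ** t for t in range(1, years + 1))
    principal = face / (1 + market_yield) ** years
    return coupons + principal

print(bond_price(1000, 0.03, 0.03, 10))  # about 1,000: priced at par when the yield equals the coupon
print(bond_price(1000, 0.03, 0.02, 10))  # about 1,090: the price rises as the market yield falls to 2%

A one-percentage-point drop in yield lifts the price roughly 9% in this example; once yields are already near rock bottom, there is far less room for another move of that size.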

by Liz McCormick, Bloomberg |  Read more:
Image: Daphne Geisler

Tuesday, July 23, 2019


photo: markk

From #TelegramGate to #RickyLeaks: Puerto Rico is on 🔥!

Two weeks ago, Puerto Rico's Center for Investigative Journalism published one of the most consequential investigative stories in the island's history: a trove of leaked private Telegram chats between Governor Ricardo Rossello and his most senior advisors and officials, in which the group used crude, homophobic and misogynist labels to mock and degrade opposition figures, Puerto Rican celebrities, and the people of Puerto Rico as they struggled with the aftermath of hurricanes Maria and Irma, left to swelter and die by a local and national government that had abandoned them.

Since then, Puerto Rico has been roiled by mass demonstrations, initially calling for Rossello's resignation, but now for deep, structural reforms to an island whose long history has been one of colonial occupation, oppression, and looting.

The protests have been led by feminists and queer activists, supported by the likes of Ricky Martin, a beloved, gay Puerto Rican pop star who was targeted for homophobic slurs in the leaks. As they've gained strength, the protests have drawn out more and more people from all walks of life, with the vanguard still made up of political radicals who will not accept cosmetic compromises.

The Puerto Rican government has responded with riot squads and violent suppression, in a spectacular miscalculation that has only brought out more people. To make things worse, the police appear to have manufactured a casus belli by setting off fireworks behind their lines, a fraud so transparent that it has robbed them of any credibility they had left.

There's no sign that the protests are losing steam. Instead, they're gaining momentum, thanks in part to a second blockbuster from the Center for Investigative Journalism, detailing a high-stakes web of corruption with millions in looted public funds and bribes that goes straight to the top.

A small, densely populated island with a shameful colonial past up in arms demanding self-rule and an end to autocracy and corruption? If it's not Hong Kong, it must be Puerto Rico.

by Cory Doctorow, Boing Boing |  Read more:
Image: Joe Raedle/Getty Images via
[ed. Puerto Rico's been treated like the U.S.'s bastard stepchild for decades (some people still without electricity, nine months after Hurricane Maria).]

The Overprotected American Child

Why not let them walk to school alone? Parents and communities are figuring out ways to give their children more independence—and it just may help them to become less anxious, more self-reliant adults.

A few weeks ago I left my 9-year-old daughter home alone for the first time. It did not go as planned.

That’s because I had no plan. My daughter was sick. My husband was out of town. And I needed to head to the drugstore—a five-minute walk away—to get some medicine for her. So I made sure my daughter knew where to find our rarely used landline phone, quizzed her on my cellphone number and instructed her not to open the front door for anyone. Then I left. Twenty minutes later I was back home. Both of us were a bit rattled by the experience—her first time completely alone, with no supervising adult!—but we were fine.

I had been postponing this moment of independence for my daughter for months, held back by worry over the potential catastrophes. But I know that this way of thinking is part of a larger social problem. Many have lamented the fact that children have less independence and autonomy today than they did a few generations ago. Fewer children are walking to school on their own, riding their bicycles around neighborhoods or going on errands for their parents. There have been several high-profile cases of parents actually being charged with neglect for allowing their children to walk or play unsupervised. We’re now seeing a backlash to all this pressure for parental oversight: Earlier this year, the state of Utah enacted a new “free-range” parenting law that redefined neglect to specifically exclude things like letting a child play in a park or walk to a nearby store alone.

Overzealous parenting can do real harm. Psychologists and educators see it as one factor fueling a surge in the number of children and young adults being diagnosed with anxiety disorders. According to a study published this year in the Journal of Developmental & Behavioral Pediatrics, the number of children aged 6 to 17 whose parents said they were currently diagnosed with anxiety grew from 3.5% in 2007 to 4.1% in 2012. And in a 2017 survey of more than 31,000 college students by the American College Health Association, 21.6% reported that they had been diagnosed with or treated for anxiety problems during the previous year. That is up from 10.4% in a 2008 survey.

A big 2007 study, published in Clinical Psychology Review, surveyed the scientific literature on how much parenting influences the development of anxiety in kids. The parenting behavior that had the strongest impact of any kind was “granting autonomy”—defined as “parental encouragement of children’s opinions and choices, acknowledgment of children’s independent perspectives on issues, and solicitation of children’s input on decisions and solutions of problems.” More autonomy was associated with less childhood anxiety. (Genes play an even bigger role, however, in individual differences in anxiety.)

For children who are already anxious, overprotecting them can make it worse. “It reinforces to the child that there is something they should be scared of and the world is a dangerous place and ‘I can’t do that for myself,’ ” says Rebecca Rialon Berry, a clinical psychologist at the NYU Langone Child Study Center.

A lack of autonomy and independence can also stymie the development of self-confidence and may cause children to remain dependent on parents and others to make decisions for them when they become adults, says Jack Levine, a developmental pediatrician in New York. And because children naturally want more independence as they grow, thwarting that desire can cause them to become angry and act out, notes Brad Sachs, a family psychologist in Columbia, Md.

Like a lot of Generation Xers, I have my own memories of a carefree childhood riding bicycles and playing tag with other neighborhood children, my parents nowhere in sight. They seemed to trust their instincts. But today, how do you go with your gut when you’re bombarded by hyperventilating social media posts, shrill parenting advice books and a neurotic cultural tide? And what about disapproving neighbors—and spouses? My own husband wasn’t thrilled when I told him that I’d left our daughter home alone. “She could have hit her head. Or choked,” he said. (To be fair to him, both things have actually happened to her—and this is when we’ve been around.) (...)

Michael J. Hynes, superintendent of the Patchogue-Medford Schools on Long Island in New York, launched a Let Grow project last fall because he was seeing “kids more and more bubble wrapped as the years go on,” he says. “I’ve noticed they are averse to risk-taking.”

The children in five of the district’s seven elementary schools now have one day when their only homework is to do something new. (Some classes also write about the experience.) Project suggestions, to do alone or with a friend, include walking the dog, exploring the woods and “playing night tag.” Let Grow also helps schools to launch Play Clubs in which children can play freely in the playground or gym before or after school. The organization suggests that schools enlist one adult to act as a “lifeguard” but otherwise let youngsters alone to figure out what and how to play—and to solve their own problems.

After nearly a year of the effort, Mr. Hynes says that he’s seen positive results in the district. “I can’t say test scores went up, but I believe the kids are better behaved and more self-confident. Students are taking risks in the classroom. Normally shy kids are now raising their hands.” (...)

Anne Marie Albano, director of the Columbia University Clinic for Anxiety and Related Disorders in New York, reminds parents that the ultimate goal is to have their children be self-sufficient by the time they leave home for college or the workplace. She and her colleagues have come up with a list of milestones that adolescents should achieve before high-school graduation, including being able to advocate for themselves with teachers and other authority figures, seeing a doctor without a parent and waking themselves up in the morning on their own. “We have parents who call their college student at Harvard or Michigan and wake them up every morning,” she says. You do not want to be that parent.

Even when children are thrilled to gain some independence, parents often have to learn to cope with their own anxiety. Heidi Thompson lives with her husband and two children in Calais, Vt., a town where children often run around unsupervised. Still, Ms. Thompson, a psychotherapist, was nervous when her daughter wanted to participate in a ritual for neighborhood kids the summer before seventh grade: camping overnight without adults on an island in the nearby lake. Ms. Thompson reluctantly gave her OK. “I was up all night,” she said. In the morning, however, her daughter “came home so excited. We want them to feel that the world overall is a safe place,” says Ms. Thompson.

Of course, when children try something on their own, it doesn’t always go smoothly. They may take the wrong bus or choose not to study for a test—and then bomb it.

Such outcomes point to the one autonomy milestone that parents find particularly difficult, says Joseph F. Hagan Jr., clinical professor in pediatrics at the University of Vermont and the co-editor of the American Academy of Pediatrics’ Bright Futures guidelines for health professionals. “Part of independence is to make your own decisions,” he says—including “the right to make a wrong decision.”

by Andrea Petersen, WSJ | Read more:
Image: David Arky

Monday, July 22, 2019

Steely Dan


Charlie Freak had but one thing to call his own. Three weight ounce pure golden ring no precious stone. Five nights without a bite. No place to lay his head. And if nobody takes him in he'll soon be dead. On the street he spied my face I heard him hail. In our plot of frozen space he told his tale. Poor man, he showed his hand. So righteous was his need. And me so wise. I bought his prize. For chicken feed. Newfound cash soon begs to smash a state of mind. Close inspection fast revealed his favorite kind. Poor kid, he overdid. Embraced the spreading haze. And while he sighed his body died. In fifteen ways. When I heard I grabbed a cab to where he lay. 'Round his arm the plastic tag read D.O.A. Yes Jack, I gave it back. The ring I could not own. Now come my friend I'll take your hand. And lead you home.

[ed. See also: Parker's Band, Pretzel Logic (Steely Dan).]

American Green

Although there are plenty of irrational aspects to life in modern America, few rival the odd fixation on lawns. Fertilizing, mowing, watering — these are all-American activities that, on their face, seem reasonable enough. But to spend hundreds of hours mowing your way to a designer lawn is to flirt, most would agree, with a bizarre form of fanaticism. Likewise, planting a species of grass that will make your property look like a putting green seems a bit excessive — yet not nearly as self-indulgent as the Hamptons resident who put in a nine-hole course with three lakes, despite being a member of an exclusive golf club located across the street. And what should we make of the Houston furniture salesman who, upon learning that the city was planning to ban morning mowing — to fight a smog problem comparable to Los Angeles’s — vowed to show up, bright and early, armed and ready to cut. “I’ll pack a sidearm,” he said. “What are they going to do, have the lawn police come and arrest me?”

Surprisingly, the lawn is one of America’s leading “crops,” amounting to at least twice the acreage planted in cotton. In 2007, it was estimated that there were roughly twenty-five to forty million acres of turf in the United States. Put all that grass together in your mind and you have an area, at a minimum, about the size of the state of Kentucky, though perhaps as large as Florida. Included in this total were fifty-eight million home lawns plus over sixteen thousand golf-course facilities (with one or more courses each) and roughly seven hundred thousand athletic fields. Numbers like these add up to a major cultural preoccupation.

Not only is there already a lot of turf, but the amount appears to be growing significantly. A detailed study found that between 1982 and 1997, as suburban sprawl gobbled up the nation, the lawn colonized over 382,850 acres of land per year. Even the amount of land eligible for grass has increased, as builders have shifted from single-story homes to multi-story dwellings with smaller footprints. The lawn, in short, is taking the country by storm.

Lawn care is big business, with Americans spending an estimated $40 billion a year on it. That is more than the entire gross domestic product of the nation of Vietnam. Lawn care has become such a competitive field that something as simple as choosing a company name can challenge even the savviest landscape professional. A glance at the national lawn-care directory reveals a very imaginative crowd, able to move beyond the old medical standbys such as Lawn Doctor and Lawn Medic, and such obvious choices as Green Lawn of Riverside, California (not to be confused with Lawn Green of Sacramento, California), into the realm of the avant-garde: Lawn Rescue, Lawn Authority, Lawn Express, Lawn Manicure, Lawn One, Lawn Genies, Lawnsense, Lawn Magic, Ultralawn, Lawnamerica, Lawn Rangers (of Texas, of course), Lawn Barbers, as well as more inventive concoctions such as Marquis de Sodding, the Sod Fathers, and Mow Better Lawns.

April is “National Lawn Care Month” in the United States, but in no other nation of the world. “It’s the perfect time to honor the environment both through Earth Day and National Lawn Care Month,” a representative of the Professional Lawn Care Association of America once explained. And where else can you find advice on a sales pitch like the following one a trade magazine proffered to the up-and-coming lawn professional: “Have a couple of key messages on the benefits of turf. Use statements like . . . ‘I am maintaining your 8,000 square feet of turf so it will continue to provide enough oxygen for your family plus several others in the neighborhood.’”

Hardcore lawn enthusiasts explain that they are not even growing “grass.” No, they are involved in something far more serious: tending “turf” (a word that comes from Sanskrit, meaning tuft of grass). A whole new generation of mower technology has come to the fore: hydrostatic walk-behinds, zero-turn units, and commercial riders. Consider the aptly named Xtreme Mowchine, a riding mower said to cut grass at fifteen miles per hour, a speed that, according to the manufacturer, makes it the fastest lawn mower in the world. “IT’S LIKE A MOWER ON STEROIDS!” blares the advertisement. And mowers are hardly the only product line out there for coiffing the yard. An arsenal of machines is now available, even to the amateur — aerators, sod cutters, dethatchers, backpack blowers, trimmers, and edgers — lying in wait to drown out the first sounds of spring.

Sometimes the lengths to which the truly devoted grass enthusiast will go might shock even your most dedicated weekend lawn jockey. Moles are a case in point. These little miscreants have the annoying habit of tunneling beneath the lawn, causing one victim in Florida to liken the “mole subway system” in his yard to “a map of New York City.” Imagine the horror, then, of residents in Washington after the state passed a ballot initiative that outlawed the use of “body-gripping traps” for dealing with this common turf menace. Here was an assault on the lawn that no self-respecting gardener could countenance. The first line of attack, understandably enough, involved homegrown remedies like pouring castor oil or tossing chewing gum down the holes. The more creatively inclined tried saving their bodily fluids for use in the crusade. Others pinned their hopes on asphyxiation, hooking up long hoses to the tailpipes of cars. If those strategies failed, there was the prospect of advanced technology, such as gas bombs with names like “the Giant Destroyer” and “Gopher Gasser.” While this is not an advice manual, we should learn from the mistakes of others, which brings to mind the Seattle homeowner who ignited his entire lawn after pouring gasoline down the tunnels and dropping in a match.

As bizarre as the lawn fanatics may seem, when looked at closely, their behavior is only a slight exaggeration of what has come to be seen as normal. If most homeowners today are not making turf checkerboards or rushing to mow the Joneses’ lawn next door, they do aspire to a presentable yard which keeps the neighbors happy and adds to their property value. Few Americans bother to question the lawn, in part because its true price is not readily apparent. What is that price? Although the turf industry says that the lawn is the equivalent of “First-Aid for the Earth,” the reality is more complicated. Grass by itself can indeed prevent soil erosion and storm-water runoff, but the quest for perfect turf is another story altogether, with a dark side for both the landscape and public health.

by Ted Steinberg, Longreads | Read more:
Image: Andy Cross/The Denver Post via Getty Images

Prince and Muhammad Ali
via:

The Insect Apocalypse Is Here


The Insect Apocalypse Is Here. What Does It Mean For the Rest of Life on Earth? (NY Times)
Image: Photo illustrations by Matt Dorfman. Source photographs: Bridgeman Images.

Why an “AI Race” Between the U.S. and China is a Terrible Idea

Perhaps because it lies at the perfect nexus of genuinely-very-complicated and impossibly-confounded-by-marketing-buzzword-speak, the term “AI” has become a catchall for anything algorithmic and sufficiently technologically impressive. AI, which is supposed to stand for “artificial intelligence,” now spans applications from cameras to the military to medicine.

One thing we can be sure about AI — because we are told it so often and at so increasingly high a pitch — is that whatever it actually is, the national interest demands more of it. And we need it now, or else China will beat us there, and we certainly wouldn’t want that, would we? What is “there,” exactly? What does it look like, how would it work, and how would it change our society? Irrelevant! The race is on, and if America doesn’t start taking AI seriously, we’re going to find ourselves the losers in an ever-widening Dystopia Gap.

A piece on Politico this week by Luiza Ch. Savage and Nancy Scola exemplifies the mix of maximum alarm and minimum meaning that’s become so typical in our national (and nationalist) discussion around artificial intelligence. “Is America ceding the future of AI to China?” the article asks.

We’re meant to take this possibility as not only very real but as an unquestionably bad thing. One only needs to tell the public that the country risks “ceding” control of something — literally anything — to the great foreign unknown for our national eyes to grow wide.

“The last time a rival power tried to out-innovate the U.S. and marshaled a whole-of-government approach to doing it, the Soviet Union startled Americans by deploying the first man-made satellite into orbit,” the article says. “The Sputnik surprise in 1957 shook American confidence, galvanized its government and set off a space race culminating with the creation of NASA and the moon landing 50 years ago this month.”

Our new national dread, the article continues, is “whether another Sputnik moment is around the corner” — in the form of an AI-breakthrough from the keyboards of Red China instead of Palo Alto.

Forget that Sputnik was not actually a “surprise” for the powers that be, or that Sputnik itself was basically a beeping aluminum beach ball — “barely more than a radio transmitter with batteries,” the magazine Air & Space once said. There’s a bigger problem here: Framing the Cold War as a battle of innovators conveniently avoids mentioning that the chief innovation in question wasn’t Sputnik or the Space Shuttle or any peacetime venture, but the creation of an arsenal for instant global nuclear holocaust at the press of a button.

Sure, yes, it’s doubtful we could have “marshaled a whole-of-government approach” to space travel without having first “marshaled a whole-of-government approach” to rocket-borne atomic genocide, but to highlight the eventual accomplishments of NASA without acknowledging that it entailed a very close dance with a worldwide apocalypse is ahistoric and absurd. To use this comparison to goad us into another nationalist tech race with a global military power is outright dangerous — if only because the victory remains completely undefined. How would we “beat” China, exactly? Beat them at what, exactly? Which specific problems do we hope to use AI to fix? At a point in history when cities are beginning to scrutinize and outright ban “AI” technologies like facial recognition, are we sure the fixes aren’t even worse than the problems? Nationalists caught in an arms race have no time to answer questions like these or any others; they’ve got a race to win!

All anyone can manage to do is bark that we need more, more, more AI, more investments, more R&D, more collaborations, more ventures, more breakthroughs, simply more AI. Maybe we’ll worry about what we needed all of this for in the first place once we’ve beaten China there. Or maybe an algorithm will explain it to us, along with the locations of all our family members and a corresponding score that quantifies their social utility and biometric trustworthiness.

The Politico piece is full of worried voices cautioning that we can’t let Americans fall behind in the global invasive-surveillance race, completely unable to explain why this would be a bad thing. “The city of Tianjin alone plans to spend $16 billion on AI — and the U.S. government investment still totals several billion and counting,” despairs Elsa Kania of the technology and national security program at the Center for a New American Security. “That’s still lower by an order of magnitude.” Amy Webb, a New York University business school professor, told Politico, “We are being outspent. We are being out-researched. We are being outpaced. We are being out-staffed.”

Of course, it’s not just these researchers, nor is it just Politico: The necessity of absolute American dominance in an extremely unpredictable, deeply hazardous, and altogether hard to comprehend field has made the great leap from think-tank anxiety nightmare to political talking point. At the first Democratic presidential debate, South Bend, Indiana, Mayor Pete Buttigieg sounded the alarm:
China is investing so they could soon be able to run circles around us in artificial intelligence, and this president is fixated on the relationship as if all that mattered was the balance on dishwashers. We have a moment when their authoritarian model is an alternative to ours because ours looks so chaotic because of internal divisions. The biggest thing we have to do is invest in our own domestic competitiveness.
In the same breath as he states this technology is being used to bolster authoritarianism abroad, Buttigieg urges a renewed national investment in that very technology at home. (...)

Even moderate voices find themselves hopelessly caught in the pro-AI fervor, the rush to develop this technology for its own sake. New America’s Justin Sherman has written numerous articles about why framing AI development as an “arms race” is wrongheaded — but only because it leaves out all the other potentially frightening and draconian gifts a nationalist AI sprint could produce. “Competing AI development in the United States and China needs to be reframed from the AI arms race rhetoric, but that doesn’t mean AI development itself doesn’t matter,” Sherman wrote in March. “In fact, the opposite is true.”

Sherman highlights a couple of nonweapon AI applications we ought to not leave to the Chinese, like the potential to use self-teaching software to detect cancer — though he provides only a glancing admission that “many legal and ethical issues plague AI in healthcare (e.g., data privacy, AI bias).” It’s hard to square the belligerent drumbeat of AI nationalism with a calm, composed approach to making sure these technologies are only developed and deployed within a rigorous ethical framework, after all. Moving fast and breaking things is the American way.

by Sam Biddle, The Intercept |  Read more:
Image: Soohee Cho/The Intercept

Sunday, July 21, 2019

Chambers Bay or Bust – 72 holes in One Day

This is the saga of four guys’ 33-mile trek over tough terrain that began at the crack of dawn and ended 16 hours later at last light, playing 72 holes along the way.

Participants in this unique odyssey at Chambers Bay Golf Course on July 1, 2019 are all members of the Trossachs Golf Group (more about that later). The four intrepid players were Peter Wengert, Michael Lynch (a Chambers Bay member and spearhead of what he called his “quest”), Paul Schweitzer and John Scholl.

John had just over a week to prepare for the uncommon challenge (the original idea was a mere 36 holes on the June 21 Summer Solstice) and started six straight days of carbo-loading, during which he gained five pounds. He also stepped up his normal morning-walk regimen by adding a hill that matched the grades he’d encounter at Chambers Bay.

Michael scheduled the order of play and time targets for all 72 holes, a Rubik’s Cube that took into account that the public would begin arriving early in the day and would slow them down if they followed the normal sequence of holes. The routing was further complicated by their self-imposed rule that no hole could be played twice in the same round, in each of the four rounds.

Michael said a re-routing in the afternoon, with the help of a porter (“Skip hole 13, go to No. 1, then return to 13 to complete 13-18 before 9:30pm”), was “key to our success.” John said another key was that “Michael buying lunch and a round of drinks for one of the foursomes we encountered going from No. 9 to No. 10 in the afternoon helped us get ahead.”

They had begun playing from the 6,500-yard Sand tees the first round, did a combo Sand and the 6,000-yard White tees the second, a combo 6,900-yard Navy, Sand and White tees the third, and the Whites on the fourth. By the end of the day, John’s pedometer registered 65,825 steps, which translates to some 33 miles, equal to the distance from Chambers Bay in South Tacoma to the Renton City Hall, and that’s without all the elevation changes.

During the ordeal, John consumed nearly 9,000 calories, in every-two-hour mini-meals. All of the guys had their coolers with food and drinks strategically placed at the Oasis tents around the course, with the aid of the porters in their small electric truck-carts.

Chambers Bay is well-known for not allowing the traditional electric golf carts, despite the rolling terrain that goes from a low of 25 feet above sea level near the water to a high of 197 feet, repeatedly. And there’s one lone signature tree on the course, which is why John carried an umbrella on a dry, sunny day.

Note that John is pulling his cart up one of the many hills. He’s a sales exec at Rainier Industries, which is best known for its tent and awning products. He asked one of the workers in that section to make him a custom harness so he could pull his golf cart up the hills, in order to employ different muscles than those used in pushing the cart on level and downhill stretches. Talk about preparation.

The four all finished the feat without incident. John admits he “hit the wall” about 2:30pm but got his second wind and felt fine at the end. He finished blisterless, thanks to three sock changes and plenty of foot powder.

Knowing super-competitive John’s keen interest in scoring as well as I do, the most amazing thing to me was that he still had not added up the scores when we had breakfast two days later (when he was well-rested and five pounds lighter). He estimated that he shot an average of just under 100 per round for each of the four rounds. This only proved to me that the whole experience was about much more than a golf score to all four of the guys.

John still struggles to explain why he decided to undertake the challenge. “I guess the best thing I can say is that, once I agreed to do it, I felt accountable to the group.”

by Larry Coffman, WSGA Golf News |  Read more:
Image: uncredited
[ed. Classic.]

Polishing the Nationalist Brand in the Trump Era

Ever since Donald J. Trump laid waste to its ideological shibboleths with his victory at the polls, the conservative intellectual class has been scrambling to keep up with him.

And earlier this week, at the first major gathering dedicated to wresting a coherent ideology out of the chaos of the Trumpist moment, the president was upending their efforts again.

On Sunday evening, some 500 policy thinkers, theorists, journalists and students gathered in a ballroom at the Ritz-Carlton here for the start of the National Conservatism Conference, a three-day event dedicated to charting a new path for conservatism under the banner of nationalism.

And not the kind associated with tiki torches and Nazi salutes, the conference was at pains to make clear.

“We are nationalists, not white nationalists,” David Brog, one of the organizers, said in his welcoming remarks, calling any equation of the two “a slander.” He then pointed to the door and invited anyone who “defines our American nation in terms of race” who had slipped through the conference’s careful screening to leave.

But inconveniently, just a few hours earlier, President Trump had let loose with tweets calling for four freshman congresswomen of color to “go back” to the “broken and crime infested” countries they came from, throwing an awkward wrench into the messaging.

Not that Mr. Trump’s name was mentioned in the program or the mission statement for the event, which was organized by the Edmund Burke Foundation, a newly formed public affairs institute. It featured headlining speeches by Tucker Carlson, John Bolton and Peter Thiel, as well as some three dozen speakers on panels covering topics like immigration, foreign policy and economic nationalism. The names of Burke and Lincoln may have been uttered as much as the president’s.

Conservatives have always prided themselves on being driven by ideas, and the big idea here was that nationalism — shorn of its darker associations — could provide an intellectual banner now that the conservatism based on free trade, libertarian economics and military interventionism that held sway for decades has run out of gas.

“Today is our independence day,” Yoram Hazony, an Israeli political theorist, author of the recent book “The Virtue of Nationalism” and the conference’s intellectual prime mover, declared in his fiery opening remarks. “We declare independence from neoconservatism. We declare independence from neoliberalism, from libertarianism, from what they call classical liberalism.”

“There is something that unites everyone in this room,” he continued. “We are national conservatives.”

Those in attendance may not have all agreed. They included reform conservatives and religious traditionalists, ardent Trumpists and former Never-Trumpers, and more than a few unconverted free-marketeers and others who were keeping a skeptical eye on the proceedings.

Geoffrey Kabaservice, a historian of conservatism and director of political studies at the Niskanen Center, described the gathering as part of an ongoing effort by conservatives to unite “under an ideological banner that Trump himself doesn’t carry.”

“They are trying to find a way to retroactively justify their support of Trumpism under a broader conservative movement,” he said. “But that’s a tricky assignment.”

Detoxifying nationalism

Just how tricky was suggested by those tweets from the president, and the muted response to them at the conference.

In the hotel bar, the national uproar over the tweets unspooled continuously on the television (at least until it was switched to Fox News). But in the conference sessions, there was virtually no reference to them, and little appetite among those chattering in the halls to offer more than tepid criticism, if that.

“They were bad,” Rich Lowry, the editor of National Review (and a recovering Never-Trumper), said a bit grimly, when asked about the tweets. “His trolling at its worst. Unproductive. Indefensible.”

Mr. Hazony, caught in the hallway between sessions, waved the question away. “It’s a great honor to be running the intellectual part of political conservatism,” he said. “We just don't have to deal with that stuff.”

Helen Andrews, the managing editor of The Washington Examiner and a contributor to various conservative publications, looked puzzled when asked on Monday about the tweets, and said she hadn’t seen them.

As for nationalism, she said she saw “no downside” to embracing it. “I don’t think it’s a word that needs to be detoxified, even as the term conservatism sometimes needs to get detoxified,” she said.

But some others expressed reservations about the new political brand being road-tested.

Yuval Levin, the editor of National Affairs and a speaker at the conference, said that the label “national conservatism” captured some of his own interest in a conservatism that focuses on social health, rather than just the market.

“But I don’t think we can just go around saying nationalism is the answer to our problems,” he said. He added, “People are not crazy to worry when they hear that term.”

Soil, but not blood

When it came to defining who belonged to the nation, there was lots of talk of soil and rootedness, alongside repeated disavowals of blood, or its modern equivalent, DNA.

In a talk called “Why America Is Not an Idea,” Mr. Lowry, the author of the forthcoming book “The Case for Nationalism,” took aim at “one of our most honored clichés”: that the essence of Americanism lies only in its ideals.

The problem with this “overintellectualized understanding of America,” he said, is “it slights the absolutely indispensable influence of culture.”

Even the phrase “city on a hill,” an emblem of American universalism, he said, comes from East Anglia, and is rooted in “a particular soil, a particular place, a particular way of thinking.”

We should insist, Mr. Lowry said, “on the assimilation of immigrants into a common culture.” A panel on immigration happening simultaneously echoed that theme of culture, but with a much harder, racially exclusionary edge. Amy Wax, a law professor at the University of Pennsylvania who was removed from teaching first-year students last year after writing an article questioning the abilities of black students, offered what she called “the cultural case” for reduced immigration.

She defended President Trump’s vulgar comment last year disparaging immigration from certain countries, to laughter and applause. And she dismissed the idea that immigrants somehow became American simply by living here, which Ms. Wax (borrowing a term used by white nationalists and self-described “race realists”) mocked as the “magic dirt” argument.

There’s no reason that “people who come here will quickly come to think, live and act just like us,” she said. Immigration policy, she said, should take into account “cultural compatibility.”

“In effect,” she said, this “means taking the position that our country will be better off with more whites and fewer nonwhites.”

by Jennifer Schuessler, NY Times | Read more:
Image: Justin T. Gellerson
[ed. I think the term we're looking for is cluster fuck. Forget it conservatives, you supported it, you own it.]

The Economist Who Would Fix the American Dream

Raj Chetty got his biggest break before his life began. His mother, Anbu, grew up in Tamil Nadu, a tropical state at the southern tip of the Indian subcontinent. Anbu showed the greatest academic potential of her five siblings, but her future was constrained by custom. Although Anbu’s father encouraged her scholarly inclinations, there were no colleges in the area, and sending his daughter away for an education would have been unseemly.

But as Anbu approached the end of high school, a minor miracle redirected her life. A local tycoon, himself the father of a bright daughter, decided to open a women’s college, housed in his elegant residence. Anbu was admitted to the inaugural class of 30 young women, learning English in the spacious courtyard under a thatched roof and traveling in the early mornings by bus to a nearby college to run chemistry experiments or dissect frogs’ hearts before the men arrived. Anbu excelled, and so began a rapid upward trajectory. She enrolled in medical school. “Why,” her father was asked, “do you send her there?” Among their Chettiar caste, husbands commonly worked abroad for years at a time, sending back money, while wives were left to raise the children. What use would a medical degree be to a stay-at-home mother?

In 1962, Anbu married Veerappa Chetty, a brilliant man from Tamil Nadu whose mother and grandmother had sometimes eaten less food so there would be more for him. Anbu became a doctor and supported her husband while he earned a doctorate in economics. By 1979, when Raj was born in New Delhi, his mother was a pediatrics professor and his father was an economics professor who had served as an adviser to Prime Minister Indira Gandhi.

When Chetty was 9, his family moved to the United States, and he began a climb nearly as dramatic as that of his parents. He was the valedictorian of his high-school class, then graduated in just three years from Harvard University, where he went on to earn a doctorate in economics and, at age 28, was among the youngest faculty members in the university’s history to be offered tenure. In 2012, he was awarded the MacArthur genius grant. The following year, he was given the John Bates Clark Medal, awarded to the most promising economist under 40. (He was 33 at the time.) In 2015, Stanford University hired him away. Last summer, Harvard lured him back to launch his own research and policy institute, with funding from the Bill & Melinda Gates Foundation and the Chan Zuckerberg Initiative.

Chetty turns 40 this month, and is widely considered to be one of the most influential social scientists of his generation. “The question with Raj,” says Harvard’s Edward Glaeser, one of the country’s leading urban economists, “is not if he will win a Nobel Prize, but when.”

The work that has brought Chetty such fame is an echo of his family’s history. He has pioneered an approach that uses newly available sources of government data to show how American families fare across generations, revealing striking patterns of upward mobility and stagnation. In one early study, he showed that children born in 1940 had a 90 percent chance of earning more than their parents, but for children born four decades later, that chance had fallen to 50 percent, a toss of a coin.

In 2013, Chetty released a colorful map of the United States, showing the surprising degree to which people’s financial prospects depend on where they happen to grow up. In Salt Lake City, a person born to a family in the bottom fifth of household income had a 10.8 percent chance of reaching the top fifth. In Milwaukee, the odds were less than half that. (...)

Charlotte is one of America’s great urban success stories. In the 1970s, it was a modest-size city left behind as the textile industry that had defined North Carolina moved overseas. But in the 1980s, the “Queen City” began to lift itself up. US Airways established a hub at the Charlotte Douglas International Airport, and the region became a major transportation and distribution center. Bank of America built its headquarters there, and today Charlotte is in a dead heat with San Francisco to be the nation’s second-largest banking center, after New York. New skyscrapers have sprouted downtown, and the city boundary has been expanding, replacing farmland with spacious homes and Whole Foods stores. In the past four decades, Charlotte’s population has nearly tripled.

Charlotte has also stood out in Chetty’s research, though not in a good way. In a 2014 analysis of the country’s 50 largest metropolitan areas, Charlotte ranked last in ability to lift up poor children. Only 4.4 percent of Charlotte’s kids moved from the bottom quintile of household income to the top. Kids born into low-income families earned just $26,000 a year, on average, as adults—perched on the poverty line. “It was shocking,” says Brian Collier, an executive vice president of the Foundation for the Carolinas, which is working with Opportunity Insights. “The Charlotte story is that we are a meritocracy, that if you come here and are smart and motivated, you will have every opportunity to achieve greatness.” The city’s true story, Chetty’s data showed, is of selective opportunity: All the data-scientist and business-development-analyst jobs in the thriving banking sector are a boon for out-of-towners and the progeny of the well-to-do, but to grow up poor in Charlotte is largely to remain poor.

To help cities like Charlotte, Chetty takes inspiration from medicine. For thousands of years, he explained, little progress was made in understanding disease, until technologies like the microscope gave scientists novel ways to understand biology, and thus the pathologies that make people ill. In October, Chetty’s institute released an interactive map of the United States called the Opportunity Atlas, revealing the terrain of opportunity down to the level of individual neighborhoods. This, he says, will be his microscope.

Drawing on anonymized government data over a three-decade span, the researchers linked children to the parents who claimed them as dependents. The atlas then followed poor kids from every census tract in the country, showing how much they went on to earn as adults. The colors on the atlas reveal a generation’s prospects: red for areas where kids fared the worst; shades of orange, yellow, and green for middling locales; and blue for spots like Salt Lake City’s Foothill neighborhood, where upward mobility is strongest. It can also track children born into higher income brackets, compare results by race and gender, and zoom out to show states, regions, or the country as a whole.
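
The core number behind such a map is simple to state: among children whose parents sat in the bottom fifth of the income distribution, what share reach the top fifth as adults, tallied tract by tract. As a rough illustration of that arithmetic only (not Opportunity Insights’ actual pipeline), here is a minimal Python sketch using pandas, with invented column names, toy data, and an assumed national earnings cutoff.

```python
# Minimal sketch of a tract-level upward-mobility rate: the share of children
# from bottom-quintile families who reach the top quintile of adult earnings.
# Column names, data, and the cutoff are hypothetical, for illustration only.
import pandas as pd

# Toy linked data: one row per child, already restricted to kids whose
# parents were in the bottom income quintile.
df = pd.DataFrame({
    "tract":        ["37119_A", "37119_A", "37119_B", "37119_B", "49035_C"],
    "adult_income": [24_000, 31_000, 95_000, 88_000, 104_000],
})

# Assumed national cutoff for the top quintile of adult earnings.
TOP_QUINTILE_CUTOFF = 90_000

mobility = (
    df.assign(reached_top=df["adult_income"] >= TOP_QUINTILE_CUTOFF)
      .groupby("tract")["reached_top"]
      .mean()  # fraction of bottom-quintile kids reaching the top quintile
      .rename("upward_mobility_rate")
)
print(mobility)
```

On the real atlas that fraction, computed from the anonymized government records described above rather than a toy table, is what drives the coloring of each tract.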

The Opportunity Atlas has a fractal quality. Some regions of the United States look better than high-mobility countries such as Denmark, while others look more like a developing country. The Great Plains unfurl as a sea of blue, and then the eye is caught by an island of red—a mark of the miseries inflicted on the Oglala Lakota by European settlers. These stark differences recapitulate themselves on smaller and smaller scales as you zoom in. It’s common to see opposite extremes of opportunity within easy walking distance of each other, even in two neighborhoods that long-term residents would consider quite similar.

To find a cure for what ails America, Chetty will need to understand all of this wild variation. Which factors foster opportunity, and which impede it? The next step will be to find local interventions that can address these factors—and to prove, with experimental trials, that the interventions work. The end goal is the social equivalent of precision medicine: a method for diagnosing the particular weaknesses of a place and prescribing a set of treatments. This could transform neighborhoods, and restore the American dream from the ground up.

If all of this seems impossibly ambitious, Chetty’s counterargument is to point to how the blue is marbled in with the red. “We are not trying to do something that is unimaginable or has never happened,” he told me over lunch one day. “It happens just down the road.”

Yet in Charlotte, where Opportunity Insights hopes to build its proof of concept, the atlas reveals swaths of bleak uniformity. Looking at the city, you first see a large bluish wedge south of downtown, with Providence Road on one side and South Boulevard on the other, encompassing the mostly white, mostly affluent areas where children generally grow up to do well. Surrounding the wedge is a broad expanse in hues of red that locals call “the crescent,” made up of predominantly black neighborhoods where the prospects for poor children are pretty miserable. Hunger and homelessness are common, and in some places only one in five high-school students scores “proficient” on standardized tests. In many parts of the crescent, the question isn’t What’s holding kids back? so much as What isn’t holding them back? It’s hard to know where to start.

The most significant challenge Chetty faces is the force of history. In the 1930s, redlining prevented black families from buying homes in Charlotte’s more desirable neighborhoods. In the 1940s, the city built Independence Boulevard, a four-lane highway that cut through the heart of its Brooklyn neighborhood, dividing and displacing a thriving working-class black community. The damage continued in the ’60s and ’70s with new interstates. It’s common to hear that something has gone wrong in parts of Charlotte, but the more honest reading is that Charlotte is working as it was designed to. American cities are the way they are, and remain the way they are, because of choices they have made and continue to make.

Does a professor from Harvard, even one as influential and well funded as Chetty, truly stand any chance of bending the American story line? On his national atlas, the most obvious feature is an ugly red gash that starts in Virginia, curls down through the Southeast’s coastal states—North Carolina, South Carolina, Georgia, and Alabama—then marches west toward the Mississippi River, where it turns northward before petering out in western Tennessee. When I saw this, I was reminded of another map: one President Abraham Lincoln consulted in 1861, demarcating the counties with the most slaves. The two maps are remarkably similar. Set the documents side by side, and it may be hard to believe that they are separated in time by more than a century and a half, or that one is a rough census of men and women kept in bondage at the time of the Civil War, and the other is a computer-generated glimpse of our children’s future.

by Gareth Cook, The Atlantic |  Read more:
Images: Library of Congress and Opportunity Insights / U.S. Census Bureau

Isovaline

Isovaline is a rare amino acid transported to Earth by the Murchison meteorite, which landed in Australia in 1969. The discovery of isovaline in the biosphere demonstrates an extraterrestrial origin of amino acids and has been linked to the homochirality of life on Earth,[1] suggesting a role in the origin of life.[2] (...)

This novel first-in-class compound has potential for treatment of acute and chronic pain, without the negative side effects associated with other commonly used analgesics.

by Wikipedia |  Read more:
Image: Wikipedia
[ed. The things you learn every day! See also: Know Your Gabapentinoids (SSC).]

Saturday, July 20, 2019

2RAUMWOHNUNG

Humanity Is Not Sleeping; It’s In An Induced Coma

In the late 1960s, the sudden widespread availability of psychedelics combined with the circulation of eastern philosophy to the west to begin a radical transformation of human consciousness. Our species, desperate to transcend the residual trauma from two world wars and the existential terror of the nuclear age, began moving into a wildly unprecedented relationship with its capacity for abstract thought.

For the first time ever, humans began disentangling themselves from egoic consciousness on a mass scale, suddenly using thought as the useful tool it’s meant to be rather than the life-dominating addiction that it had become up until that point. World leaders not only permitted this transformation but actively facilitated it, realizing from their own encounters with this new revolution that humanity relinquishing its egoic mental constrictions opened up the possibility for the creation of paradise on earth.

As we became less egocentric, our values and interests changed. This led to changes in the way we voted, in the kinds of media we chose to consume, and in the kinds of ideas which were popularized.

This set the stage for the next level of human evolution in the arrival of the internet. For the first time in history humans were able to network their minds all around the world in real time, by the thousands, then by the millions, then by the billions. Our harmonious relationship with our inner world suddenly segued smoothly into the ability to develop a harmonious relationship with our outer world, no longer confined by space and time in sharing ideas and information with each other.

With clear minds networked in a truly democratic way we were quickly able to identify and solve all the remaining problems in our world, and we saw ourselves transitioning into a collaborative relationship with each other and with our ecosystem which gave us all a quality of life that had been unimaginable up until a few decades ago.

Now, here in 2019, all human creativity and ingenuity goes toward finding new ways to help us survive and thrive and understand. Technological innovation, once mostly stagnated in the cognitive cul-de-sacs of figuring out new ways to exploit and dominate each other and commit more efficient acts of mass military violence, is now flourishing and expanding at an exponential rate. We have indeed created Heaven on earth together.

Oh,

but,

that was just a dream I had.

It could have gone down like that. There’s no reason it couldn’t have. Maybe in a parallel universe it did. But, here in this timeline, it didn’t.

Here in this timeline, the people in power wanted to remain in power thank you very much, even if it meant making life worse for everyone in the long run, including themselves.

Here in this timeline, we had all the medicine we needed to cure our sickness, but we were forbidden from using it.

Here in this timeline, plutocrats bought up all the media so they can tell us every day how important it is to continue bolstering the status quo, and that the only thing up for debate is what strategies we should use to do so.

Here in this timeline they got us dependent on money, and then devalued it to ensure that we’re working harder and harder for less and less so we have no time to expand our consciousness of our inner or outer worlds.

Here in this timeline psychedelics were made as illegal as heroin or cocaine, and their use was stigmatized as a dangerous activity for criminals and degenerates.

Here in this timeline, control of the internet was quickly shored up by plutocratic interests and government agencies, and traffic is now directed toward authorized platforms sharing authorized narratives which bolster the status quo and manufacture consent for establishment agendas.

Here in this timeline, we had all the tools to escape this prison, but they were taken from us and replaced with more prison bars. Taken by people who were more clever than the rest of us, and who had enough money and influence to shut the whole thing down.

For some weird reason I found myself watching a spiritual guru-type guy discussing his political views on a video yesterday, and, like most spiritual guru-type people, his view of our political reality was highly malnourished. He kept talking about how all our problems are the result of humanity being deeply unconscious, and how that unconsciousness leads us to elect deeply unconscious leaders. Like “elected leaders” are the ones calling the shots. No mention of plutocrats, opaque government agencies or propaganda; the only problem, according to spiritual guru-type guy, is that we’re all equally asleep at the wheel and all equally responsible for what’s going on.

Spiritual guru-type guy is wrong. Humanity is not sleeping. Humanity is in an induced coma.

by Caitlin Johnstone |  Read more:
Image: uncredited