Friday, June 7, 2019

Every Way to Cook an Egg (59 Methods) | Bon Appétit

Overlooked No More: Elizabeth Peratrovich, Rights Advocate for Alaska Natives

It was hardly the first affront. They had grown up in a segregated Alaska: separate schools, hospitals, theaters, restaurants and cemeteries. But for Elizabeth Peratrovich and her husband, Roy, Tlingit natives, the sign they spotted one day in late 1941 in Douglas, just across the channel from downtown Juneau, was the final straw.

“No Natives Allowed” read the notice on a hotel door.

“The proprietor of Douglas Inn does not seem to realize that our Native boys are just as willing as the white boys to lay down their lives to protect the freedom that he enjoys,” they wrote in a letter to Ernest Gruening, the territory’s governor, signaling the start of their campaign to fight discrimination in Alaska.

Calling such open bias “an outrage,” the couple continued, “We will still be here to guard our beloved country while hordes of uninterested whites will be fleeing South.”

Gruening agreed with the Peratroviches, and they joined forces. In 1943, they attempted to usher an antidiscrimination bill through Alaska’s two-branch Territorial Legislature. It failed, with a tie vote of 8-8 in the House.

In the two years that followed, the Peratroviches redoubled their efforts, urging Native Alaskans to campaign for seats in the Legislature and taking their cause on the road to gain support. They even left their children in the care of an orphanage for a summer so that they could travel across the state more freely.

By the time the new bill reached the Senate floor, on Feb. 5, 1945, Congress had increased the size of the territory’s Legislature, two Natives had been elected to it, and Alaska’s House had already approved the bill. Though the odds of passage were high, the bill set off hours of passionate debate and drew so many onlookers that the crowd spilled out of the gallery doors.

Senator Allen Shattuck argued that the measure would “aggravate rather than allay” racial tensions.

“Who are these people, barely out of savagery, who want to associate with us whites with 5,000 years of recorded civilization behind us?” he was quoted as saying in Gruening’s 1973 autobiography, “Many Battles.”

When the floor was opened to public comments, Peratrovich set down her knitting needles and rose from her seat in the back.

Taking the podium, she said: “I would not have expected that I, who am barely out of savagery, would have to remind the gentlemen with 5,000 years of recorded civilization behind them of our Bill of Rights.”

She gave examples of the injustices that she and her family had faced because of their background and called on the lawmakers to act. “You as legislators,” she said, “can assert to the world that you recognize the evil of the present situation and speak your intent to help us overcome discrimination.”

Her testimony, The Daily Alaska Empire wrote, shamed the opposition into a “defensive whisper.”

The gallery broke out in a “wild burst of applause,” Gruening wrote. The 1945 Anti-Discrimination Act was passed, 11-5.

Gruening signed the bill into law on Feb. 16 — a date now celebrated by the state each year. The legislation entitled all Alaskans to “full and equal enjoyment” of public establishments, setting a misdemeanor penalty for violators. It also banned discriminatory signage based on race.

It was the first antidiscrimination act in the United States. It would be nearly 20 years before the federal Civil Rights Act would be passed, in 1964, and 14 years before Alaska would become a state.

by Carson Vaughan, NY Times | Read more:
Image: Alaska State Archives
[ed. Overlooked is a series of obituaries about remarkable people whose deaths, beginning in 1851, went unreported in The Times.]

Consumer Financial Loan-Shark Bureau

How Payday Lenders Spent $1 Million at a Trump Resort — and Cashed In

In mid-March, the payday lending industry held its annual convention at the Trump National Doral hotel outside Miami. Payday lenders offer loans on the order of a few hundred dollars, typically to low-income borrowers, who have to pay them back in a matter of weeks. The industry has long been reviled by critics for charging stratospheric interest rates — typically 400% on an annual basis — that leave customers trapped in cycles of debt.

The industry had felt under siege during the Obama administration, as the federal government moved to clamp down. A government study found that a majority of payday loans are made to people who pay more in interest and fees than they initially borrow. Google and Facebook refuse to take the industry’s ads.

On the edge of the Doral’s grounds, as the payday convention began, a group of ministers held a protest “pray-in,” denouncing the lenders for having a “feast” while their borrowers “suffer and starve.”

But inside the hotel, in a wood-paneled bar under golden chandeliers, the mood was celebratory. Payday lenders, many dressed in golf shirts and khakis, enjoyed an open bar and mingled over bites of steak and coconut shrimp.

They had plenty to be elated about. A month earlier, Kathleen Kraninger, who had just finished her second month as director of the federal Consumer Financial Protection Bureau, had delivered what the lenders consider an epochal victory: Kraninger announced a proposal to gut a crucial rule that had been passed under her Obama-era predecessor.

Payday lenders viewed that rule as a potential death sentence for many in their industry. It would require payday lenders and others to make sure borrowers could afford to pay back their loans while also covering basic living expenses. Banks and mortgage lenders view such a step as a basic prerequisite. But the notion struck terror in the payday lenders. Their business model relies on customers — 12 million Americans take out payday loans every year, according to Pew Charitable Trusts — getting stuck in a long-term cycle of debt, experts say. A CFPB study found that three out of four payday loans go to borrowers who take out 10 or more loans a year. (...)

In Mick Mulvaney, whom Trump appointed as interim chief of the CFPB in 2017, the industry got exactly the kind of person it had hoped for. As a congressman, Mulvaney had famously derided the agency as a “sad, sick” joke.

If anything, that phrase undersold Mulvaney’s attempts to hamstring the agency as its chief. He froze new investigations, dropped enforcement actions en masse, requested a budget of $0 and seemed to mock the agency by attempting to officially re-order the words in the organization’s name.

But Mulvaney’s rhetoric sometimes exceeded his impact. His budget request was ignored, for example; the CFPB’s name change was only fleeting. And besides, Mulvaney was always a part-timer, fitting in a few days a week at the CFPB while also heading the Office of Management and Budget, and then moving to the White House as acting chief of staff.

It’s Mulvaney’s successor, Kraninger, whom the financial industry is now counting on — and the early signs suggest she’ll deliver. In addition to easing rules on payday lenders, she has continued Mulvaney’s policy of ending supervisory exams on outfits that specialize in lending to the members of the military, claiming that the CFPB can do so only if Congress passes a new law granting those powers (which isn’t likely to happen anytime soon). She has also proposed a new regulation that will allow debt collectors to text and email debtors an unlimited number of times as long as there’s an option to unsubscribe.

Enforcement activity at the bureau has plunged under Trump. The amount of monetary relief going to consumers has fallen from $43 million per week under Richard Cordray, the director appointed by Barack Obama, to $6.4 million per week under Mulvaney and is now $464,039, according to an updated analysis conducted by the Consumer Federation of America’s Christopher Peterson, a former special adviser to the bureau. (...)

Triple-digit interest rates are no laughing matter for those who take out payday loans. A sum as little as $100, combined with such rates, can lead a borrower into long-term financial dependency.

That’s what happened to Maria Dichter. Now 73, retired from the insurance industry and living in Palm Beach County, Florida, Dichter first took out a payday loan in 2011. Both she and her husband had gotten knee replacements, and he was about to get a pacemaker. She needed $100 to cover the co-pay on their medication. As is required, Dichter brought identification and her Social Security number and gave the lender a postdated check to pay what she owed. (All of this is standard for payday loans; borrowers either postdate a check or grant the lender access to their bank account.) What nobody asked her to do was show that she had the means to repay the loan. Dichter got the $100 the same day.

The relief was only temporary. Dichter soon needed to pay for more doctors’ appointments and prescriptions. She went back and got a new loan for $300 to cover the first one and provide some more cash. A few months later, she paid that off with a new $500 loan.

Dichter collects a Social Security check each month, but she has never been able to catch up. For almost eight years now, she has renewed her $500 loan every month. Each time she is charged $54 in fees and interest. That means Dichter has paid about $5,000 in interest and fees since 2011 on what is effectively one loan for $500.
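
[ed. A quick back-of-the-envelope check of the article's math, sketched in Python. The figure of roughly 92 monthly renewals is an assumption based on "almost eight years"; the article does not give an exact count.]

def payday_rollover_cost(fee_per_renewal=54, months=92, principal=500):
    """Total fees and interest paid on a loan renewed every month
    without the principal ever being paid down."""
    total_fees = fee_per_renewal * months
    return total_fees, total_fees / principal

fees, multiple = payday_rollover_cost()
print(f"Total fees and interest: ${fees:,}")            # about $4,968, i.e. roughly $5,000
print(f"Paid {multiple:.1f}x the original $500 in fees")  # roughly ten times the principal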

by Anjali Tsui, ProPublica, and Alice Wilder, WNYC | Read more:
Image: via

Biotech Cockaigne of the Vegan Hopeful

August 2013: The future of meat appears in London. At least, that’s how the media event I’m watching online has been billed. A hamburger made of bovine muscle cells grown in vitro is unveiled, then served to a panel of tasters while a studio audience of journalists watches. A promotional film describes the various ills that “cultured meat” promises to solve, ills caused by eating animals at industrial scale. Industrial animal agriculture possibly produces 14 to 18 percent of global emissions of greenhouse gases. The byproducts of animal agriculture can pollute waterways and soil. Livestock, especially bovine livestock, is inefficient at turning plant foods into protein. Concentrated animal feeding operations (CAFOs) are a potential source of zoonotic diseases; furthermore, subtherapeutic dosing with antibiotics to speed animals’ growth builds antibiotic resistance in pathogens that can grow in feedlots. Billions of animals suffer in our meat production infrastructure, and the moral weight of that suffering depends on whom you ask, and on his or her philosophical views about animals. Today’s event conveys the implicit promise that “cultured meat” may solve all these problems. The short promotional film concludes with the words “be part of the solution.”

A second promotional film describes how the burger was made: The process started with a biopsy of cow muscle cells, followed by careful stimulation of a stem cell–driven, natural process of muscle repair, as cells were fed with growth media under carefully calibrated laboratory conditions. Gradually, what functions as a healing process in vivo (i.e., in living animals) becomes a meat production process, in vitro. Thus, the potential of stem cells to create new tissue becomes the biological grounds for a promise about the future of protein.

But this is only a test—or, only a taste. In vitro techniques cannot yet perfectly reproduce in vivo animal muscle and fat, and thus cannot perfectly reproduce what consumers recognize as meat. Cultured meat has yet to become delicious. Nor is the technology scalable. The techniques and materials are still too expensive. The burger taste-tested in London took months of lab time to make, and the entire project (materials, technician salaries, etc.) cost more than $300,000 US. If the holy grail of cultured meat research is to develop a product that can replace “cheap meat,” that is, the kind of meat that is produced at industrial scale and sold at fast-food restaurants, then the goal seems years or decades away.

If we succeed in growing meat—meat that never had parents, meat that was never part of a complete animal body—we will do more than change human subsistence strategies forever. We will also transform our relationship with animal bodies, beginning at the level of the cell. Mark Post, the Dutch medical researcher who created the burger with the help of a team of scientists and technicians, seems hopeful and confident. He laughs good-naturedly with the journalists when they articulate their doubts. Of course, he acknowledges, it would be easier if everyone just became a vegetarian, but such a mass shift in human behavior doesn’t seem likely.

A Tale of Hope—or Hype?

October 2018: Scientists, entrepreneurs, and promoters are working to make cultured meat a reality. There is still no cultured meat on the market, but a handful of startup companies, many of them based in the San Francisco Bay area, promise that they will have a product to sell—presumably still not at the same price point as a fast-food hamburger or chicken nugget—in a matter of months or a handful of years.

I spent the years between the first in vitro hamburger unveiling and late 2018 conducting ethnographic research on the cultured meat movement, and I still cannot tell you if cultured meat will grace our tables soon. To the best of my knowledge, the two main technical challenges in cultured meat research have not yet been surmounted. One challenge is the creation of an affordably scalable growth medium not derived from animal sources (the current mix contains fetal bovine serum) and the other is the ability to create “thick” and texturally sophisticated tissue, such as that found in steak or pork chops, as opposed to growing two-dimensional sheets of cells and assembling them into meat. And beyond these technical challenges, cultured meat’s pioneers will need to find a way to make production “scale up” to the point where the cost of an individual serving of meat drops close to, or even equals, the cost of the conventional equivalent. In short, we don’t yet know what kind of technology story this is. Are we en route to success, or are we watching a cautionary tale in progress, one about hope and hype?

Much like self-driving cars, the advocates of which hope their use will reduce car crashes, cultured meat is promoted by those who believe in its practical and ethical benefits. But cultured meat is also like the self-driving car insofar as opinions vary as to whether a single technology can resolve a complex and, in some senses, social problem that involves not only engineering challenges but also the vagaries of human behavior. Like medical therapies based on stem cells, cultured meat excites the imagination and creates hope, but the hype seems to be running years or decades ahead of the reality. (Cultured meat itself is an offshoot of the effort to create tissues for transplant to human patients, an effort that goes by the name “regenerative medicine.”)

Cultured meat may one day come ashore on the high-tech equivalent of the Island of Misfit Toys, where flying cars rust next to moldering piles of food pills, but it hasn’t yet. One of the forces keeping it afloat, both financially and in the popular imagination, is many people’s deep investment in the defense of animals. The cultured meat startups are linked by a loose social network of educated professionals, often vegans or vegetarians, who believe that cultured meat may accomplish what decades of animal protection activism has not, alleviating the suffering of animals in our food system. Not all venture capital investment in cultured meat research is inspired by a desire to protect animals, of course; there are investors interested in the potential environmental “cleanliness” of cultured meat, and those angling for a profit, just as profit orientation is part of the package for any investor. But the most vocal proponents of cultured meat speak more eagerly about the defense of animals than they do about the defense of the natural environment or human health, although they readily acknowledge that cultured meat (many of them call it “clean meat,” or use other terms) happily addresses all three needs at once.

Meet the Utilitarians

In addition to resources, the advocates of cultured meat have a philosophy ready to hand. Many of them are self-described utilitarians, readers of the works of philosopher Peter Singer, in particular his 1975 book Animal Liberation. In that book, Singer followed classical utilitarian philosophers like Jeremy Bentham by arguing that the way to determine the moral standing of animals is not by assessing their intellectual capacities relative to those of most humans but by asking if animals can suffer as humans do. Answering that question in the affirmative, Singer suggested that it was “speciesist” to deny moral standing to the suffering of animals. Many regard Animal Liberation as the bible of the contemporary animal rights movement, despite the fact that the book does not defend the rights of animals per se. Contrary to the thinking of some other philosophers concerned with animals, such as Tom Regan, Singer does not assert the inherent rights of animals, or (in what philosophers term a “deontological” fashion) define the maltreatment or even the use of animals as morally wrong. “I am a vegetarian,” Singer has written, “because I am a utilitarian.” Rather than focus on the inherent worth of a human or animal life, a utilitarian will ask how that life is contoured by experiences of suffering or happiness. These notions, unlike those such as inherent worth, are the conditions a utilitarian can measure with some hope of improving the world. Whether they share Singer’s ordering of concerns (first utilitarianism, then animal protection), many of cultured meat’s promoters have taken up Singer’s approach as a philosophical support for their work.

Utilitarianism combines the following features: It is consequentialist insofar as it judges right and wrong by considering the outcome of our actions, not preoccupying itself with the nature of those actions themselves. It is a doctrine of ends, not means. It is universalist insofar as it claims to take into account every being’s interests equally. It is welfarist in that it understands and measures people’s well-being in terms of the satisfaction of their needs. And it is aggregative in that it considers everyone’s interests added together with the goal of maximizing happiness and minimizing suffering for the greatest number. Individuals count only as part of the whole. Each one counts for one, never for more than one.

If this account of utilitarianism’s parts seems schematic, it is worth saying that many utilitarian accounts of the world can seem like line drawings or blueprints. As the philosopher Bernard Williams noted, this philosophy “appeals to a frame of mind in which technical difficulty…is preferable to moral unclarity, no doubt because it is less alarming.” That is to say, for a utilitarian it is better to have a complicated job of balancing multiple interests than to be unsure what would count as a desirable outcome. Utilitarianism appeals to those who dislike moral ambiguity and to those who focus on outcomes; this characterization also applies to many actors in the world of cultured meat who eagerly anticipate an end to animal agriculture.

by Benjamin Aldes Wurgaft, The Hedgehog Review |  Read more:
Image: Alamy

Duke Ellington

We Have Nothing to Lose but Our Debts

Amid the bad results for the Left in the European elections, the Greek outcome was particularly poignant. In the last such contest in 2014, Syriza rode the revolt against austerity to become the largest single party, in its final step toward national office. Five years later, in last month’s election, it finished ten points behind the right-wing New Democracy. And where once Syriza promised to spark change throughout the EU, it is now the best student of the neoliberal dogma “There Is No Alternative.”

After four years of slashed pensions, sell-offs of state assets, and even a right-wing turn on foreign policy, Syriza is now also set to lose office. Indeed, not only did Alexis Tsipras’s party enforce an even harsher austerity than its predecessors ever dreamt of, but as snap general elections loom, it is set to become an exhausted opposition to a sharply reactionary New Democracy government. Polls for the July 7 vote suggest the conservatives have a massive lead, and could even secure an absolute majority in parliament.

The hollowing out of Syriza’s base is the expression of disappointment and despair. But there are also signs that some of its voters are turning to left-wing alternatives. Former finance minister Yanis Varoufakis’s MeRA25 party achieved a particularly creditable result in the European contest, less than four hundred votes from electing a member of the European Parliament. As Greece heads to a fresh general election, MeRA25 hopes to elect its first members of parliament, offering a platform for its call to replace austerity with Europe-wide investment.

Jacobin’s David Broder spoke to Varoufakis about the effect of Syriza’s defeat on the wider European left, the prospects of a realignment of EU politics, and MeRA25’s own plans for a “political revolution” in Greece.

Almost all left-wing parties lost votes in the European elections, no matter what their strategy regarding the European Union (EU). For this reason, many analyses of the result have focused on more general obstacles, invoking the “death of the populist moment,” the stabilization of the EU, or indeed the lack of left-wing governments able to challenge its current policy balance. Such readings would all suggest a window of opportunity has closed. Do you think this is the case, or are there still openings?

YV - It is undoubtedly the case that a large window of opportunity has closed — and it was closed here, in Greece, in 2015. Millions of Europeans looked with hope to this country, and it was Alexis Tsipras’s Syriza government (elected that January) that had the responsibility for keeping that window open, and for opening it up further for others. What these millions wanted a break from was not even true neoliberalism, but what I would call bankruptocracy — a new regime in which the greatest power was wielded by the most bankrupt bankers.

Tsipras’s surrender in July 2015 closed that window of opportunity. And there’s no sugaring the bitter pill — the European elections were a complete catastrophe for progressives. Yet at the same time, we should also be clear that there is never a final victory or defeat. New windows are always opening up.

Yet if the troika’s treatment of Greece damaged the EU’s image, and also cast doubt on the prospect of a single state being able to change things, there is little sign of what other forces could challenge the present order. DiEM25 has spoken of constructing broad fronts across Europe, including even liberals and progressive-minded conservatives who see the need to break the EU out of its austerian dogmas. But do you see any evidence that other political forces are actually moving in the direction you suggest?

Firstly, I’ll say that the reason we lost the window of opportunity wasn’t the troika’s treatment of Greece. We shouldn’t blame our enemies for our defeats, just as we don’t blame the scorpion for stinging us — this is in its nature. The blame lies with those who decided to trade the anti-austerity agenda on which they were elected in exchange for a few years in office — all the while having their backs patted by the enemy.

As for the resonance of our arguments, there is an impressive disconnect between a general recognition that austerity was, indeed, a disaster, and the lack of any political program to end it. I have the privilege of speaking to a lot of bankers — for some reason, they like talking to me. They completely accept that socialism for bankers and austerity for the population brought about a major defeat for European capitalism. Social democrats on the ground admit that it has been terrible, as do some conservatives, as well as the Greens and the Left. But the disconnect lies in the lack of an organized political plan to shift us out of this.

Even progressives have failed to get together to advance an alternative — indeed, only DiEM25 put forward the plan for a Green New Deal. The Greens themselves are so conservative, so ordoliberal, and so scared that conservatives will accuse them of being fiscally irresponsible, that they end up recycling ordoliberalism.

But we don’t regret not standing together with the Party of the European Left, which has chosen incoherence. From Italy to Switzerland, Hungary or Britain, the fascists and right-wingers are coherent: they say, “we want our country back,” and that means dissolving supranational organizations and institutions and pointing the finger of blame at foreigners, whether that means Jews, Syrians, Greeks, Germans, or refugees — the “other.” It is a misanthropic dead end, but it is coherent.

That cannot be said of the Party of the European Left, which included not only Syriza — which completely surrendered to the troika — but also the europhile French Communist Party and allied euro-skeptic forces like Jean-Luc Mélenchon’s France Insoumise, or Podemos, whose policy on Europe and the euro is not to have a policy. (...)

This election saw not just the growth of the far right but also advances for liberal and Green parties, at the expense of social and Christian democrats. But if pro-European sentiment has been mobilized in opposition to right-wing populism, do you think this could be harnessed by the anti-neoliberal left? Wasn’t the rise in support for these parties instead more of a vote of confidence in the EU as it currently exists?

It is stupendous that there is talk of a vote of confidence in the EU when the far right finished first in France, Britain, and Italy. Ten years ago, if you were told this would happen, you’d have said — oh my god. The mainstream media presenting the rise of liberal and Green parties as a vote of confidence in the EU is mind-boggling.

Politically and historically speaking, these parties’ rise is irrelevant. The liberals’ rise owes to Macron in France and Ciudadanos in Spain. These are deeply conservative forces — Ciudadanos even governs together with the far-right Vox in Andalusia. There is nothing liberal about them: they are traditional, austerian class warriors against the working class. Some such forces could be called more liberal, but only in the sense that the German CDU [Christian Democratic Union] is more liberal than the Austrian ÖVP [People’s Party]. This is just a shift within the same liberal-conservative bloc.

The same could be said of the Greens — and here we are really talking about France and Germany, where the Greens are a significant force. These parties are the green wing of social democracy, and their traditional government partners are the Parti Socialiste and SPD [Social Democratic Party], who collapsed due to their connivance in the assault on working-class voters. Indeed, overall the social-democratic/green blocs that led to the governments of François Hollande and Gerhard Schröder have shrunk.

To celebrate the rise of the Greens is to celebrate a lifestyle choice – fiscal conservatives who want to celebrate recycling and who say they like Greta Thunberg when addressing their kids. Indeed, in a debate with Sven Giegold, the German Greens’ leading candidate, in response to my presentation of DiEM25’s ambitious green investment plan of half a trillion euros annually funded via European Investment Bank bonds, I was appalled to hear him retort that there were not enough green projects to fund with so much money. He offered as proof for this the neoliberal creed that if there was such a need, the market would have provided the investments!

(...) Do you think left-wing voters have bought Tsipras’s message that Syriza made the best of a bad situation? Or have they lost faith in the prospect of changing things? And what are the chances of bringing these other forces together?

Ever since he surrendered to the troika, Tsipras was always going to invest in a dilemma put to progressives: “Who do you want to torture you — an enthusiastic torturer, or someone like me who doesn’t want to torture you but will do it to keep his job?” This was his line in September 2015 [in that year’s second general election, after Syriza caved to the troika]. But four years later, after pushing through the most naked, harshest austerity policies anywhere in Europe — including under Greece’s previous governments — he can no longer blackmail progressives with lesser-evil arguments. (...)

The regime did not feel threatened by parties advocating exit from the euro and EU. Our view — that we’re not going to leave, and it’s up to the German government to leave, or to throw us out — is harder for them to deal with. The regime despised us because we neither want Grexit nor fear it. Our call to unilaterally implement perfectly moderate policies without fear or passion destabilized them. It was a danger to their system because of its widespread appeal.

by Yanis Varoufakis, Jacobin | Read more:
Image: Sean Gallup / Getty

Thursday, June 6, 2019


[ed. See next post. I don't do video games and was surprised to learn that marquee actors like Mads Mikkelsen and Léa Seydoux are somehow involved in the process. I thought it was all CGI stuff or something. This picture from Death Stranding reminds me of George Saunders' short story "The Semplica-Girl Diaries."]
Image: via

The Video Games Industry is Bigger Than Hollywood

Force Majeure


[ed. Another climate change skeptic.]

Wednesday, June 5, 2019

Book Review: The Secret of Our Success

“Culture is the secret of humanity’s success” sounds like the most vapid possible thesis. The Secret Of Our Success by anthropologist Joseph Henrich manages to be an amazing book anyway.

Henrich wants to debunk (or at least clarify) a popular view where humans succeeded because of our raw intelligence. In this view, we are smart enough to invent neat tools that help us survive and adapt to unfamiliar environments.

Against such theories: we cannot actually do this. Henrich walks the reader through many stories about European explorers marooned in unfamiliar environments. These explorers usually starved to death. They starved to death in the middle of endless plenty. Some of them were in Arctic lands that the Inuit considered among their richest hunting grounds. Others were in jungles, surrounded by edible plants and animals. One particularly unfortunate group was in Alabama, and would have perished entirely if they hadn’t been captured and enslaved by local Indians first.

These explorers had many advantages over our hominid ancestors. For one thing, their exploration parties were made up entirely of strong young men in their prime, with no need to support women, children, or the elderly. They were often selected for their education and intelligence. Many of them were from Victorian Britain, one of the most successful civilizations in history, full of geniuses like Darwin and Galton. Most of them had some past experience with wilderness craft and survival. But despite their big brains, when faced with the task our big brains supposedly evolved for – figuring out how to do hunting and gathering in a wilderness environment – they failed pathetically.

Nor is it surprising that they failed. Hunting and gathering is actually really hard. Here’s Henrich’s description of how the Inuit hunt seals:
You first have to find their breathing holes in the ice. It’s important that the area around the hole be snow-covered—otherwise the seals will hear you and vanish. You then open the hole, smell it to verify it’s still in use (what do seals smell like?), and then assess the shape of the hole using a special curved piece of caribou antler. The hole is then covered with snow, save for a small gap at the top that is capped with a down indicator. If the seal enters the hole, the indicator moves, and you must blindly plunge your harpoon into the hole using all your weight. Your harpoon should be about 1.5 meters (5ft) long, with a detachable tip that is tethered with a heavy braid of sinew line. You can get the antler from the previously noted caribou, which you brought down with your driftwood bow. 
The rear spike of the harpoon is made of extra-hard polar bear bone (yes, you also need to know how to kill polar bears; best to catch them napping in their dens). Once you’ve plunged your harpoon’s head into the seal, you’re then in a wrestling match as you reel him in, onto the ice, where you can finish him off with the aforementioned bear-bone spike. 
Now you have a seal, but you have to cook it. However, there are no trees at this latitude for wood, and driftwood is too sparse and valuable to use routinely for fires. To have a reliable fire, you’ll need to carve a lamp from soapstone (you know what soapstone looks like, right?), render some oil for the lamp from blubber, and make a wick out of a particular species of moss. You will also need water. The pack ice is frozen salt water, so using it for drinking will just make you dehydrate faster. However, old sea ice has lost most of its salt, so it can be melted to make potable water. Of course, you need to be able to locate and identify old sea ice by color and texture. To melt it, make sure you have enough oil for your soapstone lamp.
No surprise that stranded explorers couldn’t figure all this out. It’s more surprising that the Inuit did. And although the Arctic is an unusually hostile place for humans, Henrich makes it clear that hunting-gathering techniques of this level of complexity are standard everywhere. Here’s how the Indians of Tierra del Fuego make arrows:
Among the Fuegians, making an arrow requires a 14-step procedure that involves using seven different tools to work six different materials. Here are some of the steps: 
– The process begins by selecting the wood for the shaft, which preferably comes from chaura, a bushy, evergreen shrub. Though strong and light, this wood is a non-intuitive choice since the gnarled branches require extensive straightening (why not start with straighter branches?). 
– The wood is heated, straightened with the craftsman’s teeth, and eventually finished with a scraper. Then, using a pre-heated and grooved stone, the shaft is pressed into the grooves and rubbed back and forth, pressing it down with a piece of fox skin. The fox skin becomes impregnated with the dust, which prepares it for the polishing stage (Does it have to be fox skin?). 
– Bits of pitch, gathered from the beach, are chewed and mixed with ash (What if you don’t include the ash?). 
– The mixture is then applied to both ends of a heated shaft, which must then be coated with white clay (what about red clay? Do you have to heat it?). This prepares the ends for the fletching and arrowhead. 
– Two feathers are used for the fletching, preferably from upland geese (why not chicken feathers?). 
– Right-handed bowmen must use feathers from the left wing of the bird, and vice versa for lefties (Does this really matter?). 
– The feathers are lashed to the shaft using sinews from the back of the guanaco, after they are smoothed and thinned with water and saliva (why not sinews from the fox that I had to kill for the aforementioned skin?). 
Next is the arrowhead, which must be crafted and then attached to the shaft, and of course there is also the bow, quiver and archery skills. But, I’ll leave it there, since I think you get the idea.
How do hunter-gatherers know how to do all this? We usually summarize it as “culture”. How did it form? Not through some smart Inuit or Fuegian person reasoning it out; if that had been it, smart European explorers should have been able to reason it out too.

The obvious answer is “cultural evolution”, but Henrich isn’t much better than anyone else at taking the mystery out of this phrase. Trial and error must have been involved, and less successful groups/people imitating the techniques of more successful ones. But is that really a satisfying explanation? (...)

Remember, Henrich thinks culture accumulates through random mutation. Humans don’t have control over how culture gets generated. They have more control over how much of it gets transmitted to the next generation. If 100% gets transmitted, then as more and more mutations accumulate, the culture becomes better and better. If less than 100% gets transmitted, then at some point new culture gained and old culture lost fall into equilibrium, and your society stabilizes at some higher or lower technological level. This means that transmitting culture to the next generation is maybe the core human skill. The human brain is optimized to make this work as well as possible.
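
[ed. A minimal sketch in Python, not from the book, of that transmission-fidelity argument: each generation adds a little culture at random, but only a fraction of the existing stock survives the hand-off, so the total settles near innovations-per-generation divided by the share lost. The numbers below are illustrative only.]

import random

def cultural_stock(retention, generations=500, innovations_per_gen=1.0):
    """Toy model: culture grows by random innovation each generation,
    but only `retention` of the accumulated stock is transmitted onward."""
    culture = 0.0
    for _ in range(generations):
        culture = culture * retention + random.uniform(0, 2 * innovations_per_gen)
    return culture

random.seed(0)
for retention in (1.0, 0.99, 0.9, 0.5):
    print(f"retention {retention:.2f}: stock after 500 generations ≈ {cultural_stock(retention):.1f}")
# With perfect retention the stock keeps climbing with every generation;
# below 100%, gains and losses balance out near innovations_per_gen / (1 - retention).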

Human children are obsessed with learning things. And they don’t learn things randomly. There seem to be “biases in cultural learning”, ie slots in an infant’s mind that they know need to be filled with knowledge, and which they preferentially seek out the knowledge necessary to fill.

One slot is for language. Human children naturally listen to speech (as early as in the womb). They naturally prune the phonemes they are able to produce and distinguish to the ones in the local language. And they naturally figure out how to speak and understand what people are saying, even though learning a language is hard even for smart adults.

Another slot is for animals. In a world where megafauna has been relegated to zoos, we still teach children their ABCs with “L is for lion” and “B is for bear”, and children still read picture books about Mr. Frog and Mrs. Snake holding tea parties. Henrich suggests that just as the young brain is hard-coded to want to learn language, so it is hard-coded to want to learn the local animal life (maybe little boys’ vehicle obsession is an outgrowth of this – buses and trains are the closest thing to local megafauna that most of them will encounter!)

by Scott Alexander, Slate Star Codex |  Read more:
Image: Princeton University Press

Image: someecards

How to Save the (Institutional) Humanities

The large majority of our fellow-citizens care as much about literature as they care about aeroplanes or the programme of the Legislature. They do not ignore it; they are not quite indifferent to it. But their interest in it is faint and perfunctory; or, if their interest happens to be violent, it is spasmodic. Ask the two hundred thousand persons whose enthusiasm made the vogue of a popular novel ten years ago what they think of that novel now, and you will gather that they have utterly forgotten it, and that they would no more dream of reading it again than of reading Bishop Stubbs’s Select Charters.
— Arnold Bennett, Literary Taste (1907)
Humanities departments are not doomed to oblivion. They might deserve oblivion, but they are not doomed to it. This post is going to suggest one relatively painless institutional fix that has the potential to dam the floods up before they sweep the entire profession away. (...)

Confusing a subject with the narrow band of institutions currently devoted to credentializing those who study it clouds our thinking. The collapse of humanities departments on university campuses is at best an indirect signal of the health of the humanities overall. At times the focus on the former distracts us from real problems facing the latter. The death of professorships in poetry is far less alarming than American society's rejection of poetry writ large. In as much as the creeping reach of the academy has contributed to poetry's fall from popular acclaim, the collapse of graduate programs in literature and creative writing may be a necessary precondition for its survival.

Academics don't want to hear this, of course. But the truth is that few academics place "truth," "beauty," or "intersectional justice" at the top of their personal hierarchy of values. The motivating drive of the American academic is bourgeois respectability. The academic wants to continue excelling in the same sort of tasks they have excelled in since they were 10 years old, and wants to be respected for it. The person truly committed to the humanist impulse would be ready to pack things up and head into the woods with Tao Qian and Thoreau. But that is not what academia is for. Academia is a quest for status and certitude.

If, pondering on these things, you still feel the edifice is worth preserving, then I am here to tell you that this is possible. The solution I endorse is neat in its elegance, powerful in its simplicity. It won't bring the halcyon days of the '70s back, but it will divert enough students into humanities programs to make them somewhat sustainable. (...)

"Many things not at all" is what the current system teaches. The structure of generals and elective courses struggles to produce any other outcome. Learning something well depends on a cumulative process of practice and recall. Memories not used soon fade; methods not refined soon dull; facts not marshaled are soon forgotten. I remember the three credits I took in Oceanography as a grand experience (not least for field lab at the beach), but years later I find I cannot recall anything I was tested on. And why would I? After that class was over the information I learned was never used in any of the other classes I took.

This sounds like an argument against learning anything but one carefully selected major. That takes things a step too far. There is a benefit to having expertise in more than one domain. I am reminded of Scott Adams's "top 25%" principle, which I first found in Marc Andreessen's guide to career planning:
If you want an average successful life, it doesn’t take much planning. Just stay out of trouble, go to school, and apply for jobs you might like. But if you want something extraordinary, you have two paths: 
Become the best at one specific thing.
Become very good (top 25%) at two or more things. 
The first strategy is difficult to the point of near impossibility. Few people will ever play in the NBA or make a platinum album. I don’t recommend anyone even try. 
The second strategy is fairly easy. Everyone has at least a few areas in which they could be in the top 25% with some effort. In my case, I can draw better than most people, but I’m hardly an artist. And I’m not any funnier than the average standup comedian who never makes it big, but I’m funnier than most people. The magic is that few people can draw well and write jokes. It’s the combination of the two that makes what I do so rare. And when you add in my business background, suddenly I had a topic that few cartoonists could hope to understand without living it. 
....Get a degree in business on top of your engineering degree, law degree, medical degree, science degree, or whatever. Suddenly you’re in charge, or maybe you’re starting your own company using your combined knowledge. 
Capitalism rewards things that are both rare and valuable. You make yourself rare by combining two or more “pretty goods” until no one else has your mix... 
It sounds like generic advice, but you’d be hard pressed to find any successful person who didn’t have about three skills in the top 25%.
To this I would add a more general statement about the purpose of a university education. In my days as a teacher in history and literature, I used to give a lecture to the Chinese students I had helped prepare for American university life. This lecture would touch on many things. This was one of them. I would usually say something close to this:
Students who go to America usually fall into one of two groups. The first group is focused like a laser beam on grinding through coursework that will easily open up a new career to them upon graduation. You will know the type when you see them--they will be carrying around four books on accounting or chemical engineering, and will constantly be fretting over whether their GPA is high enough for them to land an internship with Amazon. In many ways those students will spend their university years doing the exact same thing they are doing now: jumping through one hoop after another to get good grades and secure what they hope will be a good future.  
On the other hand, you have many students who arrive in America and immediately devote themselves to the pleasures they could not chase at home. These students jump at the obscure class in 19th century French poetry, glorying in their newfound freedom to learn about something just because they want to learn about it. They follow their passions. Such passions rarely heed the demands of a future job market. 
Which student should you be? 
My advice: be both. 
The trouble with our new expert in Romantic poetry or classical Greek is that even if she is smart enough to do just about any job out there, she has no way to prove that to her potential future employers. Her teachers will have her write term papers and book reviews. Your ability to write an amazing term paper impresses nobody outside of the academy (even if the research skills needed to write one are in demand out there). If you do not have a technical skillset they can understand — or even better, a portfolio of projects you have completed that you can give them — you will struggle greatly when it comes time to find a job. Your success will not be legible to the outside world. You must find ways to make it legible. You must ponder this problem from your very first year of study. It is not wise to spend your entire university experience pretending that graduation day will not come. It will, and you must be prepared for it. 
On the flip side, I cannot endorse the path of Mr. I-Only-Take-Accounting-Classes either. He lives for the Next Step. My friends, there will always be a Next Step. Life will get busier, not easier, after college. You may never again be given such grand opportunity to step back and think about what is most important. 
What is wrong? What is right? What is true, and how will I know it? What is beauty, and where can I find it? What does it mean to be good? What does it mean to live a meaningful life? Your accounting classes will not answer those questions. Now the odds are high that your literature, art, and history classes won't really answer them either—but they will ask you to develop your own answers to them. That is truly valuable. 
I will say it again: you may never have another period in your life where you have the time, resources, and a supporting community designed to help you do this. If you are not having experiences in university that force you to spend time wrestling in contemplation, then you have wasted a rare gift. 
So that is my advice. Do both! 
I cannot tell you exactly how to do both — that will be for each of you to decide. But recognize which sort of student you are, and find ways to counteract your natural tendency. If you have no desire greater than diving into a pile of history books, perhaps take three or four classes of GIS on the side, and create skins for Google Earth that draw on your data. If you are driven to find a career in finance, go do so — but then arrange to spend a semester abroad in Spain, or Japan, or somewhere that lets you experience a new culture and lifestyle. 
Prepare for your career. Expand your mind. Find a way to do both.
Far fewer students have taken this advice than I hoped. I am partially fond of my alma mater's new system because it forces all of its students to do exactly what I advocate they should. But the logic of the system is compelling on its own grounds. By requiring a science-based minor, all students are required to master the basics of statistics and the scientific method. They do this not through a series of university-required, general-purpose, mind-numbing courses, but through a minor they choose themselves. All students are required to master a professional skill that will give them options on the post-college job market. They will learn how to make their work and talents legible to the world outside of academia. And all students are required to round this education out with an in-depth study of art, history, or culture.

From an organizational sense, the system's greatest boon goes to the humanities departments. The prime reason students do not take humanities courses is that college is too expensive to afford a degree which does not guarantee a career. That is it. As the number of people graduating from college increases, merely having a degree is no longer a signal of extraordinary competence. Any student that goes hundreds of thousands of dollars into debt for the sake of a degree which will not provide them with the skill-set they need to pay it back is extremely foolish, and most of them know it.

by T. Greer, The Scholar's Stage |  Read more:
Image: via
[ed. Congratulations to this year's graduates.]

It’s Time To Take Octopus Civilization Seriously

Intelligence is a hot topic of discussion these days. Human intelligence. Plant intelligence. Artificial intelligence. All kinds of intelligence. But while the natures of human and plant intelligence are subjects mired in heated debate, derision, and controversy, the subject of artificial intelligence inspires an altogether different kind of response: fear. In particular, fear for the continued existence of any human civilization whatsoever. From Elon Musk to Stephen Hawking, the geniuses of the Zeitgeist agree. AI will take our jobs and then, if we’re not careful, everything else too, down to every last molecule in the universe. A major Democratic presidential candidate, Andrew Yang, has turned managing the rise of AI into one of the core principles of his political platform. It is not a laughing matter.

But artificial general intelligence is not the type of intelligence that humanity should fear most. Far from the blinking server rooms of Silicon Valley or the posh London offices of DeepMind, another type of intelligence lurks silently out of human sight, biding its time in the Lovecraftian deep. Watching. Waiting. Organizing. Unlike artificial intelligence, this intelligence is not hypothetical, but very real. Forget about AGI. It’s time to worry about OGI—octopus general intelligence.

In late 2017, it was reported that an underwater site called “Octlantis” had been discovered by researchers off the coast of Australia. Normally considered to be exceptionally solitary, fifteen octopuses were observed living together around a rocky outcropping on the otherwise flat ocean floor. Fashioning homes—dens—for themselves out of shells, the octopuses were observed mating, fighting, and communicating with each other. Most importantly, this was not the first time that this had happened. Another similar site called “Octopolis” had been previously discovered in the vicinity in 2009.

One of the researchers, Stephanie Chancellor, described the octopuses in “Octlantis” as “true environmental engineers.” The octopuses were observed conducting both mate defense and “evictions” of octopuses from dens, defending their property rights from infringement by other octopuses. The other “Octopolis” site had been continuously inhabited for at least seven years. Given the short lifespans of octopuses, lasting only a few years on the high end, it is clear that “Octopolis” has been inhabited by several generations of octopuses. We are presented with the possibility of not only one multi-generational octopus settlement chosen for defense from predators and engineered for octopus living, but two. And those are just the ones we’ve discovered. The oceans cover over 70% of Earth’s surface.

None of the three experts I spoke with for this article would rule out the possibility of further octopus settlements.

The octopus is a well-known creature, but poorly understood. The primal fear inspired by the octopus frequently surfaces in horror movies, pirate legends, political cartoons depicting nefarious and tentacled political enemies, and, understandably, in Japanese erotic art. For all that, the octopus is, to most people, just another type of seafood you can order at the sushi bar. But the octopus is more than just sushi. It’s more than the sum of its eight arms. A lot more, in fact—it may be the most alien creature larger than a speck of dust to inhabit the known ecosystems of the planet Earth. Moreover, it’s not just strange. It’s positively talented.

Octopuses can fully regenerate limbs. They can change the color and texture of their skin at will, whether to camouflage themselves, make a threat, or for some other unknown purpose. They can even “see” with their skin, thanks to the presence of the light-sensitive protein rhodopsin, also found in human retinas. They can shoot gobs of thick black ink with a water jet, creating impenetrable smokescreens for deceit and escape. Octopuses can use their boneless, elastic bodies to shapeshift, taking on the forms of other animals or even rocks. Those same bodies allow even the larger species of octopuses to squeeze through holes as small as one inch in diameter. The octopus’ arms are covered in hundreds of powerful suckers that are known to leave visible “octo-hickeys” on humans. The larger ones can hold at least 35 lbs. each. The suckers can simultaneously taste and smell. All octopus species are venomous.

Despite all of these incredible abilities, the octopus’ most terrifying feature remains its intelligence. The octopus has the highest brain-to-body-mass ratio of any invertebrate, a ratio that is also higher than that of many vertebrates. Two thirds of its neurons, however, are located in its many autonomous arms, which can react to stimuli and even identify and grab food after being severed from the rest of the octopus, whether dead or alive. In other words, the intelligence of an octopus is not centralized. It is decentralized, like a blockchain. Like blockchains, this makes them harder to kill. It has been reported that octopuses are capable of observational learning, short- and long-term memory, tool usage, and much more. One might wonder: if octopuses have already mastered blockchain technology, what else are they hiding?

We can see octopuses frequently putting this intelligence to good use, and not only in their burgeoning aquatic settlements. Some octopuses are known to use coconut shells for shelter, even dismantling and transporting the shell only to reassemble it later. In laboratory settings, octopuses are able to solve complex puzzles and open different types of latches in order to obtain food. They don’t stop there, though. Captive octopuses have been known to escape their tanks, slither across the floor, climb into another tank, feast on the helpless fish and crabs within, and then return to their original tank. Some do it only at night, knowingly keeping their human overseers in the dark. Octopuses do not seem to have qualms about deceiving humans. They are known to steal bait from lobster traps and climb aboard fishing boats to get closer to fishermen’s catches.

One octopus in New Zealand even managed to escape an aquarium and make it back to the sea. When night fell and nobody was watching, “Inky”—his human name, as we do not know how octopuses refer to themselves in private—climbed out of his tank, across the ground, and into a drainpipe leading directly to the ocean.

Given the advanced intelligence and manifold abilities of octopuses, it may not be a surprise, in hindsight, that they are developing settlements off the coast of Australia. By establishing a beachhead in the Pacific Ocean, a nascent octopus civilization would be well-placed to challenge the primary geopolitical powers of the 21st century, namely, the United States and China. Australia itself is sparsely inhabited and rich in natural resources vital for any advanced civilization. The country’s largely coastal population would be poorly prepared to deal with an invasion from the sea.

by Marko Jukic, Palladium |  Read more:
Image: Qijin Xu/Octopus

How China Is Planning to Rank 1.3 Billion People

China has a radical plan to influence the behavior of its 1.3 billion people: It wants to grade each of them on aspects of their lives to reflect how good (or bad) a citizen they are. Versions of the so-called social credit system are being tested in a dozen cities with the aim of eventually creating a network that encompasses the whole country. Critics say it’s a heavy-handed, intrusive and sinister way for a one-party state to control the population. Supporters, including many Chinese (at least in one survey), say it’ll make for a more considerate, civilized and law-abiding society.

1. Is this for real?

Yes. In 2014, China released sweeping plans to establish a national social credit system by 2020. Local trials covering about 6% of the population are already rewarding good behavior and punishing bad, with Beijing due to begin its program by 2021. There are also other ways the state keeps tabs on citizens that may become part of an integrated system. Since 2015, for instance, a network that collates local- and central-government information has been used to blacklist millions of people to prevent them from booking flights and high-speed train trips.

2. Why is China doing this?

“Keeping trust is glorious and breaking trust is disgraceful.” That’s the guiding ideology of the plan as outlined by the government. China has suffered from rampant corruption, financial scams and corporate scandals in its breakneck industrialization of the past several decades. The social credit system is billed as an attempt to raise standards of behavior and restore trust as well as a means to uphold basic laws that are regularly flouted.

3. How are people judged?

That varies from place to place. In the eastern city of Hangzhou, “pro-social” activity includes donating blood and volunteer work, while violating traffic laws lowers an individual’s credit score. In Zhoushan, an island near Shanghai, no-nos include smoking or driving while using a mobile phone, vandalism, walking a dog without a leash and playing loud music in public. Too much time playing video games and circulating fake news can also count against individuals. According to U.S. magazine Foreign Policy, residents of the northeastern city of Rongcheng adapted the system to include penalties for online defamation and spreading religion illegally.
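
[ed. The article doesn’t describe an actual scoring formula, so here is a minimal, purely hypothetical sketch of how a point-based version might work, in Python. Every number, threshold, and behavior name below is invented for illustration; only the general idea (good acts raise a score, bad acts lower it, and a low score triggers the travel blacklist) comes from the article.]

# Hypothetical illustration only: no official formula has been published,
# and every value below (points, threshold, behavior names) is invented.

BASELINE = 1000                  # assumed starting score for every citizen
BLACKLIST_THRESHOLD = 600        # assumed cutoff below which travel bookings are refused

ADJUSTMENTS = {
    "donate_blood": +30,             # "pro-social" activity cited for Hangzhou
    "volunteer_work": +20,
    "traffic_violation": -50,        # lowers a score in Hangzhou
    "phone_use_while_driving": -50,  # a Zhoushan no-no
    "dog_off_leash": -10,
    "loud_music_in_public": -10,
}

def score(events):
    """Return a citizen's score after applying a list of recorded events."""
    total = BASELINE
    for event in events:
        total += ADJUSTMENTS.get(event, 0)  # unknown events leave the score unchanged
    return total

def can_book_travel(events):
    """Mirror the flight/train blacklist: booking allowed only above the cutoff."""
    return score(events) >= BLACKLIST_THRESHOLD

# Example: two traffic violations, partly offset by a blood donation
history = ["traffic_violation", "traffic_violation", "donate_blood"]
print(score(history), can_book_travel(history))  # 930 True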

4. What happens if someone’s social credit falls?

“Those who violate the law and lose the trust will pay a heavy price,” the government warned in one document. People may be denied basic services or prevented from borrowing money. “Trust-breakers” might be barred from working in finance, according to a 2016 directive. A case elsewhere highlighted by the advocacy group Human Rights Watch showed that citizens aren’t always aware that they’ve been blacklisted, and that it can be difficult to rectify mistakes. The National Development and Reform Commission — which is spearheading the social credit plan — said in its 2018 report it had added 14.2 million incidents to a list of “dishonest” activities. People can appeal, however. The commission said 2 million people had been removed from its blacklist, while Zhejiang, south of Shanghai, brought in rules to give citizens a year to rectify a bad score with good behavior. And people who live in Yiwu have 15 days to appeal social-credit information that’s released by the authorities.

5. What part is technology playing?

Advances in computer processing have simplified the task of collating vast databases, such as the network used to blacklist travelers. Regional officials are applying facial-recognition technology to identify jaywalkers and cyclists who run red lights.

by Karen Leigh and Dandan Li, Bloomberg | Read more:
Image: Pedestrian-detection technology at China's SenseTime Group Ltd. Gilles Sabrie/Bloomberg

Keanu Reeves Is Too Good for This World

Last week, I read a report in the Times about the current conditions on Mt. Everest, where climbers have taken to shoving one another out of the way in order to take selfies at the peak, creating a disastrous human pileup. It struck me as a cogent metaphor for how we live today: constantly teetering on the precipice to grasp at the latest popular thing. The story, like many stories these days, provoked anxiety, dread, and a kind of awe at the foolishness of fellow human beings. Luckily, the Internet has recently provided us with an unlikely antidote to everything wrong with the news cycle: the actor Keanu Reeves.

Take, for instance, a moment, a few weeks ago, when Reeves appeared on “The Late Show” to promote “John Wick: Chapter 3—Parabellum,” the latest installment in his action-movie franchise. Near the end of the interview, Stephen Colbert asked the actor what he thought happens after we die. Reeves was wearing a dark suit and tie, in the vein of a sensitive mafioso who is considering leaving it all behind to enter the priesthood. He paused for a moment, then answered, with some care, “I know that the ones who love us will miss us.” It was a response so wise, so genuinely thoughtful, that it seemed like a rebuke to the usual canned blather of late-night television. The clip was retweeted more than a hundred thousand times, but, when I watched it, I felt like I was standing alone in a rock garden, having a koan whispered into my ear.

Reeves, who is fifty-four, has had a thirty-five-year career in Hollywood. He was a moody teen stoner in “River’s Edge” and a sunny teen stoner in the “Bill & Ted” franchise; he was the tortured sci-fi action hero in the “Matrix” movies and the can-do hunky action hero in “Speed”; he was the slumming rent boy in “My Own Private Idaho,” the scheming Don John in “Much Ado About Nothing,” and the eligible middle-aged rom-com lead in “Destination Wedding.” Early in his career, his acting was often mocked for exhibiting a perceived skater-dude fuzziness; still, today, on YouTube, you can find several gleeful compilations of Reeves “acting badly.” (“I am an F.B.I. agent,” he shouts, not so convincingly, to Patrick Swayze in “Point Break.”) But over the years the peculiarities of Reeves’s acting style have come to be seen more generously. Though he possesses a classic leading-man beauty, he is no run-of-the-mill Hollywood stud; he is too aloof, too cipher-like, too mysterious. There is something a bit “Man Who Fell to Earth” about him, an otherworldliness that comes across in all of his performances, which tend to have a slightly uncanny, declamatory quality. No matter what role he plays, he is always himself. He is also clearly aware of the impression he makes. In the new Netflix comedy “Always Be My Maybe,” starring the standup comedian Ali Wong, he makes a cameo as a darkly handsome, black-clad, self-serious Keanu, speaking in huskily theatrical, quasi-spiritual sound bites that either baffle or arouse those around him. “I’ve missed your spirit,” he gasps at Wong, while kissing her, open-mouthed.

Though we’ve spent more than three decades with Reeves, we still know little about him. We know that he was born in Beirut, and that he is of English and Chinese-Hawaiian ancestry. (Ali Wong has said that she cast him in “Always Be My Maybe” in part because he’s Asian-American, even if many people forget it.) His father, who did a spell in jail for drug dealing, left home when Keanu was a young boy. His childhood was itinerant, as his mother remarried several times and moved the family from Sydney to New York and, finally, Toronto. We know that he used to play hockey, and that he is a motorcycle buff, and that he has experienced unthinkable tragedy: in the late nineties, his girlfriend, Jennifer Syme, gave birth to their child, who was stillborn; two years later, Syme died in a car accident. Otherwise, Reeves’s life is a closed book. Who is he friends with? What is his relationship with his family like? As Alex Pappademas wrote, for a cover story about the actor in GQ, in May, Reeves has somehow managed to “pull off the nearly impossible feat of remaining an enigmatic cult figure despite having been an A-list actor for decades.”

by Naomi Fry, New Yorker |  Read more:
Image: Karwai Tang/Getty
[ed. Interesting how our culture fixates on certain celebrity icons, seemingly at random: Frida Kahlo, Debbie Harry, Bob Ross, David Byrne, Vermeer's Girl With A Pearl Earring, Serge Gainsbourg and Jane Birkin, for example. Suddenly they're everywhere.]

Tuesday, June 4, 2019

The Coming G.O.P. Apocalypse

For much of the 20th century, young and old people voted pretty similarly. The defining gaps in our recent politics have been the gender gap (women preferring Democrats) and the education gap. But now the generation gap is back, with a vengeance.

This is most immediately evident in the way Democrats are sorting themselves in their early primary preferences. A Democratic voter’s race, sex or education level doesn’t predict which candidate he or she is leaning toward, but age does.

In one early New Hampshire poll, Joe Biden won 39 percent of the vote of those over 55, but just 22 percent of those under 35, trailing Bernie Sanders. Similarly, in an early Iowa poll, Biden won 41 percent of the oldster vote, but just 17 percent of the young adult vote, placing third, behind Sanders and Elizabeth Warren.

As Ronald Brownstein pointed out in The Atlantic, older Democrats prefer a more moderate candidate who they think can win. Younger Democrats prefer a more progressive candidate who they think can bring systemic change.

The generation gap is even more powerful when it comes to Republicans. To put it bluntly, young adults hate them.

In 2018, voters under 30 supported Democratic House candidates over Republican ones by an astounding 67 percent to 32 percent. A 2018 Pew survey found that 59 percent of millennial voters identify as Democrats or lean Democratic, while only 32 percent identify as Republicans or lean Republican.

The difference is ideological. According to Pew, 57 percent of millennials call themselves consistently liberal or mostly liberal. Only 12 percent call themselves consistently conservative or mostly conservative. This is the most important statistic in American politics right now.

Recent surveys of Generation Z voters (those born after 1996) find that, if anything, they are even more liberal than millennials.

In 2002, John B. Judis and Ruy Teixeira wrote a book called “The Emerging Democratic Majority,” which predicted electoral doom for the G.O.P. based on demographic data. That prediction turned out to be wrong, or at least wildly premature.

The authors did not foresee how older white voters would swing over to the Republican side and the way many assimilated Hispanics would vote like non-Hispanic whites. The failure of that book’s predictions has scared people off from making demographic forecasts.

But it’s hard to look at the generational data and not see long-term disaster for Republicans. Some people think generations get more conservative as they age, but that is not borne out by the evidence. Moreover, today’s generation gap is not based just on temporary intellectual postures. It is based on concrete, lived experience that is never going to go away.

Unlike the Silent Generation and the boomers, millennials and Gen Z voters live with difference every single day. Only 16 percent of the Silent Generation is minority, but 44 percent of the millennial generation is. If you are a millennial in California, Texas, Florida, Arizona or New Jersey, ethnic minorities make up more than half of your age cohort. In just over two decades, America will be a majority-minority country.

Young voters approve of these trends. Seventy-nine percent of millennials think immigration is good for America. Sixty-one percent think racial diversity is good for America.

They have constructed an ethos that is mostly about dealing with difference. They are much more sympathetic to those who identify as transgender. They are much more likely than other groups to say that racial discrimination is the main barrier to black progress. They are much less likely to say the U.S. is the best country in the world.

These days the Republican Party looks like a direct reaction against this ethos — against immigration, against diversity, against pluralism. Moreover, conservative thought seems to be getting less relevant to the America that is coming into being. (...)

The most burning question for conservatives should be: What do we have to say to young adults and about the diverse world they are living in? Instead, conservative intellectuals seem hellbent on taking their 12 percent share among the young and turning it into 3.

by David Brooks, NY Times |  Read more:
Image: Eric Thayer for The New York Times
[ed. We can only hope. Here's a question for conservatives: wouldn't it be great to see a Trump/Palin ticket in the next election? If not, why? See also: George Will’s Political Philosophy (NY Times).]

Metamotivation


Maslow's hierarchy of needs is often portrayed in the shape of a pyramid, with the largest and most fundamental levels of needs at the bottom, and the need for self-actualization at the top. While the pyramid has become the de facto way to represent the hierarchy, Maslow himself never used a pyramid to describe these levels in any of his writings on the subject.

The most fundamental and basic four layers of the pyramid contain what Maslow called "deficiency needs" or "d-needs": esteem, friendship and love, security, and physical needs. With the exception of the most fundamental (physiological) needs, if these "deficiency needs" are not met, the body gives no physical indication but the individual feels anxious and tense. Maslow's theory suggests that the most basic level of needs must be met before the individual will strongly desire (or focus motivation upon) the secondary or higher level needs. Maslow also coined the term Metamotivation to describe the motivation of people who go beyond the scope of the basic needs and strive for constant betterment. Metamotivated people are driven by B-needs (Being Needs), instead of deficiency needs (D-Needs).

via: Wikipedia
[ed. Repost]
[ed. I was familiar with Maslow's general hierarchy of needs but not the term Metamotivation, i.e., striving to realize one's fullest potential. I wonder how a person's outlook on life and their personality are affected by an inability to achieve that need (if it is felt). Furthermore, since basic needs are fluid (like health, friendship, economic security, intimacy, etc.), is metamotivation a temporary luxury (and ultimately an unsustainable goal)?]