Tuesday, November 27, 2018

Maybe They’re Just Bad People

Seven years ago, a former aide to Ralph Reed — who also worked, briefly, for Paul Manafort — published a tawdry, shallow memoir that is also one of the more revealing political books I’ve ever read. Lisa Baron was a pro-choice, pro-gay rights, hard-partying Jew who nonetheless made a career advancing the fortunes of the Christian right. She opened her book with an anecdote about performing oral sex on a future member of the George W. Bush administration during the 2000 primary, which, she wrote, “perfectly summed up my groupie-like relationship to politics at that time — I wanted it, I worshiped it, and I went for it.”

It’s not exactly a secret that politics is full of amoral careerists lusting — literally or figuratively — for access to power. Still, if you’re interested in politics because of values and ideas, it can be easier to understand people who have foul ideologies than those who don’t have ideologies at all. Steve Bannon, a quasi-fascist with delusions of grandeur, makes more sense to me than Anthony Scaramucci, a political cipher who likes to be on TV. I don’t think I’m alone. Consider all the energy spent trying to figure out Ivanka Trump’s true beliefs, when she’s shown that what she believes most is that she’s entitled to power and prestige.

Baron’s book, “Life of the Party: A Political Press Tart Bares All,” is useful because it is a self-portrait of a cynical, fame-hungry narcissist, a common type but one underrepresented in the stories we tell about partisan combat. A person of limited self-awareness — she seemed to think readers would find her right-wing exploits plucky and cute — Baron became Reed’s communications director because she saw it as a steppingstone to her dream job, White House press secretary, a position she envisioned in mostly sartorial terms. (“Outfits would be planned around the news of the day,” she wrote.) Reading Baron’s story helped me realize emotionally something I knew intellectually. It’s tempting for those of us who interpret politics for a living to overstate the importance of competing philosophies. We shouldn't forget the enduring role of sheer vanity. (...)

In many ways, the insincere Trumpists are the most frustrating. Because they don’t really believe in Trump’s belligerent nationalism and racist conspiracy theories, we keep expecting them to feel shame or remorse. But they’re not insincere because they believe in something better than Trumpism. Rather, they believe in very little. They are transactional in a way that makes no psychological sense to those of us who see politics as a moral drama; they might as well all be wearing jackets saying, “I really don’t care, do u?”

Baron’s book helped me grasp what public life is about for such people. “I loved being in the middle of something big, and the biggest thing in my life was Ralph,” she wrote in one of her more plaintive passages. “Without him, I was nobody.” Such a longing for validation is underrated as a political motivator. Senator Lindsey Graham, another insincere Trumpist, once justified his sycophantic relationship with the president by saying, “If you knew anything about me, I want to be relevant.” Some people would rather be on the wrong side than on the outside.

by Michelle Goldberg, NY Times |  Read more:
Image: uncredited
[ed. See also: Mia Love: "No real relationships, just convenient transactions".]

How a Japanese Craftsman Lives by the Consuming Art of Indigo Dyeing

Kanji Hama, 69, has quietly dedicated his life to maintaining the traditional Japanese craft of katazome: stencil-printed indigo-dyed kimonos made according to the manner and style of the Edo period. He works alone seven days a week from his home in Matsumoto, Nagano, keeping indigo fermentation vats brewing in his backyard and cutting highly detailed patterns into handmade paper hardened with persimmon tannins to create designs for a craft for which there is virtually no market. Nearly identical-looking garments can be had for a pittance at any souvenir store.

Indigo is one of a handful of blue dyes found in nature, and it’s surprising that it was ever discovered at all, as the plants that yield it reveal no hint of the secret they hold. Unlike other botanical dyestuffs, which can be boiled or crushed to release their color, indigo must be created through a complex molecular process involving fermentation of the plant’s leaves. (The most common source is the tropical indigo plant, or Indigofera tinctoria, but Japanese dyes are generally made from Persicaria tinctoria, a species of buckwheat.) Everyone who has worked with indigo — from the Tuareg and Yoruba in Africa to the Indians and Japanese across Asia to the prehistoric tribes in the Americas — figured out their own methods for coaxing out the dye, and distinct ways of using it to embellish their clothing, costumes, domestic textiles or ritual objects that were particularly expressive of their own culture and beliefs.

No one knows exactly when indigo arrived in Japan, but beginning around the eighth century, the Japanese began creating a large repertoire of refined traditions for designing with it. Many indigo techniques are intended to hold back, or resist, the dye in certain areas to create designs. Nearly all of these, which include various ways of manipulating the fabric before it is dyed, such as tying it, knotting it, folding it, stitching it, rolling it or applying a gluey substance to it, are used in the great variety of Japanese traditions. But for Hama’s katazome practice, a paste of fermented rice is applied through a stencil laid on top of the fabric. After the fabric has been dipped in an indigo vat, the paste gets washed off and the stenciled design remains. (Resist pastes in other countries often employ local ingredients: Indonesian batik is made with wax, Indian dabu block prints with mud and Nigerian adire with cassava flour.) Katazome, however, unlike the other resist techniques, can yield very intricate and delicate designs because the stencil-making itself, called katagami, is a precise and elaborate craft, unique to Japan.

Matsumoto, which is roughly halfway between Tokyo and Kyoto, was once a center for the Japanese folk craft movement of the 1930s through the 1950s, which recognized and celebrated the beauty of regional, handcrafted everyday objects, or mingei. Hama’s grandfather was part of that movement and a pioneer in reviving natural dyeing after its obsolescence. Hama learned his trade as his father’s apprentice, starting when he was 18, working without salary or holidays, seven days a week for 15 years. (Every evening, from 8 p.m. until about 3 a.m., Hama returned to the studio to practice what he had learned that day.)

Wearing blue work clothes, his hair covered with an indigo scarf and his hands and fingernails stained blue, Hama ushers me to his studio, which occupies the second floor of his house and is outfitted with long, narrow tables built to accommodate lengths of kimono fabric (a standard kimono is about 40 feet long and 16 inches wide). From a back door off the studio, stairs lead to a shed that houses his fermentation vats and a small yard, given over in its entirety to sheaths of dyed kimono fabric, stretched from one end to the other — like long, slender hammocks — to dry.

Of the dozens of steps involved in his process, some are highly complicated and some are simply tedious, such as the repeated washing and starching and rinsing of the fabric, but all are time-consuming. “Craft is doing things with your hands. Once you manufacture things, it is no longer craft,” Hama tells me. As a holdout devoted to maintaining the tradition against all odds, almost to the point of tragic absurdity, Hama is not interested in the easy way. Rather than buy prewashed fabric or premade starch, Hama makes them himself.

He sets down one of the stencils he has carved into persimmon-hardened paper called washi — a slight modification of an 18th-century pattern, which he has backed in silk to keep the intricate design intact — onto a length of fabric fastened to one of the tables. (He doesn’t make his own paper or persimmon extract, but only because he doesn’t think the variety of persimmon used today yields the same quality tannins as those from his grandfather’s day. As a result, he has planted a tree from which he hopes one day to make his own.) With a hera, a spatula-like tool, he evenly slathers a glutinous rice paste over the stencil to resist the dye. Because Hama wants a precise consistency to his paste, which varies based on the intricacy of the design and the weather conditions, he mixes his own, a process that takes half a day. He squeegees the excess off the stencil and, by eye, proceeds down the table, lining the stencil up where the previous impression left off. The fabric is then hung in the studio to dry before he can do the same work on the other side, which, once the fabric is sewn into a kimono, won’t even be visible. Next, the fabric is moved outside, where it gets covered in soy milk (also homemade) to help keep the glue in place as it dries in the sun; this is repeated three times on each side before the dyeing can start.

We head down to the fermentation dye vats, which are steaming cauldrons cut into the floor of a lean-to shed. Each indigo dyer has his own recipe for adding lime, ash, lye from wood and wheat husks to the sukumo (or composted indigo plant), which must be kept warm and stirred for a couple of weeks in order to ferment and become dye in a process called aitate. Hama works according to the seasons. In the summer and monsoon seasons, it is too hot for indigo, as the paste will melt, while in winter, he must rise each morning at 3 a.m. to descend into the cold, adding new coals to keep the temperature consistent.

Hama is cognizant that what he knows will likely die along with him. Like many masters of traditional crafts in Japan, Hama does not believe in writing down the process, because the craft is understood to be so much more than its individual steps and thus impossible to transmit through written instruction. Indigo dyeing like this is a way of life, and to the extent to which Hama is a master, he possesses not just his own knowledge but, in a very real way, his father’s and his father’s father’s knowledge. This kind of embodied, tacit expertise doesn’t translate easily into English as it involves the very un-Western idea of the body and the intellect working in unison, masterfully and efficiently, as if in a dance. There is a chance his son will take on the business, but Hama thinks this generation is incapable of putting in the time it takes to gain the mastery of a craft like this.

by Deborah Needleman, NY Times |  Read more:
Image: Kyoko Hamada. Styled by Theresa Rivera. Photographer’s assistant: Garrett Milanovich. Styling assistant: Sarice Olson. Indigo pieces courtesy of Kanji Hama

Monday, November 26, 2018


Oleg Tselkov (Russian, b. 1934), Flush Toilet and Agave, 1956
via:

Michael Kidd (b. 1937), Derek Jarman’s Garden
via:

An Ecology of Beauty and Strong Drink

According to the theory of cultural evolution, rituals and other cultural elements evolve in the context of human beings. They depend on us for their reproduction, and sometimes help us feel good and accomplish our goals, reproductive and otherwise. Ritual performances, like uses of language, exhibit a high degree of variation; ritual performances change over time, and some changes are copied, some are not. As with genetic mutation, ritual novelty is constantly emerging.

The following presents several ecological metaphors for ritual adaptation: sexual selection, the isolated island, and the clearcut forest. Once these metaphors are established, I will explain how they apply to ritual, and suggest some policy recommendations based on this speculation. (...)

Clearcuts

When a mature natural ecosystem is destroyed by fire, clearcutting, or plowing, a particular process of succession follows. First, plants with a short life history that specialize in colonization emerge; these first-stage plants are often called weeds, or “weedy ephemerals,” and make up a large number of agricultural pest species. But these initial colonizers specialize in colonization at the expense of long-term competitiveness for light. Second, a wave of plants that are not as good at spreading their seed, but a little better at monopolizing light, gains dominance. These are followed by plants that are even better at long-term competition; eventually, absent human interference, the original weeds become rare.

Sometimes, however, the landscape is frozen at the first stage of succession; this is known as agriculture. Second-wave competitive plants are prevented from growing; the land is cleared again and again, and the seeds of a single species planted, providing an optimal environment for short-life-history weeds. Since the survival of humans and their livestock depends on only a few species of plants, other plants that would eventually out-compete the weeds must not be permitted to grow. Instead, herbicides are applied, resulting in selection for better and better weeds.

This is not an indictment of agriculture. Again, without these methods, most humans on earth would die. But the precariousness of the situation is a result of evolutionary processes. Perverse results are common in naive pest management strategies; Kaneshiro (pp. 13-14) suggests that eradication efforts for the Mediterranean fruit fly in California in the 1980s, despite temporarily reducing the population size substantially, paradoxically resulted in the adaptation of the fruit fly to winter conditions and subsequent population explosions. Pesticide resistance in plants and animals (and even diseases) frequently follows a similarly perverse course.
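[ed. To make the succession dynamic concrete, here is a minimal toy simulation in Python. It is my own sketch, not from Perry’s essay; the two-strategy model and all of its numbers are illustrative assumptions. Repeated clearing keeps the short-life-history colonizers dominant, while an undisturbed plot drifts toward the long-life-history competitors.]

    import random

    # Toy model of succession: each patch of ground holds either a "colonizer"
    # (weedy ephemeral, superb at seizing bare ground) or a "competitor"
    # (slow to spread, but wins contests for light once established).
    PATCHES = 1000
    YEARS = 60

    def colonizer_share(clear_every=None, seed=0):
        rng = random.Random(seed)

        def bare_ground():
            # Freshly cleared land: colonizers grab most patches first.
            return ["colonizer" if rng.random() < 0.9 else "competitor"
                    for _ in range(PATCHES)]

        field = bare_ground()
        for year in range(1, YEARS + 1):
            if clear_every and year % clear_every == 0:
                field = bare_ground()  # fire, clearcut, or plow
            else:
                # Undisturbed year: competitors slowly displace colonizers
                # by monopolizing light; the reverse rarely happens.
                for i, plant in enumerate(field):
                    if plant == "colonizer" and rng.random() < 0.15:
                        field[i] = "competitor"
        return field.count("colonizer") / PATCHES

    print("colonizer share, never cleared:  %.2f" % colonizer_share())
    print("colonizer share, cleared yearly: %.2f" % colonizer_share(clear_every=1))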

Ritual Ecology

Ecosystems are made up of “selfish” organisms that display variation, and undergo natural and sexual selection. Ecosystems seem to self-repair because any temporarily empty niche will quickly be filled by any organism that shows up to do the job, no matter how ill-suited it may be at first. Economies self-repair in the same manner: a product or service that is not being supplied is an opportunity.

Language appears to be remarkably self-repairing: deaf school children in Nicaragua, provided only with lipreading training of dubious effectiveness, developed their own language, which within two generations acquired the core expressive characteristics of any human language.

While inherited ritual traditions may be extremely useful and highly adapted to their contexts, ritual may exhibit a high degree of self-repair as well. And since the context of human existence has changed so rapidly since the Industrial Revolution, ancestral traditions may be poorly adapted to new contexts; self-repair for new contexts may be a necessity. The human being himself has not changed much, but his environment, duties, modes of subsistence, and social interdependencies have changed dramatically.

Memetic selection is like sexual selection, in that it is based on signal reception by a perceiving organism (another human or group of humans). Rituals are transmitted by preferential copying (with variation); even novel rituals, like the rock concert, the desert art festival, the school shooting, or the Twitter shaming, must be attended to and copied in order to survive and spread.

Some rituals are useful, providing group cohesion and bonding, the opportunity for costly signaling, free-rider detection and exclusion, and similar benefits. Some rituals have aesthetic or affective benefits, providing desirable mental states; these need not be happy, as one of the most popular affective states provided by songs is poignant sadness. Rituals vary in their usefulness, communication efficiency, pleasurability, and prestige; they will be selected for all these qualities.

Ritual is not a single, fungible substance. Rather, an entire human culture has many ritual niches, just like an ecosystem: rituals specialized for cohesion and bonding may display adaptations entirely distinct from rituals that are specialized for psychological self-control or pleasurable feelings. Marriage rituals are different from dispute resolution rituals; healing rituals are distinct from criminal justice rituals. Humans have many signaling and affective needs, and at any time many rituals are in competition to supply them.

Cultural Clearcutting: Ritual Shocks

Ordinarily, rituals evolve slowly and regularly, reflecting random chance as well as changes in context and technology. From time to time, there are shocks to the system, and an entire ritual ecosystem is destroyed and must be repaired out of sticks and twigs.

Recall that in literal clearcutting, short-life-history plants flourish. They specialize in spreading quickly, with little regard for long-term survival and zero regard for participating in relationships within a permanent ecosystem. After a cultural clearcutting occurs, short-life-history rituals such as drug abuse flourish. To take a very extreme example, the Native American genocide destroyed many cultures at one blow. Many peoples who had safely used alcohol in ceremonial contexts for centuries experienced chronic alcohol abuse as their cultures were erased and they were massacred and forcibly moved across the country to the most marginal lands. There is some recent evidence of ritual repair, however; among many Native American groups, alcohol use is lower than among whites, and the ratio of Native American to white alcohol deaths has been decreasing for decades.

Crack cocaine did not spread among healthy, ritually intact communities. It spread among communities that had been “clearcut” by economic problems (including loss of manufacturing jobs), sadistic urban planning practices, and tragic social changes in family structure. Methamphetamine has followed similar patterns.

Alcohol prohibition in the United States constituted both a ritual destruction and a pesticide-style management policy. Relatively healthy ritual environments for alcohol consumption, resulting in substantial social capital, were destroyed, including fine restaurants. American cuisine was set back decades as legitimate fine restaurants could not survive economically without selling a bottle of wine with dinner. In their place, short-life-history ritual environments, such as the speakeasy, sprang up; they contributed little to social capital, and had no ritual standards for decorum.

During (alcohol) Prohibition, when grain and fruit alcohol was not available, poisonous wood alcohols or other toxic alcohol substitutes were commonly consumed, often (but not always) unknowingly. (It’s surprising that there are drugs more toxic than alcohol, but there you go.) The consumption of poisoned (denatured) or wood alcohol may be the ultimate short-life-history ritual; it contributed nothing to social capital, provided but a brief experience of palliation, and often resulted in death or serious medical consequences. Morgues filled with bodies. The modern-day policy of poisoning prescription opiates with acetaminophen has the same effect as the Prohibition-era policy of “denaturing” alcohol: death and suffering to those in too much pain to pay attention to long-term incentives.

Early 20th century and modern prohibitions clearly don’t eradicate short-life-history drug rituals; rather, they concentrate them in their most harmful forms, and at the same time create a permanent economic niche for distributors. As the recently deceased economist Douglass North said in his Nobel lecture,
The organizations that come into existence will reflect the opportunities provided by the institutional matrix. That is, if the institutional framework rewards piracy then piratical organizations will come into existence; and if the institutional framework rewards productive activities then organizations – firms – will come into existence to engage in productive activities.
If the ritual ecology within a category of ritual provides attractive niches for short-life-history rituals, and the economic ecology provides niches for drug cartels, then these will come into existence and prosper; but if a ritual context is allowed to evolve to encapsulate mind-altering substances, as it has for most human societies in the history of the world, and to direct the use of these substances in specific times, manners, and places, then these longer-life-history rituals specialized for competition rather than short-term palliation will flourish. Prohibition is a pesticide with perverse effects; ritual reforestation is a long-term solution. (...)

I focus on drugs because drugs are interesting, and they provide a tidy example of the processes in ritual ecology. But the same selective effects are present in many domains: music, drama, exercise, food, and the new ritual domain of the internet.

by Sarah Perry, Ribbonfarm |  Read more:
Image: Clearcut, Wikipedia

Sunday, November 25, 2018

Of America and the Rise of the Stupefied Plutocrat

At the higher elevations of informed American opinion in the spring of 2018 the voices of reason stand united in their fear and loathing of Donald J. Trump, real estate mogul, reality TV star, 45th president of the United States. Their viewing with alarm is bipartisan and heartfelt, but the dumbfounded question, “How can such things be?” is well behind the times. Trump is undoubtedly a menace, but he isn’t a surprise. His smug and self-satisfied face is the face of the way things are and have been in Washington and Wall Street for the last quarter of a century.

Trump staked his claim to the White House on the proposition that he was “really rich,” embodiment of the divine right of money and therefore free to say and do whatever it takes to make America great again. A deus ex machina descending an escalator into the atrium of his eponymous tower on Manhattan’s Fifth Avenue in June 2015, Trump was there to say, and say it plainly, that money is power, and power, ladies and gentlemen, is not self-sacrificing or democratic. The big money cares for nothing other than itself, always has and always will. Name of the game, nature of the beast.

Not the exact words in Trump’s loud and thoughtless mouth, but the gist of the message that over the next 17 months he shouted to fairground crowd and camera in states red, white and blue. A fair enough share of his fellow citizens screamed, stamped and voted in agreement because what he was saying they knew to be true, knew it not as precept borrowed from the collected works of V.I. Lenin or Ralph Lauren but from their own downwardly mobile experience on the losing side of a class war waged over the past 40 years by America’s increasingly frightened and selfish rich against its increasingly angry and debtbound poor.

Trump didn’t need briefing papers to refine the message. He presented it live and in person, an unscripted and overweight canary flown from its gilded cage, telling it like it is when seen from the perch of the haves looking down on the birdseed of the have-nots. Had he time or patience for looking into books instead of mirrors, he could have sourced his wisdom to Supreme Court Justice Louis Brandeis, who in 1933 presented the case for Franklin D. Roosevelt’s New Deal: “We must make our choice. We may have democracy, or we may have wealth concentrated in the hands of a few, but we can’t have both.”

Not that it would have occurred to Trump to want both, but he might have been glad to know the Supreme Court had excused him from further study under the heading of politics. In the world according to Trump—as it was in the worlds according to Ronald Reagan, George Bush père et fils, Bill Clinton and Barack Obama—the concentration of wealth is the good, the true and the beautiful. Democracy is for losers.

Ronald Reagan was elected President in 1980 with an attitude and agenda similar to Trump’s—to restore America to its rightful place where “someone can always get rich.” His administration arrived in Washington firm in its resolve to uproot the democratic style of feeling and thought that underwrote FDR’s New Deal. What was billed as the Reagan Revolution and the dawn of a New Morning in America recruited various parties of the dissatisfied right (conservative, neoconservative, libertarian, reactionary and evangelical) under one flag of abiding and transcendent truth—money ennobles rich people, making them healthy, wealthy and wise; money corrupts poor people, making them ignorant, lazy and sick.

Re-branded as neoliberalism in the 1990s, the doctrine of enlightened selfishness has served as the wisdom in political and cultural office ever since Reagan stepped onto the White House stage promising a happy return to an imaginary American past—to the home on the range made safe from Apaches by John Wayne, an America once again cowboy-hatted and standing tall, risen from the ashes of defeat in Vietnam, cleansed of its Watergate impurities, outspending the Russians on weapons of mass destruction, releasing the free market from the prison of government regulation, going long on the private good, selling short the public good.

For 40 years under administrations Republican and Democrat, the concentrations of wealth and power have systematically shuffled public land and light and air into a private purse, extended the reach of corporate monopoly, shifted the bulk of the nation’s income to its top-tier fatted calves, let fall into disrepair nearly all the infrastructure—roads, water systems, schools, bridges, hospitals and power plants—that provides a democratic commonwealth with the means of production for its mutual enterprise. The subdivision of America the Beautiful into a nation of the rich and a nation of the poor has outfitted a tenth of the population with three-quarters of the nation’s wealth. The work in progress has been accompanied by the construction of a national security and surveillance state backed by the guarantee of never-ending foreign war and equipped with increasingly repressive police powers to quiet the voices of domestic discontent.

In the 1950s the word public indicated a common good (public health, public school, public service, public spirit); private was a synonym for selfishness and greed (plutocrats in top hats, pigs at troughs). The connotations traded places in the 1980s: private came to be associated with all things bright and beautiful (private trainer, private school, private plane), public became a synonym for all things ugly, incompetent and unclean (public housing, public welfare, public toilet). (...)

The framers of the Constitution, prosperous and well-educated gentlemen assembled in Philadelphia in the summer of 1787, shared with John Adams the suspicion that “democracy will infallibly destroy all civilization,” agreed with James Madison that the turbulent passions of the common man lead to “reckless agitation” for the abolition of debts and “other wicked projects.” With Plato the framers shared the assumption that the best government incorporates the means by which a privileged few arrange the distribution of property and law for the less fortunate many. They envisioned an enlightened oligarchy to which they gave the name of a republic. Adams thought “the great functions of state” should be reserved for “the rich, the well-born, and the able,” the new republic to be managed by men to whom Madison attributed “most wisdom to discern and most virtue to pursue the common good of the society.” (...)

But unlike our present-day makers of money and law, the founders were not stupefied plutocrats. They knew how to read and write (in Latin or French if not also in Greek) and they weren’t preoccupied with the love and fear of money. From their reading of history they understood that oligarchy was well-advised to furnish democracy with some measure of political power because the failure to do so was apt to lead to their being roasted on pitchforks. Accepting of the fact that whereas democracy puts a premium on equality, a capitalist economy does not, the founders looked to balance the divergent ways and means, to accommodate both motions of the heart and the movement of a market. They conceived the Constitution as both organism and mechanism and offered as warranty for its worth the character of men presumably relieved of the necessity to cheat and steal and lie.

The presumption in 1787 could be taken at fair and face value. The framers were endowed with the intellectual energy of the 18th-century Enlightenment, armed with the moral force of the Christian religion. Their idea of law they held to be sacred, a marriage of faith and reason. But good intentions are a perishable commodity, and even the best of oligarchies bear comparison to cheese. Sooner or later they turn rancid in the sun. Wealth accumulates, men decay; a band of brothers that once aspired to form a wise and just government acquires the character of what Aristotle likened to that of “the prosperous fool,” a class of men insatiable in their appetite for more—more banquets, more laurel wreaths and naval victories, more temples, dancing girls and portrait busts—so intoxicated by the love of money “they therefore imagine there is nothing it cannot buy.” (...)

All men were maybe equal in the eye of God, but not in the pews in Boston’s Old North Church, in the streets of Benjamin Franklin’s Philadelphia, in the fields at Jefferson’s Monticello. The Calvinist doctrine of predestination divided the Massachusetts flock of Christian sheep into damned and saved; Cotton Mather in 1696 reminded the servants in his midst, “You are the animate, separate passive instruments of other men . . . your tongues, your hands, your feet, are your masters’ and they should move according to the will of your masters.” Franklin, enlightened businessman and founder of libraries, looked upon the Philadelphia rabble as coarse material that maybe could be brushed and combed into an acceptable grade of bourgeois broadcloth. His Poor Richard’s Almanac offered a program for turning sow’s ears if not into silk purses, then into useful tradesmen furnished with a “happy mediocrity.” For poor white children in Virginia, Jefferson proposed a scheme he described as “raking from the rubbish” the scraps of intellect and talent worth the trouble of further cultivation. A few young illiterates who showed promise as students were allowed to proceed beyond the elementary grades; the majority were released into a wilderness of ignorance and poverty, dispersed over time into the westward moving breeds of an American underclass variously denominated as “mudsill,” “hillbilly,” “cracker,” “Okie,” “redneck,” Hillary Clinton’s “basket of deplorables.”

Nor at any moment in its history has America declared a lasting peace between the haves and have-nots. Temporary cessations of hostilities, but no permanent closing of the moral and social frontier between debtor and creditor. The notion of a classless society derives its credibility from the relatively few periods in the life of the nation during which circumstances encouraged social readjustment and experiment—in the 1830s, 1840s, and 1850s, again in the 1940s, 1950s and 1960s—but for the most part the record will show the game securely rigged in favor of the rich, no matter how selfish or stupid, at the expense of the poor, no matter how innovative or entrepreneurial. During the last 30 years of the 19th century and the first 30 years of the 20th, class conflict furnished the newspaper mills with their best-selling headlines—railroad company thugs quelling labor unrest in the industrial East, the Ku Klux Klan lynching Negroes in the rural South, the U.S. army exterminating Sioux Indians on the Western plains.

Around the turn of the 20th century the forces of democracy pushed forward an era of progressive reform sponsored by both the Republican president, Theodore Roosevelt, and the Democratic president, Woodrow Wilson. During the middle years of the 20th century America at times showed some semblance of the republic envisioned by its 18th-century founders—Franklin D. Roosevelt’s New Deal, a citizen army fighting World War II, the Great Depression replaced with a fully employed economy in which all present shared in the profits.

The civil rights and anti-Vietnam war protests in the 1960s were expressions of democratic objection and dissent intended to reform the country’s political thought and practice, not to overthrow its government. Nobody was threatening to reset the game clock in the Rose Bowl, tear down Grand Central Terminal or remove the Lincoln Memorial. The men, women and children confronting racist tyranny in the South—sitting at a lunch counter in Alabama, riding a bus into Mississippi, going to school in Arkansas—risked their lives and sacred honor on behalf of a principle, not a lifestyle; for a government of laws, not men. The unarmed rebellion led to the enactment in the mid-1960s of the Economic Opportunity Act, the Voting Rights Act, the Medicare and Medicaid programs, eventually to the shutting down of the Vietnam War.

Faith in democracy survived the assassination of President John F. Kennedy in 1963; it didn’t survive the assassinations of Robert Kennedy and Martin Luther King in 1968. The 1960s and 1970s gave rise to a sequence of ferocious and destabilizing change—social, cultural, technological, sexual, economic and demographic—that tore up the roots of family, community and church from which a democratic society draws meaning and strength. The news media promoted the multiple wounds to the body politic (the murders of King and Kennedy, big-city race riots, the killing of college students at Kent State and Jackson State, crime in the streets of Los Angeles, Chicago and Newark) as revolution along the line of Robespierre’s reign of terror. The fantasy of armed revolt sold papers, boosted ratings, stimulated the demand for heavy surveillance and repressive law enforcement that over the last 50 years has blossomed into the richest and most innovative of the nation’s growth industries.

By the end of the 1970s democracy had come to be seen as a means of government gone soft in the head and weak in the knees, no match for unscrupulous Russians, incapable of securing domestic law and order, unable to disperse the barbarians (foreign and native born) at the gates of the gated real estate in Beverly Hills, Westchester County and Palm Beach. The various liberation movements still in progress no longer sought to right the wrongs of government. The political was personal, the personal political. Seized by the appetite for more—more entitlements, privileges and portrait busts—plaintiffs for both the haves and the have-nots agitated for a lifestyle, not a principle. The only constitutional value still on the table was the one constituting freedom as property, property as freedom. A fearful bourgeois society adrift in a sea of troubles was clinging to its love of money as if to the last lifeboat rowing away from the Titanic when Ronald Reagan in 1980 stepped onto the stage of the self-pitying national melodrama with the promise of an America to become great again in a future made of gold.

by Lewis Lapham, LitHub |  Read more:
Image: Detail from Jasper Johns 'White Flag'

America’s Epidemic of Empty Churches

Three blocks from my Brooklyn apartment, a large brick structure stretches toward heaven. Tourists recognize it as a church—the building’s bell tower and stained-glass windows give it away—but worshippers haven’t gathered here in years.

The 19th-century building was once known as St. Vincent De Paul Church and housed a vibrant congregation for more than a century. But attendance dwindled and coffers ran dry by the early 2000s. Rain leaked through holes left by missing shingles, a tree sprouted in the bell tower, and the Brooklyn diocese decided to sell the building to developers. Today, the Spire Lofts boasts 40 luxury apartments with one-bedroom units renting for as much as $4,812 per month. It takes serious cash to make God’s house your own, apparently.

Many of our nation’s churches can no longer afford to maintain their structures—between 6,000 and 10,000 churches die each year in America—and that number will likely grow. Though more than 70 percent of our citizens still claim to be Christian, congregational participation is less central to many Americans’ faith than it once was. Most denominations are declining as a share of the overall population, and donations to congregations have been falling for decades. Meanwhile, religiously unaffiliated Americans, nicknamed the “nones,” are growing as a share of the U.S. population.

Any minister can tell you that the two best predictors of a congregation’s survival are “budgets and butts,” and American churches are struggling by both metrics. As donations and attendance decrease, the cost of maintaining large physical structures that are only in use a few hours a week by a handful of worshippers becomes prohibitive. None of these trends show signs of slowing, so the United States’s struggling congregations face a choice: start packing or find a creative way to stay afloat.

Closure and adaptive reuse often seems like the simplest and most responsible path. Many houses of worship sit on prime real estate, often in the center of towns or cities where inventory is low. Selling the property to the highest bidder is a quick and effective way to cut losses and settle debts. But repurposing a sacred space for secular use has a number of drawbacks. There are zoning issues, price negotiations, and sometimes fierce pushback from the surrounding community and the parish’s former members.

by Jonathan Merritt, The Atlantic | Read more:
Image: Carlos Barria/Reuters
[ed. I wonder at what point they lose their tax exempt status? The article doesn't say.]

Saturday, November 24, 2018

In Praise of Mediocrity

I’m a little surprised by how many people tell me they have no hobbies. It may seem a small thing, but — at the risk of sounding grandiose — I see it as a sign of a civilization in decline. The idea of leisure, after all, is a hard-won achievement; it presupposes that we have overcome the exigencies of brute survival. Yet here in the United States, the wealthiest country in history, we seem to have forgotten the importance of doing things solely because we enjoy them.

Yes, I know: We are all so very busy. Between work and family and social obligations, where are we supposed to find the time?

But there’s a deeper reason, I’ve come to think, that so many people don’t have hobbies: We’re afraid of being bad at them. Or rather, we are intimidated by the expectation — itself a hallmark of our intensely public, performative age — that we must actually be skilled at what we do in our free time. Our “hobbies,” if that’s even the word for them anymore, have become too serious, too demanding, too much an occasion to become anxious about whether you are really the person you claim to be.

If you’re a jogger, it is no longer enough to cruise around the block; you’re training for the next marathon. If you’re a painter, you are no longer passing a pleasant afternoon, just you, your watercolors and your water lilies; you are trying to land a gallery show or at least garner a respectable social media following. When your identity is linked to your hobby — you’re a yogi, a surfer, a rock climber — you’d better be good at it, or else who are you?

Lost here is the gentle pursuit of a modest competence, the doing of something just because you enjoy it, not because you are good at it. Hobbies, let me remind you, are supposed to be something different from work. But alien values like “the pursuit of excellence” have crept into and corrupted what was once the realm of leisure, leaving little room for the true amateur. The population of our country now seems divided between the semipro hobbyists (some as devoted as Olympic athletes) and those who retreat into the passive, screeny leisure that is the signature of our technological moment.

I don’t deny that you can derive a lot of meaning from pursuing an activity at the highest level. I would never begrudge someone a lifetime devotion to a passion or an inborn talent. There are depths of experience that come with mastery. But there is also a real and pure joy, a sweet, childlike delight, that comes from just learning and trying to get better. Looking back, you will find that the best years of, say, scuba-diving or doing carpentry were those you spent on the learning curve, when there was exaltation in the mere act of doing.

by Tim Wu, NY Times |  Read more:
Image: markk

‘The Academy Is Largely Itself Responsible for Its Own Peril’

The book was supposed to end with the inauguration of Barack Obama. That was Jill Lepore’s plan when she began work in 2015 on her new history of America, These Truths (W.W. Norton). She had arrived at the Civil War when Donald J. Trump was elected. Not to alter the ending, she has said, would have felt like "a dereliction of duty as a historian."

These Truths clocks in at 789 pages (nearly 1,000 if you include the notes and index). It begins with Christopher Columbus and concludes with you-know-who. But the book isn’t a compendium; it’s an argument. The American Revolution, Lepore shows, was also an epistemological revolution. The country was built on truths that are self-evident and empirical, not sacred and God-given. "Let facts be submitted to a candid world," Thomas Jefferson wrote in the Declaration of Independence. Now, it seems, our faith in facts has been shaken. These Truths traces how we got here.

Lepore occupies a rarefied perch in American letters. She is a professor at Harvard University and a staff writer at The New Yorker. She has written books about King Philip’s War, Wonder Woman, and Jane Franklin, sister of Benjamin Franklin. She even co-wrote an entire novel in mock 18th-century prose. The Princeton historian Sean Wilentz has said of Lepore: "More successfully than any other American historian of her generation, she has gained a wide general readership without compromising her academic standing."

Lepore spoke with The Chronicle Review about how the American founding inaugurated a new way of thinking, the history of identity politics, and whether she's tired of people asking about her productivity. (...)

Q. America’s founding marked not only a new era of politics, but also a new way of thinking.

A. I call the book These Truths to invoke those truths in the Declaration of Independence that Jefferson describes, with the revision provided by Franklin, as "self-evident" — political equality, natural rights, and the sovereignty of the people. But I’m also talking about an unstated fourth truth, which is inquiry itself. Anyone who has spent time with the founding documents and the political and intellectual history in which they were written understands that the United States was founded quite explicitly as a political experiment, an experiment in the science of politics. It was always going to be subject to scrutiny. That scrutiny is done not from above by some commission, but by the citizenry itself.

Q. For democracy to work, of course, the people must be well informed. Yet we live in an age of epistemological mayhem. How did the relationship between truth and fact come unwound?

A. I spend a lot of time in the book getting it wound, to be fair. There’s an incredibly rich scholarship on the history of evidence, which traces its rise in the Middle Ages in the world of law, its migration into historical writing, and then finally into the realm that we’re most familiar with, journalism. That’s a centuries-long migration of an idea that begins in a very particular time and place, basically the rise of trial by jury starting in 1215. We have a much better vantage on the tenuousness of our own grasp of facts when we understand where facts come from.

The larger epistemological shift is how the elemental unit of knowledge has changed. Facts have been devalued for a long time. The rise of the fact was centuries ago. Facts were replaced by numbers in the 18th and 19th centuries as the higher-status unit of knowledge. That’s the moment at which the United States is founded as a demographic democracy. Now what’s considered to be most prestigious is data. The bigger the data, the better.

That transformation, from facts to numbers to data, traces something else: the shifting prestige placed on different ways of knowing. Facts come from the realm of the humanities, numbers represent the social sciences, and data the natural sciences. When people talk about the decline of the humanities, they are actually talking about the rise and fall of the fact, as well as other factors. When people try to re-establish the prestige of the humanities with the digital humanities and large data sets, that is no longer the humanities. What humanists do comes from a different epistemological scale of a unit of knowledge.

Q. How is the academy implicated in or imperiled by this moment of epistemological crisis?

A. The academy is largely itself responsible for its own peril. The retreat of humanists from public life has had enormous consequences for the prestige of humanistic ways of knowing and understanding the world.

Universities have also been complicit in letting sources of federal government funding set the intellectual agenda. The size and growth of majors follows the size of budgets, and unsurprisingly so. After World War II, the demands of the national security state greatly influenced the exciting fields of study. Federal-government funding is still crucial, but now there’s a lot of corporate money. Whole realms of knowing are being brought to the university through commerce.

I don’t expect the university to be a pure place, but there are questions that need to be asked. If we have a public culture that suffers for lack of ability to comprehend other human beings, we shouldn’t be surprised. The resources of institutions of higher learning have gone to teaching students how to engineer problems rather than speak to people. (...)

Q. The last chapter of These Truths is titled "America, Disrupted," and it traces the rise of ideas from the tech world, like innovation. You point out that innovation was traditionally seen as something to be wary of.

A. It’s true that the last chapter is about disruptive innovation, but it’s also true that the book starts with the history of writing as a technology. Reading "America, Disrupted" in isolation might seem like I have some beef with Silicon Valley — which may or may not be the case — but reading that chapter after the 15 that come before makes it clear that what I have is a deep and abiding interest in technology and communication.

Innovation as an idea in America is historically a negative thing. Innovation in politics is what is to be condemned: To experiment recklessly with a political arrangement is fatal to our domestic tranquillity. So there’s a lot of anti-innovation language around the founding, especially because Republicanism — Jeffersonianism — is considered excessively innovative. Innovation doesn’t assume its modern sense until the 1930s, and then only in a specialized literature.

Disruption has a totally different history. It’s a way to avoid the word "progress," which, even when it’s secularized, still implies some kind of moral progress. Disruption emerges in the 1990s as progress without any obligation to notions of goodness. And so "disruptive innovation," which became the buzzword of change in every realm in the first years of the 21st century, including higher education, is basically destroying things because we can and because there can be money made doing so. Before the 1990s, something that was disruptive was like the kid in the class throwing chalk. And that’s what disruptive innovation turned out to really mean. A little less disruptive innovation is called for.

by Evan Goldstein, Chronicle of Higher Education | Read more:
Image: Kayana Szymczak, The New York Times, Redux

Friday, November 23, 2018

Who Cares? On Nags, Martyrs, the Women Who Give Up, and the Men Who Don’t Get It

“Just let me do it,” I told Rob as I watched him struggle to fold our daughter’s fitted sheet shortly after he took over laundry duty. It’s a phrase I’m sure he’s heard from me countless times, and even when I’m not saying it out loud, I’ve often implied it with a single you’re-doing-it-wrong stare. I cannot pretend that I have not played a part in creating such a deep divide in the emotional labor expectations in my home. I want things done a certain way, and any deviation from my way can easily result in me taking over. If the dishwasher is loaded wrong, I take it back on instead of trying to show my husband how to load it. If the laundry isn’t folded correctly, I’ll decide to simply do it myself. On occasion I have found myself venting with friends that it is almost as if our male partners are purposefully doing things wrong so they won’t have to take on more work at home.

While I don’t think this has been the case in my own home, for some women this is a reality. A 2011 survey in the UK found that 30 percent of men deliberately did a poor job on domestic duties so that they wouldn’t be asked to do the job again in the future. They assumed that their frustrated partners would find it easier to do the job themselves than deal with the poor results of their half-hearted handiwork. And they were right. A full 25 percent of the men surveyed said they were no longer asked to help around the house, and 64 percent were only asked to pitch in occasionally (i.e., as a last resort).

Even if men aren’t consciously doing a poor job to get out of housework, their lackluster “help” still frustrates. A similar survey conducted by Sainsbury’s in the UK found that women spent a whole three hours per week, on average, redoing chores they had delegated to their partners. The list where men fell short left little ground uncovered: doing the dishes, making the bed, doing the laundry, vacuuming the floors, arranging couch cushions, and wiping down counters were all areas of complaint. Two-thirds of the women polled felt convinced that this was their partner’s best effort, so perhaps it’s not surprising that more than half didn’t bother “nagging” them to do better. They simply followed their partners around and cleaned up after them.

The ways in which women cling to maintaining rigid standards are what sociologists call “maternal gatekeeping,” and what we refer to, pre-baby, as simply “perfectionism.” We actively discourage men from becoming full partners at home, because we truly believe we can do everything better, faster, more efficiently than everyone else. Because we are the ones who control all the aspects of home and life organization for our families and especially our children, we become convinced that our way is the only way. We are hesitant to adjust our personal expectations, especially because we have put so much work into caring about our household systems. We’ve carefully considered how to best keep everyone comfortable and happy, so it seems natural that everyone should conform to the best-thought-out plan available: ours.

This thinking is consistently reinforced by a culture that tells us that we should hold ourselves to this higher standard. That if we don’t strive toward perfectionism, we are failing as women. We feel as if we are letting our families down, we are letting womankind down, we are letting ourselves down when we don’t perform emotional labor in the most intense possible way. Yet this level of perfectionism can be exhausting, and it dissuades those men who would help from even trying. Instead of assuming that men can hold down the fort while we are out of town, we leave a veritable handbook on how they should best care for their own children. Dufu writes in her book that she once wrote a list for her husband titled “Top Ten Tips for Traveling with Kofi,” which included, among other things, a reminder to feed their child. I have left freezer meals and detailed instructions for my husband on how to feed himself when I am out of town so he doesn’t wander into the grocery store and spend $200 on two days’ worth of food, instead of involving him in the process of meal planning so he could take it on himself. It’s not just society but also my maternal gatekeeping that contributes to the mental load I’ve taken on. I don’t leave room for mistakes, and because of that, I don’t leave room for progress. Then again, when I do, I’ve been let down.
***
We had both been warned by my oral surgeon that my wisdom tooth extraction was likely going to put me down for a few days, but instead of the intense prep I would normally do ahead of time, I assumed my husband would take over what I couldn’t do. He’d been slowly but surely picking up his share of emotional labor since my Harper’s Bazaar article had appeared three months earlier. He seemed ready to take on the type of full day I would have put in before he was laid off. The day of the surgery, I felt mostly fine immediately afterward. I took my pain pills but was moving around, had minimal swelling, and spent the evening going over the plans for the next day with Rob. I had worked with our son on his homework, but there was still one page that needed to be finished in the morning. He was allowed to bring in a Game Boy for the special “electronics day” their class had earned. Our daughter needed to go to preschool at 8:30 a.m., but her needs were simple — get her dressed, brush her hair, fill her water bottle. Our son had the option of hot lunch if the morning got out of hand, and I encouraged Rob to use it but just remember to pack him a snack. He had been around and helping with the morning routine for weeks since his layoff. I assumed he could do it alone just this once, though we both thought he wouldn’t have to. After all, I was fine.

Well, I was fine until 11:45 p.m., when I woke up crying and frantically scrambling for pain pills. The left side of my face had swollen to the size of a baseball, and I spent hours awake in excruciating pain.

When morning came, the situation was even worse, and I could barely function. Rob woke me at 8:30 a.m. to tell me he was taking our daughter to school along with our youngest. Our six-year-old would have to be walked to school in half an hour. I set an alarm on my phone in case I dozed off, and our son came into the room and talked with me. I asked him if he had everything ready — his lunch, his clothes, his homework. He said yes, and I lay back relieved. I was barely able to get myself out of bed to walk him to school and found myself resenting the fact that his dad hadn’t thought to take all of them to drop off like I had done when he was working. My face throbbed with pain as I slipped on shoes and a jacket, then instructed our son to do the same.

Then I came into the living room at the moment we had to leave and realized that my six-year-old had been wrong. His homework hadn’t been done or checked. His lunch hadn’t been packed. He didn’t have a snack or fresh water. He didn’t have an electronic device to bring to school for their special day. Now not only was I suffering the guilt of not getting him ready, but he would have to suffer the consequences of no one helping him. He would have to stay in at recess to complete his homework. He wouldn’t get the thirty minutes of electronic time his friends would have. I was able to grab an orange and throw it in his backpack for a snack, but it was too late for the rest of it. Even though my husband had been the one on duty for the morning, I was the one left with the guilt of taking my son to school ill prepared. I felt like I should have better prepared my husband to take over for me. I should have implemented my system better. If letting Rob take over was going to mean my kids’ needs falling through the cracks, I wasn’t here for it. I needed a better option, and that better option seemed to be doing things my way.

When I later brought up the morning mishap with Rob, he felt guilty also, but not in the way I had. He was able to acknowledge the problem, say he was sorry, and move on. He didn’t beat himself up over his mistake in the way I was beating myself up for not hovering more diligently. Parenting mistakes aren’t a moral failing for him like they are for me. Dads get the at-least-he’s-trying pat on the back when people see them mess up. Moms get the eye rolls and judgment. Everything that happened that morning was still “my fault,” because I wasn’t living up to the standard I should set for myself as a mom: the standard of perfection.

I was still expected to be the one in charge, even when I was incapacitated, because isn’t that just what moms are supposed to do? He wasn’t expected to have the morning routine locked down. He was still a dad — still exempt from judgment. Despite now being the at-home parent, at least for the time being, it still wasn’t his primary job or responsibility. It was mine, just as it had always been. I was trying to treat my husband as an equal partner. I was trying to let go of control, or adjust my expectations, or compromise my standards, but we kept coming up short. We kept missing that elusive balance, and more frustratingly, I was the only one who felt bad about it. I was the one who cared.

by Gemma Hartley, Longreads |  Read more:
Image: Katie Kosma
[ed. Yikes.]

Beijing to Judge Every Resident Based on Behavior by End of 2020

China’s plan to judge each of its 1.3 billion people based on their social behavior is moving a step closer to reality, with Beijing set to adopt a lifelong points program by 2021 that assigns personalized ratings for each resident.

The capital city will pool data from several departments to reward and punish some 22 million citizens based on their actions and reputations by the end of 2020, according to a plan posted on the Beijing municipal government’s website on Monday. Those with better so-called social credit will get “green channel” benefits while those who violate laws will find life more difficult.

The Beijing project will improve blacklist systems so that those deemed untrustworthy will be “unable to move even a single step,” according to the government’s plan. Xinhua reported on the proposal Tuesday, while the report posted on the municipal government’s website is dated July 18.

China has long experimented with systems that grade its citizens, rewarding good behavior with streamlined services while punishing bad actions with restrictions and penalties. Critics say such moves are fraught with risks and could lead to systems that reduce humans to little more than a report card.

Ambitious Plan

Beijing’s efforts represent the most ambitious yet among more than a dozen cities that are moving ahead with similar programs.

Hangzhou rolled out its personal credit system earlier this year, rewarding “pro-social behaviors” such as volunteer work and blood donations while punishing those who violate traffic laws and charge under-the-table fees. By the end of May, people with bad credit in China had been blocked from booking more than 11 million flights and 4 million high-speed train trips, according to the National Development and Reform Commission.

According to the Beijing government’s plan, different agencies will link databases to get a more detailed picture of every resident’s interactions across a swathe of services. The proposal calls for agencies including tourism bodies, business regulators and transit authorities to work together.

The tracking of individual behavior in China has become easier as economic life moves online, with apps such as Tencent’s WeChat and Ant Financial’s Alipay serving as central nodes for making payments, getting loans and organizing transport. Accounts are generally linked to mobile phone numbers, which in turn require government IDs.

by Claire Che, David Ramli, and Dandan Li, Bloomberg | Read more:
Image: Anthony Kwan/Bloomberg

Wednesday, November 21, 2018


Jean-Michel Basquiat, Two heads on Gold
via:

Thinking About Thinking - the CIA Guide

"When we speak of improving the mind we are usually referring to the acquisition of information or knowledge, or to the type of thoughts one should have, and not to the actual functioning of the mind. We spend little time monitoring our own thinking and comparing it with a more sophisticated ideal."

When we speak of improving intelligence analysis, we are usually referring to the quality of writing, types of analytical products, relations between intelligence analysts and intelligence consumers, or organization of the analytical process. Little attention is devoted to improving how analysts think.

Thinking analytically is a skill like carpentry or driving a car. It can be taught, it can be learned, and it can improve with practice. But like many other skills, such as riding a bike, it is not learned by sitting in a classroom and being told how to do it. Analysts learn by doing. Most people achieve at least a minimally acceptable level of analytical performance with little conscious effort beyond completing their education. With much effort and hard work, however, analysts can achieve a level of excellence beyond what comes naturally.

Regular running enhances endurance but does not improve technique without expert guidance. Similarly, expert guidance may be required to modify long-established analytical habits to achieve an optimal level of analytical excellence. An analytical coaching staff to help young analysts hone their analytical tradecraft would be a valuable supplement to classroom instruction.

One key to successful learning is motivation. Some of CIA's best analysts developed their skills as a consequence of experiencing analytical failure early in their careers. Failure motivated them to be more self-conscious about how they do analysis and to sharpen their thinking process. (...)

Herbert Simon first advanced the concept of "bounded" or limited rationality. Because of limits in human mental capacity, he argued, the mind cannot cope directly with the complexity of the world. Rather, we construct a simplified mental model of reality and then work with this model. We behave rationally within the confines of our mental model, but this model is not always well adapted to the requirements of the real world. The concept of bounded rationality has come to be recognized widely, though not universally, both as an accurate portrayal of human judgment and choice and as a sensible adjustment to the limitations inherent in how the human mind functions.

Much psychological research on perception, memory, attention span, and reasoning capacity documents the limitations in our "mental machinery" identified by Simon. Many scholars have applied these psychological insights to the study of international political behavior. A similar psychological perspective underlies some writings on intelligence failure and strategic surprise.

This book differs from those works in two respects. It analyzes problems from the perspective of intelligence analysts rather than policymakers. And it documents the impact of mental processes largely through experiments in cognitive psychology rather than through examples from diplomatic and military history.

A central focus of this book is to illuminate the role of the observer in determining what is observed and how it is interpreted. People construct their own version of "reality" on the basis of information provided by the senses, but this sensory input is mediated by complex mental processes that determine which information is attended to, how it is organized, and the meaning attributed to it. What people perceive, how readily they perceive it, and how they process this information after receiving it are all strongly influenced by past experience, education, cultural values, role requirements, and organizational norms, as well as by the specifics of the information received.

This process may be visualized as perceiving the world through a lens or screen that channels and focuses and thereby may distort the images that are seen. To achieve the clearest possible image of China, for example, analysts need more than information on China. They also need to understand their own lenses through which this information passes. These lenses are known by many terms--mental models, mind-sets, biases, or analytical assumptions. (...)

Not enough training is focused in this direction--that is, inward toward the analyst's own thought processes. Training of intelligence analysts generally means instruction in organizational procedures, methodological techniques, or substantive topics. More training time should be devoted to the mental act of thinking or analyzing. It is simply assumed, incorrectly, that analysts know how to analyze. This book is intended to support training that examines the thinking and reasoning processes involved in intelligence analysis.

As discussed in the next chapter, mind-sets and mental models are inescapable. They are, in essence, a distillation of all that we think we know about a subject. The problem is how to ensure that the mind remains open to alternative interpretations in a rapidly changing world.

The disadvantage of a mind-set is that it can color and control our perception to the extent that an experienced specialist may be among the last to see what is really happening when events take a new and unexpected turn. When faced with a major paradigm shift, analysts who know the most about a subject have the most to unlearn. This seems to have happened before the reunification of Germany, for example. Some German specialists had to be prodded by their more generalist supervisors to accept the significance of the dramatic changes in progress toward reunification of East and West Germany.

The advantage of mind-sets is that they help analysts get the production out on time and keep things going effectively between those watershed events that become chapter headings in the history books.

A generation ago, few intelligence analysts were self-conscious and introspective about the process by which they did analysis. The accepted wisdom was the "common sense" theory of knowledge--that to perceive events accurately it was necessary only to open one's eyes, look at the facts, and purge oneself of all preconceptions and prejudices in order to make an objective judgment.

Today, there is greatly increased understanding that intelligence analysts do not approach their tasks with empty minds. They start with a set of assumptions about how events normally transpire in the area for which they are responsible. Although this changed view is becoming conventional wisdom, the Intelligence Community has only begun to scratch the surface of its implications.

If analysts' understanding of events is greatly influenced by the mind-set or mental model through which they perceive those events, should there not be more research to explore and document the impact of different mental models?

The reaction of the Intelligence Community to many problems is to collect more information, even though analysts in many cases already have more information than they can digest. What analysts need is more truly useful information--mostly reliable HUMINT from knowledgeable insiders--to help them make good decisions. Or they need a more accurate mental model and better analytical tools to help them sort through, make sense of, and get the most out of the available ambiguous and conflicting information.

by Richards J. Heuer, Jr., U.S. Central Intelligence Agency |  Read more:

Subprime Rises: Credit Card Delinquencies Blow Through Financial-Crisis Peak at the 4,705 Smaller US Banks

In the third quarter, the “delinquency rate” on credit-card loan balances at commercial banks other than the largest 100 banks – so the delinquency rate at the 4,705 smaller banks in the US – spiked to 6.2%. This exceeds the peak during the Financial Crisis for these banks (5.9%).

The credit-card “charge-off rate” at these banks, at 7.4% in the third quarter, has now been above 7% for five quarters in a row. During the peak of the Financial Crisis, the charge-off rate for these banks was above 7% for four quarters, though not in a row, with a peak of 8.9%.

These numbers that the Federal Reserve Board of Governors reported Monday afternoon are like a cold shower in consumer land where debt levels are considered to be in good shape. But wait… it gets complicated.

The credit-card delinquency rate at the largest 100 commercial banks was 2.48% (not seasonally adjusted). These 100 banks, due to their sheer size, carry the lion’s share of credit card loans, and this caused the overall credit-card delinquency rate for all commercial banks combined to tick up to a still soothing 2.54%.
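[ed. A rough way to see how much the big banks dominate that average: treat the overall rate as a balance-weighted average of the two groups and solve for the implied weight. The rates in the sketch below are the ones quoted above; the weighted-average model, and therefore the implied share, is only an illustration, since seasonal-adjustment and rounding differences mean it won't match the actual balance split exactly.]

```python
# Delinquency rates quoted in the article (percent).
top100_rate = 2.48    # largest 100 commercial banks
small_rate = 6.2      # the 4,705 smaller banks
overall_rate = 2.54   # all commercial banks combined

# Assume overall = w * top100 + (1 - w) * small, then solve for w,
# the implied share of credit-card balances held by the largest 100 banks.
w = (overall_rate - small_rate) / (top100_rate - small_rate)
print(f"Implied balance share of the largest 100 banks: {w:.1%}")  # roughly 98%
```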

In other words, the overall banking system is not at risk, the megabanks are not at risk, and no bailouts are needed. But the most vulnerable consumers – we’ll get to why they may end up at smaller banks – are falling apart:


Credit card balances are deemed “delinquent” when they’re 30 days or more past due. Balances are removed from the delinquency basket when the customer cures the delinquency, or when the bank charges off the delinquent balance. The rate is figured as a percent of total credit card balances. In other words, among the smaller banks in Q3, 6.2% of the outstanding credit card balances were delinquent.
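[ed. A minimal worked example of that definition, with invented balance figures chosen only to mirror the 6.2% cited above; they are not the Fed's underlying data.]

```python
# Hypothetical figures for illustration only.
total_balances = 100_000_000_000      # total credit-card balances, in dollars
delinquent_balances = 6_200_000_000   # balances 30 or more days past due

delinquency_rate = delinquent_balances / total_balances * 100
print(f"Delinquency rate: {delinquency_rate:.1f}%")  # -> 6.2%
```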

So what’s going on here?

The credit card business is immensely profitable, and so banks are willing to take some risks. It’s profitable for three reasons:

  • The fee the bank extracts from every transaction undertaken with its credit cards (merchant pays), even if the credit-card holder pays off the balance every month and never incurs any interest expense.
  • The fees the bank extracts from credit card holders, such as annual fees, late fees, etc.
  • The huge spread between the banks’ cost of funding and the interest rates banks charge on credit cards.

So how low is the banks’ cost of funding? For example, in its third-quarter regulatory filing with the SEC (10-Q), Wells Fargo disclosed that it had $1.73 trillion in total “funding sources.” This amount was used to fund $1.73 trillion in “earning assets,” such as loans to its customers or securities it had invested in.

This $1.73 trillion in funding was provided mostly by deposits: $465 billion in non-interest-bearing deposits (free money), and $907 billion in interest-bearing deposits; for a total of $1.37 trillion of ultra-cheap funding from deposits.

In addition to its deposits, Wells Fargo lists $353 billion in other sources of funding – “short-term and long-term borrowing” – such as bonds it issued.

For all sources of funding combined, so on the $1.73 trillion, the “total funding cost” was 0.87%. Nearly free money. Rate hikes no problem.

In Q3, Wells Fargo’s credit-card balances outstanding carried an average interest rate of 12.77%!

So, with its cost of funding at 0.87%, and the average interest rate of 12.77% on its credit card balances, Wells Fargo is making an interest margin on credit cards of 11.9 percentage points. In other words, this is an immensely profitable business – hence the incessant credit-card promos.
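[ed. Retracing that arithmetic from the figures quoted above. This is just a sketch; the variable names and rounding are mine, not Wells Fargo's own accounting.]

```python
# Funding mix disclosed in Wells Fargo's Q3 10-Q, as quoted above (in dollars).
noninterest_deposits = 465e9   # non-interest-bearing deposits ("free money")
interest_deposits = 907e9      # interest-bearing deposits
other_borrowing = 353e9        # short-term and long-term borrowing

total_funding = noninterest_deposits + interest_deposits + other_borrowing
print(f"Total funding: ${total_funding / 1e9:,.0f} billion")  # ~ the $1.73 trillion cited

funding_cost = 0.87   # total funding cost, in percent
card_rate = 12.77     # average interest rate on credit-card balances, in percent

margin = card_rate - funding_cost
print(f"Interest margin on credit cards: {margin:.1f} percentage points")  # ~ 11.9
```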

With credit cards, the US banking system has split in two.

The largest banks can offer the most attractive incentives on their credit cards (cash-back, miles, etc.) and thus attract the largest pool of applicants. Then they can reject those with higher credit risks – having not yet forgotten the lesson from the last debacle.

The thousands of smaller banks cannot offer the same incentives and lack the marketing clout to attract this large pool of customers with good credit. So they market to customers with less stellar credit, or with subprime-rated credit — and charge higher interest rates. To the bank, 30% sounds like a deal, even if the customer will eventually buckle under that interest rate and have to default.

That’s why banks take the risks of higher charge-offs: They’re getting paid for them! But at some point, it gets expensive. And if it takes a smaller bank to the brink, the FDIC might swoop in on a Friday evening and shut it down. No biggie. Happens routinely.

The real problem with credit cards isn’t the banks – credit card debt is not big enough to topple the US banking system. It’s the consumers, and what it says about the health of consumers.

The overall numbers give a falsely calming impression. Credit card debt and other revolving credit has reached $1.0 trillion (not seasonally adjusted). This is about flat with the prior peak a decade ago.

Since the prior peak of credit-card debt in 2008, the US population has grown by 20 million people, and there has been a decade of inflation and nominal wage increases, and so the overall credit card burden per capita is far lower today than it was in 2008 (though student loans and auto loans have shot through the roof). So no problem?

But this overall data hides the extent to which the most vulnerable consumers are getting into trouble with their credit cards, having borrowed too much at usurious rates. They’ll never be able to pay off or even just service those balances. For them, there is only one way out – to default.

The fact that this process is now taking on real momentum — as demonstrated by delinquency rates spiking at smaller banks — shows that the group of consumers that are falling apart is expanding. And these are still the good times, of low unemployment in a growing economy.

by Wolf Richter, Wolf Street |  Read more:
Image: Wolf Street