Thursday, January 26, 2017

The Twilight of the Liberal World Order

The liberal world order established in the aftermath of World War II may be coming to an end, challenged by forces both without and within. The external challenges come from the ambition of dissatisfied large and medium-size powers to overturn the existing strategic order dominated by the United States and its allies and partners. Their aim is to gain hegemony in their respective regions. China and Russia pose the greatest challenges to the world order because of their relative military, economic, and political power and their evident willingness to use it, which makes them significant players in world politics and, just as important, because the regions where they seek strategic hegemony—Asia and Europe—historically have been critical to global peace and stability. At a lesser but still significant level, Iran seeks regional hegemony in the Middle East and Persian Gulf, which if accomplished would have a strategic, economic, and political impact on the international system. North Korea seeks control of the Korean peninsula, which if accomplished would affect the stability and security of northeast Asia. Finally, at a much lower level of concern, there is the effort by ISIS and other radical Islamist groups to establish a new Islamic caliphate in the Middle East. If accomplished, that, too, would have effects on the global order.

However, it is the two great powers, China and Russia, that pose the greatest challenge to the relatively peaceful and prosperous international order created and sustained by the United States. If they were to accomplish their aims of establishing hegemony in their desired spheres of influence, the world would return to the condition it was in at the end of the 19th century, with competing great powers clashing over inevitably intersecting and overlapping spheres of interest. These were the unsettled, disordered conditions that produced the fertile ground for the two destructive world wars of the first half of the 20th century. The collapse of the British-dominated world order on the oceans, the disruption of the uneasy balance of power on the European continent due to the rise of a powerful unified Germany, combined with the rise of Japanese power in East Asia all contributed to a highly competitive international environment in which dissatisfied great powers took the opportunity to pursue their ambitions in the absence of any power or group of powers to unite in checking them. The result was an unprecedented global calamity. It has been the great accomplishment of the U.S.-led world order in the 70 years since the end of the Second World War that this kind of competition has been held in check and great power conflicts have been avoided.

The role of the United States, however, has been critical. Until recently, the dissatisfied great and medium-size powers have faced considerable and indeed almost insuperable obstacles in achieving their objectives. The chief obstacle has been the power and coherence of the order itself and of its principal promoter and defender. The American-led system of political and military alliances, especially in the two critical regions of Europe and East Asia, has presented China and Russia with what Dean Acheson once referred to as “situations of strength” in their regions that have required them to pursue their ambitions cautiously and in most respects to defer serious efforts to disrupt the international system. The system has served as a check on their ambitions in both positive and negative ways. They have been participants in and for the most part beneficiaries of the open international economic system the United States created and helped sustain and, so long as that system was functioning, have had more to gain by playing in it than by challenging and overturning it. The same cannot be said of the political and strategic aspects of the order, both of which have worked to their detriment. The growth and vibrancy of democratic government in the two decades following the collapse of Soviet communism has posed a continual threat to the ability of rulers in Beijing and Moscow to maintain control, and since the end of the Cold War they have regarded every advance of democratic institutions, including especially the geographical advance close to their borders, as an existential threat—and with reason. The continual threat to the basis of their rule posed by the U.S.-supported order has made them hostile both to the order and to the United States. However, it has also been a source of weakness and vulnerability. Chinese rulers in particular have had to worry about what an unsuccessful confrontation with the United States might do to their sources of legitimacy at home. And although Vladimir Putin has to some extent used a calculated foreign adventurism to maintain his hold on domestic power, he has taken a more cautious approach when met with determined U.S. and European opposition, as in the case of Ukraine, and pushed forward, as in Syria, only when invited to do so by U.S. and Western passivity. Autocratic rulers in a liberal democratic world have had to be careful.

The greatest check on Chinese and Russian ambitions, however, has come from the combined military power of the United States and its allies in Europe and Asia. China, although increasingly powerful itself, has had to contemplate facing the combined military strength of the world’s superpower and some very formidable regional powers linked by alliance or common strategic interest, including Japan, India, and South Korea, as well as smaller but still potent nations like Vietnam and Australia. Russia has had to face the United States and its NATO allies. When united, these military powers present a daunting challenge to a revisionist power that can call on no allies of its own for assistance. Even were the Chinese to score an early victory in a conflict, they would have to contend over time with the combined industrial productive capacities of some of the world’s richest and most technologically advanced nations. A weaker Russia would face an even greater challenge.

Faced with these obstacles, the two great powers, as well as the lesser dissatisfied powers, have had to hope for or if possible engineer a weakening of the U.S.-supported world order from within. This could come about either by separating the United States from its allies, raising doubts about the U.S. commitment to defend its allies militarily in the event of a conflict, or by various means wooing American allies out from within the liberal world order’s strategic structure. For most of the past decade, the reaction of American allies to greater aggressiveness on the part of China and Russia in their respective regions, and to Iran in the Middle East, has been to seek more reassurance from the United States. Russian actions in Georgia, Ukraine, and Syria; Chinese actions in the East and South China seas; Iranian actions in Syria, Iraq, and along the littoral of the Persian Gulf—all have led to calls by American allies and partners for a greater commitment. In this respect, the system has worked as it was supposed to. What the political scientist William Wohlforth once described as the inherent stability of the unipolar order reflected this dynamic—as dissatisfied regional powers sought to challenge the status quo, their alarmed neighbors turned to the distant American superpower to contain their ambitions.

The system has depended, however, on will, capacity, and coherence at the heart of the liberal world order. The United States had to be willing and able to play its part as the principal guarantor of the order, especially in the military and strategic realm. The order’s ideological and economic core—the democracies of Europe and East Asia and the Pacific—had to remain relatively healthy and relatively confident. In such circumstances, the combined political, economic, and military power of the liberal world would be too great to be seriously challenged by the great powers, much less by the smaller dissatisfied powers.

In recent years, however, the liberal order has begun to weaken and fracture at the core. As a result of many related factors—difficult economic conditions, the recrudescence of nationalism and tribalism, weak and uncertain political leadership and unresponsive mainstream political parties, a new era of communications that seems to strengthen rather than weaken tribalism—there has emerged a crisis of confidence in what might be called the liberal enlightenment project. That project tended to elevate universal principles of individual rights and common humanity over ethnic, racial, religious, national, or tribal differences. It looked to a growing economic interdependence to create common interests across boundaries and the establishment of international institutions to smooth differences and facilitate cooperation among nations. Instead, the past decade has seen the rise of tribalism and nationalism; an increasing focus on the “other” in all societies; and a loss of confidence in government, in the capitalist system, and in democracy. We have been witnessing something like the opposite of the “end of history” but have returned to history with a vengeance, rediscovering all the darker aspects of the human soul. That includes, for many, the perennial human yearning for a strong leader to provide firm guidance in a time of seeming breakdown and incoherence.

by Robert Kagan, Brookings Institution |  Read more:
Image: Dr. Strangelove

The Long March From China to the Ivies

As the daughter of a senior colonel in China’s People’s Liberation Army, Ren Futong has lived all 17 years of her life in a high-walled military compound in northern Beijing. No foreigners are allowed inside the gates; the vast encampment, with its own bank, grocery store and laundromat, is patrolled by armed guards and goose-stepping soldiers.

Growing up in this enclave, Ren – also known as Monica, the English name she has adopted – imbibed the lessons of conformity and obedience, loyalty and patriotism, in their purest form. At her school, independent thought that deviated from the reams of right answers the students needed to memorise for the next exam was suppressed. The purpose of it all, Monica told me, was “to make everybody the same”.

For most of her childhood, Monica did as she was expected to. She gave up painting and calligraphy, and rose to the top of her class. Praised as a “study god”, she aced the national high-school entrance exam, but inside she was beginning to rebel. The agony and monotony of studying for that test made her dread the prospect of three more years cramming for the gaokao, the pressure-packed national exam whose result – a single number – is the sole criterion for admissions into Chinese universities.

One spring evening two years ago, Monica, then 15, came home to the compound and made what, for an acquiescent military daughter, was a startling pronouncement. “I told my parents that I was tired of preparing for tests like a machine,” she recalls. “I wanted to go to university in America.” She had hinted at this desire before, talking once over dinner about the freedom offered by an American liberal-arts education, but her parents had dismissed it as idle chatter. This time, they could see that she was dead serious. “My parents were kinda shocked,” she says. “They remained silent for a long period.”

Several days passed before they broke their silence. Her father, a taciturn career officer educated at a military academy, told her that “it would be much easier if you stayed in China where your future is guaranteed.” Her mother, an IT engineer, said Monica would very likely get into China’s most prestigious institution, Peking University, a training ground for the country’s future leaders. “Why give that up?” she asked. “We know the system here, but we know nothing about America, so we can’t help you there. You’d be totally on your own.” Then, after cycling through all the counter-arguments, her mother finally said: “If your heart is really set on going to the US, we will support your decision.”

The Ren family was taking a considerable risk. If Monica, their only child, wanted to study abroad, she would have to abandon the gaokao track, the only route available to universities within China, to have time to prepare for a completely different set of standardised tests and a confounding university application process. If she changed her mind – or, worse, failed to make the transition – she could not resume her studies within the Chinese system. And if that happened, she would miss the chance of going to an elite university and, therefore, of getting a top job within the system. For the Rens, this was the point of no return.

It is one of China’s curious contradictions that, even as the government tries to eradicate foreign influences from the country’s universities, the flood of Chinese students leaving for the West continues to rise. Over the past decade, the number of mainland Chinese students enrolled in American colleges and universities has nearly quintupled, from 62,523 in 2005 to 304,040 last year, according to the Institute of International Education. Many of these students are the sons and daughters of China’s rising elite, establishment families who can afford tuition fees of $60,000 a year for America’s top universities – and the tens of thousands of dollars needed to prepare for the transition. Even the daughter of Xi Jinping, China’s president and the man driving the campaign against foreign ideas, recently studied – under a pseudonym – at Harvard University.

Among Western educators, the Chinese system is famous for producing an elite corps of high-school students who regularly finish at the top of global test rankings, far ahead of their American and British counterparts. Yet so many Chinese families are now opting out of this system that selling education to Chinese students has become a profitable business for the West. These students now account for nearly a third of all foreign students in America, contributing $9.8 billion a year to the United States’ economy. In Britain, too, Chinese students top the international lists. And the outflow shows no sign of subsiding: according to a recent Hurun Report, an annual survey of China’s elite, 80% of the country’s wealthy families plan to send their children abroad for education.

Not every Chinese student is driven, as Monica is, by the desire to escape the grind of the gaokao and get a more liberal education. For many Chinese families, sending a child to a Western university is a way of signalling status – yet “another luxury brand purchase,” as Jiang Xueqin, an educational consultant, puts it. For students faring poorly in the gaokao system, moreover, foreign universities offer an escape valve, and a way to gain an edge in the increasingly competitive job and marriage market back home. And for wealthy families seeking a safe haven for their assets – by one estimate more than $1 trillion in capital left China in 2015 – a foreign education for a child can serve as a first step towards capital flight, foreign investment, even eventual emigration.

by Brook Larmer, 1843 |  Read more:
Image: James Wasserman

Living the High Life


Image: The Interlace, Singapore, OMA/Ole Scheeren (2007-13)

Doomsday Prep For The Super-Rich

[ed. Having just experienced a mini-disaster this last weekend, I'd suggest a few basic precautions that anyone can take (other than maintaining a fully fueled helicopter): a case of bottled water, various canned goods (which don't require heating, maybe some hard crackers for a little starch), flashlight and batteries, solar phone charger, lighter, bag of charcoal (to cook whatever food you have left before it goes bad, which happens quicker than you might think), and a supply of cash.]

Steve Huffman, the thirty-three-year-old co-founder and C.E.O. of Reddit, which is valued at six hundred million dollars, was nearsighted until November, 2015, when he arranged to have laser eye surgery. He underwent the procedure not for the sake of convenience or appearance but, rather, for a reason he doesn’t usually talk much about: he hopes that it will improve his odds of surviving a disaster, whether natural or man-made. “If the world ends—and not even if the world ends, but if we have trouble—getting contacts or glasses is going to be a huge pain in the ass,” he told me recently. “Without them, I’m fucked.”

Huffman, who lives in San Francisco, has large blue eyes, thick, sandy hair, and an air of restless curiosity; at the University of Virginia, he was a competitive ballroom dancer, who hacked his roommate’s Web site as a prank. He is less focussed on a specific threat—a quake on the San Andreas, a pandemic, a dirty bomb—than he is on the aftermath, “the temporary collapse of our government and structures,” as he puts it. “I own a couple of motorcycles. I have a bunch of guns and ammo. Food. I figure that, with that, I can hole up in my house for some amount of time.”

Survivalism, the practice of preparing for a crackup of civilization, tends to evoke a certain picture: the woodsman in the tinfoil hat, the hysteric with the hoard of beans, the religious doomsayer. But in recent years survivalism has expanded to more affluent quarters, taking root in Silicon Valley and New York City, among technology executives, hedge-fund managers, and others in their economic cohort.

Last spring, as the Presidential campaign exposed increasingly toxic divisions in America, Antonio García Martínez, a forty-year-old former Facebook product manager living in San Francisco, bought five wooded acres on an island in the Pacific Northwest and brought in generators, solar panels, and thousands of rounds of ammunition. “When society loses a healthy founding myth, it descends into chaos,” he told me. The author of “Chaos Monkeys,” an acerbic Silicon Valley memoir, García Martínez wanted a refuge that would be far from cities but not entirely isolated. “All these dudes think that one guy alone could somehow withstand the roving mob,” he said. “No, you’re going to need to form a local militia. You just need so many things to actually ride out the apocalypse.” Once he started telling peers in the Bay Area about his “little island project,” they came “out of the woodwork” to describe their own preparations, he said. “I think people who are particularly attuned to the levers by which society actually works understand that we are skating on really thin cultural ice right now.”

In private Facebook groups, wealthy survivalists swap tips on gas masks, bunkers, and locations safe from the effects of climate change. One member, the head of an investment firm, told me, “I keep a helicopter gassed up all the time, and I have an underground bunker with an air-filtration system.” He said that his preparations probably put him at the “extreme” end among his peers. But he added, “A lot of my friends do the guns and the motorcycles and the gold coins. That’s not too rare anymore.” (...)

How did a preoccupation with the apocalypse come to flourish in Silicon Valley, a place known, to the point of cliché, for unstinting confidence in its ability to change the world for the better?

Those impulses are not as contradictory as they seem. Technology rewards the ability to imagine wildly different futures, Roy Bahat, the head of Bloomberg Beta, a San Francisco-based venture-capital firm, told me. “When you do that, it’s pretty common that you take things ad infinitum, and that leads you to utopias and dystopias,” he said. It can inspire radical optimism—such as the cryonics movement, which calls for freezing bodies at death in the hope that science will one day revive them—or bleak scenarios. Tim Chang, the venture capitalist who keeps his bags packed, told me, “My current state of mind is oscillating between optimism and sheer terror.”

In recent years, survivalism has been edging deeper into mainstream culture. In 2012, National Geographic Channel launched “Doomsday Preppers,” a reality show featuring a series of Americans bracing for what they called S.H.T.F. (when the “shit hits the fan”). The première drew more than four million viewers, and, by the end of the first season, it was the most popular show in the channel’s history. A survey commissioned by National Geographic found that forty per cent of Americans believed that stocking up on supplies or building a bomb shelter was a wiser investment than a 401(k). Online, the prepper discussions run from folksy (“A Mom’s Guide to Preparing for Civil Unrest”) to grim (“How to Eat a Pine Tree to Survive”). (...)

How many wealthy Americans are really making preparations for a catastrophe? It’s hard to know exactly; a lot of people don’t like to talk about it. (“Anonymity is priceless,” one hedge-fund manager told me, declining an interview.) Sometimes the topic emerges in unexpected ways. Reid Hoffman, the co-founder of LinkedIn and a prominent investor, recalls telling a friend that he was thinking of visiting New Zealand. “Oh, are you going to get apocalypse insurance?” the friend asked. “I’m, like, Huh?” Hoffman told me. New Zealand, he discovered, is a favored refuge in the event of a cataclysm. Hoffman said, “Saying you’re ‘buying a house in New Zealand’ is kind of a wink, wink, say no more. Once you’ve done the Masonic handshake, they’ll be, like, ‘Oh, you know, I have a broker who sells old ICBM silos, and they’re nuclear-hardened, and they kind of look like they would be interesting to live in.’ ”

I asked Hoffman to estimate what share of fellow Silicon Valley billionaires have acquired some level of “apocalypse insurance,” in the form of a hideaway in the U.S. or abroad. “I would guess fifty-plus per cent,” he said, “but that’s parallel with the decision to buy a vacation home. Human motivation is complex, and I think people can say, ‘I now have a safety blanket for this thing that scares me.’ ” The fears vary, but many worry that, as artificial intelligence takes away a growing share of jobs, there will be a backlash against Silicon Valley, America’s second-highest concentration of wealth. (Southwestern Connecticut is first.) “I’ve heard this theme from a bunch of people,” Hoffman said. “Is the country going to turn against the wealthy? Is it going to turn against technological innovation? Is it going to turn into civil disorder?”

by Evan Osnos, New Yorker |  Read more:
Image: Dan Winters

Wednesday, January 25, 2017

Hillsong UNITED


[ed. See also: Touch The Sky]

What’s up with Firefox?

Until about five years ago, techies and others who wanted a speedier, extensible, more privacy-oriented web browser on their desktops often immediately downloaded Mozilla's Firefox to use instead of Internet Explorer on Windows or Safari on the Mac.

But those days seem long ago. Firefox is hardly discussed today, and its usage has cratered from a high of over 30 percent of the desktop browser market in 2010 to about 12 percent today, according to Mozilla, citing stats from NetMarketShare. (Various other analytics firms put the share as low as 10 percent or as high as 15 percent.) And Firefox’s share on mobile devices is even worse, at under 1 percent, according to the same firm.

Today, the go-to-browser is Google’s Chrome, which has over a 50 percent share on both desktop and mobile, according to NetMarketShare.

Mozilla Wakes Up

After years of neglecting Firefox, misreading mobile users, and putting most of its chips on a failed phone project, Mozilla says it is working hard to get Firefox off the mat.

“In many ways, we went through a time that you don’t get to survive,” says Mark Mayo, senior vice president for Firefox and a member of Mozilla’s decision-making steering committee. “Somehow we’re not dead… and it feels like we’re picking up speed and figuring out what to do.”

He admits that Firefox has fallen behind Chrome, Microsoft’s Edge, and Apple’s Safari technically, but says the company is executing with total focus on a plan to reverse that. “For several years, we have not been spending the effort we would normally spend on the flagship product,” Mayo concedes. “Firefox didn’t get better along with the competition.”

Why Firefox is Different

Now, he says, the company has embraced the proposition that “it kind of makes no sense to be us and not have the best browser.” That’s because for Mozilla, which is controlled by a foundation of the same name, Firefox is its main product. The two names are inseparable in many people’s minds. And an open, vibrant web — as opposed to a world of apps and social media and search controlled by a few companies — is its main philosophical concern.

That last bit may sound like idealistic claptrap, but it’s always been core to Mozilla’s mission. Mayo says he fears that big companies like Google and Apple don’t care whether roaming the open internet is subsumed by launching apps or by the act of searching. But, he says, Firefox does.

“Everyone else builds a browser for defensive reasons,” says Mayo. “We build one because we love browsers.” (...)

The Task Ahead

But building Firefox into a real contender will take a lot more work, and Mayo concedes that even parts of the plan won’t be visible to users until later this year. Still, Mozilla claims that it “aims to pass Chrome on key performance measures that matter by end of year.”

To do that, the company is betting on something called Project Quantum, a new under-the-hood browsing engine that will replace big chunks of Mozilla’s ancient Gecko engine. In an October blog post by David Bryant, head of platform engineering, the company claimed this:
“We are striving for performance gains from Quantum that will be so noticeable that your entire web experience will feel different. Pages will load faster, and scrolling will be silky smooth. Animations and interactive apps will respond instantly, and be able to handle more intensive content while holding consistent frame rates. And the content most important to you will automatically get the highest priority, focusing processing power where you need it the most.”

Another cornerstone for the new Firefox is a project called the Context Graph that aims to use an enhanced browser history to replace navigational search. The idea is to use differential privacy — the same kind of privacy-respecting machine learning that Apple uses — to suggest places on the web to go for particular needs, rather than getting navigational answers from search.
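
The article doesn’t describe Mozilla’s mechanism in any detail. As a hedged illustration of the general idea behind differential privacy — not Mozilla’s or Apple’s actual implementation, and with all names hypothetical — here is a minimal randomized-response sketch in Python: each individual report is noisy enough to be deniable, yet aggregate frequencies can still be estimated, which is the property that lets a browser learn from histories without learning any one person’s history.

import random

def randomized_response(visited: bool, p_truth: float = 0.75) -> bool:
    # Report the true answer with probability p_truth; otherwise
    # report a fair coin flip. No single report proves a visit.
    if random.random() < p_truth:
        return visited
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    # Invert the noise: E[reported] = p_truth * true + (1 - p_truth) * 0.5
    reported = sum(reports) / len(reports)
    return (reported - (1 - p_truth) * 0.5) / p_truth

# Simulate 10,000 users, 30% of whom really visited a given site.
reports = [randomized_response(random.random() < 0.30) for _ in range(10_000)]
print(f"estimated visit rate: {estimate_true_rate(reports):.3f}")  # close to 0.30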

Mayo calls this “navigation by browser, not Google” and declares: “Navigation in the browser has been stagnant for a decade and we’re not going to stand for that.”

by Walt Mossberg, Recode |  Read more:
Image: uncredited

Everyday Authoritarianism is Boring and Tolerable

Malaysia is a country that I know well, and whose political system I have studied closely for fifteen years. It is also a country whose political liberalization I have long awaited. Malaysia has a multiparty parliamentary system of government, but the same coalition of parties has been in power for six decades, and has never lost a general election. The government retains—in a holdover from the British colonial period—the legal authority to detain people without trial if it so desires. The print and broadcast media are fairly compliant, mostly owned by the corporate allies of political elites, and rarely criticize the government.

Living in Malaysia and working on Malaysian politics has taught me something important about authoritarianism from my perspective as an American. That is, the mental image of authoritarian rule in the minds of most Americans is completely unrealistic, and dangerously so.

Even though Malaysia is a perfectly wonderful place to visit, and an emerging market economy grappling with the same “middle income trap” issues that characterize most emerging market economies, scholars of comparative politics do not consider it to be an electoral democracy. Freedom House considers Malaysia “Partly Free.” The Democracy-Dictatorship dataset codes Malaysia as a civilian dictatorship, as do Boix-Miller-Rosato. Levitsky and Way consider Malaysia to be a classic case of competitive authoritarianism. There are quite a few other countries like Malaysia: Mexico and Taiwan for most of the 20th century, Russia, Turkey, Singapore, Cameroon, Tanzania, and others.

The mental image that most Americans harbor of what actual authoritarianism looks like is fantastical and cartoonish. This vision of authoritarian rule has jackbooted thugs, all-powerful elites acting with impunity, poverty and desperate hardship for everyone else, strict controls on political expression and mobilization, and a dictator who spends his time ordering the murder or disappearance of his opponents using an effective and wholly compliant security apparatus. This image of authoritarianism comes from the popular media (dictators in movies are never constrained by anything but open insurrection), from American mythmaking about the Founding (and the Second World War and the Cold War), and from a kind of “imaginary othering” in which the opposite of democracy is the absence of everything that characterizes the one democracy that one knows.

Still, that fantastical image of authoritarianism is entirely misleading as a description of modern authoritarian rule and life under it. It is a description, to some approximation, of totalitarianism. Carl Friedrich is the best on totalitarianism (see PDF), and Hannah Arendt of course on its emergence (PDF). But Arendt and Friedrich were very clear that totalitarianism is exceptional as a form of politics.

The reality is that everyday life under the kinds of authoritarianism that exist today is very familiar to most Americans. You go to work, you eat your lunch, you go home to your family.* There are schools and businesses, and some people “make it” through hard work and luck. Most people worry about making sure their kids get into good schools. The military is in the barracks, and the police mostly investigate crimes and solve cases. There is political dissent, if rarely open protest, but in general people are free to complain to one another. There are even elections. This is Malaysia, and many countries like it.

Everyday life in the modern authoritarian regime is, in this sense, boring and tolerable. It is not outrageous. Most critics, even vocal ones, are not going to be murdered like Anna Politkovskaya; they are going to be frustrated. Most not-very-vocal critics will live their lives completely unmolested by the security forces. They will enjoy it when the trains run on time, blame the government when they do not, gripe at their taxes, and save for vacation. Elections, when they happen, will serve the “anesthetic function” that Philippe Schmitter attributed to elections in Portugal under Salazar in the greatly underappreciated 1978 volume Elections without Choice.

Life under authoritarian rule in such situations looks a lot like life in a democracy. As Malaysia’s longtime Prime Minister Mahathir Mohamad used to say, “if you don’t like me, defeat me in my district.”

This observation has two particular consequences. One, for asking if “the people” will tolerate authoritarian rule. The premise upon which this question is based is that authoritarianism is intolerable generally. It turns out that most people express democratic values, but living in a complicated world in which people care more about more things than just their form of government, it is easy to see that given an orderly society and a functioning economy, democratic politics may become a low priority.** The answer to the question “will ‘the people’ tolerate authoritarian rule?” is yes, absolutely.

Second, for knowing if you are living in an authoritarian regime versus a democratic one. Most Americans conceptualize a hypothetical end of American democracy in Apocalyptic terms. But actually, you usually learn that you are no longer living in a democracy not because The Government Is Taking Away Your Rights, or passing laws that you oppose, or because there is a coup or a quisling. You know that you are no longer living in a democracy because the elections in which you are participating no longer can yield political change.

It is possible to read what I’ve written here as a defense of authoritarianism, or as a dismissal of democracy. But my message is the exact opposite. The fantasy of authoritarianism distracts Americans from the mundane ways in which the mechanisms of political competition and checks and balances can erode. Democracy has not survived because the alternatives are acutely horrible, and if it ends, it will not end in a bang. It is more likely that democracy ends, with a whimper, when the case for supporting it—the case, that is, for everyday democracy—is no longer compelling.

by Tom Pepinsky, Associate Professor of Government, Cornell University | Read more:

Saturday, January 21, 2017

The Trouble with Quantum Mechanics

The development of quantum mechanics in the first decades of the twentieth century came as a shock to many physicists. Today, despite the great successes of quantum mechanics, arguments continue about its meaning, and its future.

1.

The first shock came as a challenge to the clear categories to which physicists by 1900 had become accustomed. There were particles—atoms, and then electrons and atomic nuclei—and there were fields—conditions of space that pervade regions in which electric, magnetic, and gravitational forces are exerted. Light waves were clearly recognized as self-sustaining oscillations of electric and magnetic fields. But in order to understand the light emitted by heated bodies, Albert Einstein in 1905 found it necessary to describe light waves as streams of massless particles, later called photons.

Then in the 1920s, according to theories of Louis de Broglie and Erwin Schrödinger, it appeared that electrons, which had always been recognized as particles, under some circumstances behaved as waves. In order to account for the energies of the stable states of atoms, physicists had to give up the notion that electrons in atoms are little Newtonian planets in orbit around the atomic nucleus. Electrons in atoms are better described as waves, fitting around the nucleus like sound waves fitting into an organ pipe.1 The world’s categories had become all muddled.

Worse yet, the electron waves are not waves of electronic matter, in the way that ocean waves are waves of water. Rather, as Max Born came to realize, the electron waves are waves of probability. That is, when a free electron collides with an atom, we cannot in principle say in what direction it will bounce off. The electron wave, after encountering the atom, spreads out in all directions, like an ocean wave after striking a reef. As Born recognized, this does not mean that the electron itself spreads out. Instead, the undivided electron goes in some one direction, but not a precisely predictable direction. It is more likely to go in a direction where the wave is more intense, but any direction is possible.

Probability was not unfamiliar to the physicists of the 1920s, but it had generally been thought to reflect an imperfect knowledge of whatever was under study, not an indeterminism in the underlying physical laws. Newton’s theories of motion and gravitation had set the standard of deterministic laws. When we have reasonably precise knowledge of the location and velocity of each body in the solar system at a given moment, Newton’s laws tell us with good accuracy where they will all be for a long time in the future. Probability enters Newtonian physics only when our knowledge is imperfect, as for example when we do not have precise knowledge of how a pair of dice is thrown. But with the new quantum mechanics, the moment-to-moment determinism of the laws of physics themselves seemed to be lost.

All very strange. In a 1926 letter to Born, Einstein complained:
Quantum mechanics is very impressive. But an inner voice tells me that it is not yet the real thing. The theory produces a good deal but hardly brings us closer to the secret of the Old One. I am at all events convinced that He does not play dice.2

As late as 1964, in his Messenger lectures at Cornell, Richard Feynman lamented, “I think I can safely say that no one understands quantum mechanics.”3 With quantum mechanics, the break with the past was so sharp that all earlier physical theories became known as “classical.”

The weirdness of quantum mechanics did not matter for most purposes. Physicists learned how to use it to do increasingly precise calculations of the energy levels of atoms, and of the probabilities that particles will scatter in one direction or another when they collide. Lawrence Krauss has labeled the quantum mechanical calculation of one effect in the spectrum of hydrogen “the best, most accurate prediction in all of science.”4 Beyond atomic physics, early applications of quantum mechanics listed by the physicist Gino Segrè included the binding of atoms in molecules, the radioactive decay of atomic nuclei, electrical conduction, magnetism, and electromagnetic radiation.5 Later applications spanned theories of semiconductivity and superconductivity, white dwarf stars and neutron stars, nuclear forces, and elementary particles. Even the most adventurous modern speculations, such as string theory, are based on the principles of quantum mechanics.

Many physicists came to think that the reaction of Einstein and Feynman and others to the unfamiliar aspects of quantum mechanics had been overblown. This used to be my view. After all, Newton’s theories too had been unpalatable to many of his contemporaries. Newton had introduced what his critics saw as an occult force, gravity, which was unrelated to any sort of tangible pushing and pulling, and which could not be explained on the basis of philosophy or pure mathematics. Also, his theories had renounced a chief aim of Ptolemy and Kepler, to calculate the sizes of planetary orbits from first principles. But in the end the opposition to Newtonianism faded away. Newton and his followers succeeded in accounting not only for the motions of planets and falling apples, but also for the movements of comets and moons and the shape of the earth and the change in direction of its axis of rotation. By the end of the eighteenth century this success had established Newton’s theories of motion and gravitation as correct, or at least as a marvelously accurate approximation. Evidently it is a mistake to demand too strictly that new physical theories should fit some preconceived philosophical standard.

In quantum mechanics the state of a system is not described by giving the position and velocity of every particle and the values and rates of change of various fields, as in classical physics. Instead, the state of any system at any moment is described by a wave function, essentially a list of numbers, one number for every possible configuration of the system.6 If the system is a single particle, then there is a number for every possible position in space that the particle may occupy. This is something like the description of a sound wave in classical physics, except that for a sound wave a number for each position in space gives the pressure of the air at that point, while for a particle in quantum mechanics the wave function’s number for a given position reflects the probability that the particle is at that position. What is so terrible about that? Certainly, it was a tragic mistake for Einstein and Schrödinger to step away from using quantum mechanics, isolating themselves in their later lives from the exciting progress made by others.

2.

Even so, I’m not as sure as I once was about the future of quantum mechanics. It is a bad sign that those physicists today who are most comfortable with quantum mechanics do not agree with one another about what it all means. The dispute arises chiefly regarding the nature of measurement in quantum mechanics. This issue can be illustrated by considering a simple example, measurement of the spin of an electron. (A particle’s spin in any direction is a measure of the amount of rotation of matter around a line pointing in that direction.)

All theories agree, and experiment confirms, that when one measures the amount of spin of an electron in any arbitrarily chosen direction there are only two possible results. One possible result will be equal to a positive number, a universal constant of nature. (This is the constant that Max Planck originally introduced in his 1900 theory of heat radiation, denoted h, divided by 4π.) The other possible result is its opposite, the negative of the first. These positive or negative values of the spin correspond to an electron that is spinning either clockwise or counter-clockwise in the chosen direction.

But it is only when a measurement is made that these are the sole two possibilities. An electron spin that has not been measured is like a musical chord, formed from a superposition of two notes that correspond to positive or negative spins, each note with its own amplitude. Just as a chord creates a sound distinct from each of its constituent notes, the state of an electron spin that has not yet been measured is a superposition of the two possible states of definite spin, the superposition differing qualitatively from either state. In this musical analogy, the act of measuring the spin somehow shifts all the intensity of the chord to one of the notes, which we then hear on its own.

This can be put in terms of the wave function. If we disregard everything about an electron but its spin, there is not much that is wavelike about its wave function. It is just a pair of numbers, one number for each sign of the spin in some chosen direction, analogous to the amplitudes of each of the two notes in a chord.7 The wave function of an electron whose spin has not been measured generally has nonzero values for spins of both signs.
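
In symbols — a minimal sketch in notation of my own, not anything written out in the article — the spin wave function is just the pair of amplitudes in a superposition:

\[
\psi = c_{+}\,\chi_{+} + c_{-}\,\chi_{-}, \qquad |c_{+}|^{2} + |c_{-}|^{2} = 1,
\]

where \(\chi_{+}\) and \(\chi_{-}\) are the states of definite positive and negative spin in the chosen direction (the two “notes”), and the complex numbers \(c_{+}\) and \(c_{-}\) are the wave function’s two entries, playing the role of the notes’ amplitudes in the chord analogy.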

There is a rule of quantum mechanics, known as the Born rule, that tells us how to use the wave function to calculate the probabilities of getting various possible results in experiments. For example, the Born rule tells us that the probabilities of finding either a positive or a negative result when the spin in some chosen direction is measured are proportional to the squares of the numbers in the wave function for those two states of the spin.8
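
Continuing the sketch above (again, my notation and made-up amplitudes, not the article’s), for normalized amplitudes the Born rule reads:

\[
P(+) = |c_{+}|^{2}, \qquad P(-) = |c_{-}|^{2}.
\]

For example, a spin prepared with \(c_{+} = 1/\sqrt{3}\) and \(c_{-} = \sqrt{2/3}\) yields a positive result in one third of measurements and a negative result in the other two thirds.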

The introduction of probability into the principles of physics was disturbing to past physicists, but the trouble with quantum mechanics is not that it involves probabilities. We can live with that. The trouble is that in quantum mechanics the way that wave functions change with time is governed by an equation, the Schrödinger equation, that does not involve probabilities. It is just as deterministic as Newton’s equations of motion and gravitation. That is, given the wave function at any moment, the Schrödinger equation will tell you precisely what the wave function will be at any future time. There is not even the possibility of chaos, the extreme sensitivity to initial conditions that is possible in Newtonian mechanics. So if we regard the whole process of measurement as being governed by the equations of quantum mechanics, and these equations are perfectly deterministic, how do probabilities get into quantum mechanics?
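
The equation itself, in its standard form (not spelled out in the article), makes the determinism plain:

\[
i\hbar\,\frac{\partial \psi(t)}{\partial t} = \hat{H}\,\psi(t),
\]

where \(\hbar = h/2\pi\) and \(\hat{H}\) is the Hamiltonian, the operator encoding the system’s energy. Given \(\psi\) at one moment, this fixes \(\psi\) at every later moment; no probabilistic term appears anywhere in it.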

One common answer is that, in a measurement, the spin (or whatever else is measured) is put in an interaction with a macroscopic environment that jitters in an unpredictable way. For example, the environment might be the shower of photons in a beam of light that is used to observe the system, as unpredictable in practice as a shower of raindrops. Such an environment causes the superposition of different states in the wave function to break down, leading to an unpredictable result of the measurement. (This is called decoherence.) It is as if a noisy background somehow unpredictably left only one of the notes of a chord audible. But this begs the question. If the deterministic Schrödinger equation governs the changes through time not only of the spin but also of the measuring apparatus and the physicist using it, then the results of measurement should not in principle be unpredictable. So we still have to ask, how do probabilities get into quantum mechanics?

One response to this puzzle was given in the 1920s by Niels Bohr, in what came to be called the Copenhagen interpretation of quantum mechanics. According to Bohr, in a measurement the state of a system such as a spin collapses to one result or another in a way that cannot itself be described by quantum mechanics, and is truly unpredictable. This answer is now widely felt to be unacceptable. There seems no way to locate the boundary between the realms in which, according to Bohr, quantum mechanics does or does not apply. As it happens, I was a graduate student at Bohr’s institute in Copenhagen, but he was very great and I was very young, and I never had a chance to ask him about this.

Today there are two widely followed approaches to quantum mechanics, the “realist” and “instrumentalist” approaches, which view the origin of probability in measurement in two very different ways.9 For reasons I will explain, neither approach seems to me quite satisfactory.10

by Steven Weinberg, NYRB | Read more:
Image: Eric J. Heller

John James Audubon, Louisiana Heron (1834)

Showering with Spiders

One cold morning last autumn, with the shower’s hot, steamy water pleasantly pelting my neck and shoulders, I glanced up and noticed a spider hanging in the corner above my head—a quivering, spindly, brown spider. I’m not a spider aficionado, but I do know about poisonous spiders in our area of the Pacific Northwest: the hobo spider and the black widow. My shower companion was neither. A daddy longlegs, I deduced, Pholcus phalangioides to be precise, minding its own business near the showerhead.

Daddy longlegs spiders build messy webs with no particular pattern to them, and they eat insects, mites, and other spiders, including the poisonous hobo (and, I’m sorry to say, sometimes each other). They like ceiling corners and warmer spaces, so the beige fiberglass tub/shower combination in our twenty-plus-year-old home made a comfortable spot for spider settlement.

I was in a hurry, so I finished my shower and thought no more about the long-legged wall hugger. The next morning, as I shoved back the shower curtain and stepped into the tub, there it was again. Or still. How long had this creature lived in my bathroom without my noticing? Maybe for months, possibly longer. How long is that in spider time? With a life-span of two or three years, this arachnid may have inhabited the space for a third of its life or more. In a way, the spider had greater claim to the shower than I had. In terms of the percentages of our lives spent in the place, I was the newcomer, and if I cleared the web, I’d be the one driving out the longtime inhabitant. Besides, I could shower quite comfortably with or without him. Or her.

And so began my conscious choice to shower with spiders. It’s a small thing, one might say a silly and meaningless thing. We spend maybe ten minutes together each morning, all told, more time than some busy working couples spend in conversation each day. I have found that I’m strangely appreciative of our benign interspecies companionship during my morning routine. I’m required to do nothing special, except be quietly mindful of another being inhabiting my space in an unfathomable way. If I notice my itsy-bitsy neighbor slowly lowering itself from the ceiling toward the shower stall while I’m there, I’ll shake my hand to splash a bit of water as a warning. The spider, being mindful too, will vibrate for a moment, and then either stop and crouch with its belly close to the wall, or quick-step back up toward the ceiling. We have an understanding, the spider and I: do no harm.

by Victoria Doerper, Orion |  Read more:
Image: James Wardell
[ed. I stomped on a daddy longlegs this morning while taking a shower even though I know they're harmless. It was invading my space. Won't do that again (as long as there's some mutual accommodation).]

‘A Cat in Hell’s Chance’ – Why We’re Losing the Battle to Keep Global Warming Below 2C

It all seemed so simple in 2008. All we had was financial collapse, a cripplingly high oil price and global crop failures due to extreme weather events. In addition, my climate scientist colleague Dr Viki Johnson and I worked out that we had about 100 months before it would no longer be “likely” that global average surface temperatures could be held below a 2C rise, compared with pre-industrial times.
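
The article doesn’t show the arithmetic behind a deadline of this kind, but the general shape of such estimates is a remaining carbon budget divided by the rate at which it is being spent (the numbers below are illustrative round figures of my own, not the authors’ actual inputs):

\[
t \;\approx\; \frac{\text{remaining budget}}{\text{annual emissions}} \times 12 \ \text{months};
\]

for instance, an assumed budget of 330 GtCO2 against emissions of roughly 40 GtCO2 a year gives about 8.25 years, or roughly 99 months.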

What’s so special about 2C? The simple answer is that it is a target that could be politically agreed on the international stage. It was first suggested in 1975 by the environmental economist William Nordhaus as an upper threshold beyond which we would arrive at a climate unrecognisable to humans. In 1990, the Stockholm Environment Institute recommended 2C as the maximum that should be tolerated, but noted: “Temperature increases beyond 1C may elicit rapid, unpredictable and non-linear responses that could lead to extensive ecosystem damage.”

To date, temperatures have risen by almost 1C since 1880. The effects of this warming are already being observed in melting ice, ocean levels rising, worse heat waves and other extreme weather events. There are negative impacts on farming, the disruption of plant and animal species on land and in the sea, extinctions, the disturbance of water supplies and food production and increased vulnerability, especially among people in poverty in low-income countries. But effects are global. So 2C was never seen as necessarily safe, just a guardrail between dangerous and very dangerous change.

To get a sense of what a 2C shift can do, just look in Earth’s rear-view mirror. When the planet was 2C colder than during the industrial revolution, we were in the grip of an ice age and a mile-thick North American ice sheet reached as far south as New York. The same warming again will intensify and accelerate human-driven changes already under way and has been described by James Hansen, one of the first scientists to call global attention to climate change, as a “prescription for long-term disaster”, including an ice-free Arctic. (...)

Is it still likely that we will stay below even 2C? In the 100 months since August 2008, I have been writing a climate-change diary for the Guardian to raise questions and monitor progress, or the lack of it, on climate action. To see how well we have fared, I asked a number of leading climate scientists and analysts for their views. The responses were as bracing as a bath in a pool of glacial meltwater.

by Andrew Simms, The Guardian |  Read more:
Image: NASA/EPA

Humanism, Science, and the Radical Expansion of the Possible

Humanism was the particular glory of the Renaissance. The recovery, translation, and dissemination of the literatures of antiquity created a new excitement, displaying so vividly the accomplishments and therefore the capacities of humankind, with consequences for civilization that are great beyond reckoning.

The disciplines that came with this awakening, the mastery of classical languages, the reverent attention to pagan poets and philosophers, the study of ancient history, and the adaptation of ancient forms to modern purposes, all bore the mark of their origins yet served as the robust foundation of education and culture for centuries, until the fairly recent past. In muted, expanded, and adapted forms, these Renaissance passions live on among us still in the study of the humanities, which, we are told, are now diminished and threatened. Their utility is in question, it seems, despite their having been at the center of learning throughout the period of the spectacular material and intellectual flourishing of Western civilization. Now we are less interested in equipping and refining thought, more interested in creating and mastering technologies that will yield measurable enhancements of material well-being—for those who create and master them, at least. Now we are less interested in the exploration of the glorious mind, more engrossed in the drama of staying ahead of whatever it is we think is pursuing us. Or perhaps we are just bent on evading the specter of entropy. In any case, the spirit of the times is one of joyless urgency, many of us preparing ourselves and our children to be means to inscrutable ends that are utterly not our own. In such an environment, the humanities do seem to have little place. They are poor preparation for economic servitude. This spirit is not the consequence but the cause of our present state of affairs. We have as good grounds for exulting in human brilliance as any generation that has ever lived.

The antidote to our gloom is to be found in contemporary science. This may seem an improbable stance from which to defend the humanities, and I do not wish to undervalue contemporary art or literature or music or philosophy. But it is difficult to recognize the genius of a period until it has passed. Milton, Bach, Mozart all suffered long periods of eclipse, beginning before their lives had ended. Our politics may appear in the light of history to have been filled with triumphs of statecraft, unlikely as this seems to us now. Science, on the other hand, can assert credible achievements and insights, however tentative, in present time. The last century and the beginning of this one have without question transformed the understanding of Being itself. “Understanding” is not quite the right word, since this mysterious old category, Being, fundamental to all experience past, present, and to come, is by no means understood. However, the terms in which understanding may, at the moment, be attempted have changed radically, and this in itself is potent information. The phenomenon called quantum entanglement, relatively old as theory and thoroughly demonstrated as fact, raises fundamental questions about time and space, and therefore about causality.

Particles that are “entangled,” however distant from one another, undergo the same changes simultaneously. This fact challenges our most deeply embedded habits of thought. To try to imagine any event occurring outside the constraints of locality and sequence is difficult enough. Then there is the problem of conceiving of a universe in which the old rituals of cause and effect seem a gross inefficiency beside the elegance and sleight of hand that operate discreetly beyond the reach of all but the most rarefied scientific inference and observation. However pervasive and robust entanglement is or is not, it implies a cosmos that unfolds or emerges on principles that bear scant analogy to the universe of common sense. It is abetted in this by string theory, which adds seven unexpressed dimensions to our familiar four. And, of course, those four seem suddenly tenuous when the fundamental character of time and space is being called into question. Mathematics, ontology, and metaphysics have become one thing. Einstein’s universe seems mechanistic in comparison. Newton’s, the work of a tinkerer. If Galileo shocked the world by removing the sun from its place, so to speak, then this polyglot army of mathematicians and cosmologists who offer always new grounds for new conceptions of absolute reality should dazzle us all, freeing us at last from the circle of old Urizen’s compass. But we are not free.

There is no art or discipline for which the nature of reality is a matter of indifference, so one ontology or another is always being assumed if not articulated. Great questions may be as open now as they have been since Babylonians began watching the stars, but certain disciplines are still deeply invested in a model of reality that is as simple and narrow as ideological reductionism can make it. I could mention a dominant school of economics with its anthropology. But I will instead consider science of a kind. The study of brain and consciousness, mind and self—associated with so-called neuroscience—asserts a model of mental function as straightforward, causally speaking, as a game of billiards, and plumes itself on just this fact. It is by no means entangled with the sciences that address ontology. The most striking and consequential changes in the second of these, ontology, bring about no change at all in the first, neuroscience, either simultaneous or delayed. The gist of neuroscience is that the adverbs “simply” and “merely” can exorcise the mystifications that have always surrounded the operations of the mind/brain, exposing the machinery that in fact produces emotion, behavior, and all the rest. So while inquiries into the substance of reality reveal further subtleties, idioms of relation that are utterly new to our understanding, neuroscience tells us that the most complex object we know of, the human brain, can be explained sufficiently in terms of the activation of “packets of neurons,” which evolution has provided the organism in service to homeostasis. The amazing complexity of the individual cell is being pored over in other regions of science, while neuroscience persists in declaring the brain, this same complexity vastly compounded, an essentially simple thing. If this could be true, if this most intricate and vital object could be translated into an effective simplicity for which the living world seems to provide no analogy, this indeed would be one of nature’s wonders. (...)

The real assertion being made in all this (neuroscience is remarkable among the sciences for its tendency to bypass hypothesis and even theory and go directly to assertion) is that there is no soul. Only the soul is ever claimed to be nonphysical, therefore immortal, therefore sacred and sanctifying as an aspect of human being. It is the self but stands apart from the self. It suffers injuries of a moral kind, when the self it is and is not lies or steals or murders, but it is untouched by the accidents that maim the self or kill it. Obviously, this intuition—it is much richer and deeper than anything conveyed by the word “belief”—cannot be dispelled by proving the soul’s physicality, from which it is aloof by definition. And on these same grounds, its nonphysicality is no proof of its nonexistence. This might seem a clever evasion of skepticism if the character of the soul were not established in remote antiquity, in many places and cultures, long before such a thing as science was brought to bear on the question. (...)

Is it fair to say that this school of thought is directed against humanism? This seems on its face to be true. The old humanists took the works of the human mind—literature, music, philosophy, art, and languages—as proof of what the mind is and might be. Out of this has come the great aura of brilliance and exceptionalism around our species that neuroscience would dispel. If Shakespeare had undergone an MRI, there is no reason to believe there would be any more evidence of extraordinary brilliance in him than there would be of a self or a soul. He left a formidable body of evidence that he was both brilliant and singular, but it has fallen under the rubric of Renaissance drama and is somehow not germane, perhaps because this places the mind so squarely at the center of the humanities. From the neuroscientific point of view, this only obscures the question. After all, where did our high sense of ourselves come from? From what we have done and what we do. And where is this awareness preserved and enhanced? In the arts and the humane disciplines. I am sure there are any number of neuroscientists who know and love Mozart better than I do, and who find his music uplifting. The inconsistency is for them to explain. (...)

If there is a scientific mode of thought that is crowding out and demoralizing the humanities, it is not research in the biology of the cell or the quest for life on other planets. It is this neo-Darwinism, which claims to cut through the dense miasmas of delusion to what is mere, simple, and real. Since these “miasmas” have been the main work of human consciousness for as long as the mind has left a record of itself, their devaluing is a major work of dehumanization. This is true because that work is the great measure of our distinctiveness as a species. It is what we know about ourselves. It has everything in the world to do with how we think and feel, with what we value or despise or fear, all these things refracted through cultures and again through families and individuals. If the object of neuroscience or neo-Darwinism were to describe an essential human nature, it would surely seek confirmation in history and culture. But these things are endlessly complex, and they are continually open to variation and disruption. So the insistence on an essential simplicity is understandable, if it is not fruitful. If I am correct in seeing neuroscience as essentially neo-Darwinist, then it is affixed to a model of reality that has not gone through any meaningful change in a century, except in the kind of machinery it brings to bear in asserting its worldview. (...)

That said, it might be time to pause and reflect. Holding to the old faith that everything is in principle knowable or comprehensible by us is a little like assuming that every human structure or artifact must be based on yards, feet, and inches. The notion that the universe is constructed, or we are evolved, so that reality must finally answer in every case to the questions we bring to it, is entirely as anthropocentric as the notion that the universe was designed to make us possible. Indeed, the affinity between the two ideas should be acknowledged. While the assumption of the intelligibility of the universe is still useful, it is not appropriately regarded as a statement of doctrine, and should never have been. Science of the kind I criticize tends to assert that everything is explicable, that whatever has not been explained will be explained—and, furthermore, by its methods. Its practitioners have seen to the heart of it all. So mystery is banished—mystery being no more than whatever their methods cannot capture yet. Mystery being also those aspects of reality whose implications are not always factors in their worldview, for example, the human mind, the human self, history, and religion—in other words, the terrain of the humanities. Or of the human.

by Marilynne Robinson, The Nation |  Read more:
Image: Kelly Ruth Winter/ The Nation
[ed. This essay is excerpted from The Givenness of Things, © Marilynne Robinson.]

Friday, January 20, 2017

Brazilian Girls

Get Rich. Save the World. Gut Fish.

Venture capitalist Ross Baird, 32, has red hair and an open face that calls to mind Happy Days-era Ron Howard. He’s one of those preternaturally mature millennials: he already has a developed philosophy, glossy academic credentials, and financial backing from important people for his fund, Village Capital. In high school at Phillips Exeter Academy, the dormitory proctor who set up his e-mail was Mark Zuckerberg. Plus, Baird wants to save the world while getting rich. All very Silicon Valley.

But the rule of Sand Hill Road (that’s shorthand for the Menlo Park, Calif., epicenter of tech VC) is to invest widely in nouvelle concepts, hoping that one will be at least a “ten-bagger” (posting a return 10 times the investment). Baird, however, typically invests in unsexy ideas that he hopes will be three-baggers, often in agriculture, energy, and health care. Venture capitalists fixated on finding the next Snapchat put 85 percent of their $50 billion in funding last year into states that voted for Hillary Clinton, most of it in California, Massachusetts, and New York. Meanwhile, for the past seven years, Baird has been doggedly finding and developing successful businesses in the downtrodden places whose economic distress ultimately helped elect Donald Trump. (...)

Baird is especially excited about Fin Gourmet Foods, a company in Paducah, Ky., that buys invasive Asian carp from local fishermen and turns it into boneless filets for gourmet restaurants and fish paste for Asian supermarkets. Asian carp is best known as the biggest threat to the ecosystem of the Great Lakes; the federal government just earmarked $42 million to combat the species. The youngest fish eat their body weight daily, outcompeting bass for plankton, leaving sport fishermen in fear of economic ruin. Asian carp grow into 70-pounders known to jump as high as 10 feet: There’s a wide selection of videos on YouTube of these leaping monsters terrifying—and occasionally injuring—boaters. And because the fish are full of bones that make them hard to eat without meticulous processing, they fetch a third the wholesale price of catfish.

Despite that, Fin Gourmet forecasts revenue will rise to more than $1.5 million this year from $320,000 in 2016. “They’re growing like crazy, the profit margins are good, and they’re taking something out of the environment that’s bad and turning it into something that people want to pay for,” Baird says. The couple who founded the company draw their workforce from the ranks of “people who need second chances from incarceration, drug courts, domestic violence,” according to the company’s website. One foundation dubbed Fin Gourmet “the future Zappos of fish processing” for its community-minded approach. Boneless filets from Asian carp have started appearing on menus in Louisville and Lexington, and even at the first farm-to-table restaurant in Paducah, where it’s branded Kentucky blue snapper and costs $21. Served with spiced yogurt, mint, or cilantro, the white fish looks and tastes like tilapia.

In December, after a warning from my wife to wear a life jacket, I set out for the waterways of Kentucky, deep in the red-state America that’s sparked no end of analysis—from best-selling memoirs such as J.D. Vance’s Hillbilly Elegy to Margaret Mead-style travelogues by coastal journalists like me—to see if it’s possible to create jobs in a place where the most plentiful resource is trash fish.

I accidentally drove past Fin Gourmet headquarters before circling back: It’s housed in a onetime barbecue joint across from an abandoned gas station. Workers in blue “American Carp” T-shirts—a joke naturalizing the foreign species—sliced fish at tables covered in guts and blood. “Seven to 9 a.m., we do bladders,” one said. Lula Luu and John Crilly, the energetic former academics who started the company, moved here from New Orleans because Paducah is near the confluence of the Ohio and Tennessee rivers, as well as Kentucky Lake, a vast reservoir created by a Tennessee Valley Authority dam, which are all rife with Asian carp.

Luu got a Ph.D. from the University of Kentucky in nutritional sciences, with a focus on health disparities in minority groups. Crilly, a former psychiatry professor at Tulane in New Orleans, has researched mental health and suicide in rural populations. In 2010 he and Luu started a New Orleans nonprofit job retraining agency. Among their clients were Vietnamese shrimpers looking for offseason fishing work. Crilly read an in-flight magazine article about some chefs’ efforts to beat back the Asian carp invasion by eating the fish, and wondered if they could be another source of income. One problem: A series of Y-shaped bones run through the filets. Crilly sliced thousands of fish himself before finding a way to remove them efficiently.

Luu and her mother had fled Vietnam in 1980. Growing up in Tennessee, Luu hated Vietnamese fish cakes, made from a paste known as surimi that’s a staple in many Asian dishes. Often loaded with MSG, the cakes upset her stomach. But when she made them from Asian carp, they were springy and fresh-tasting.

Carp became an obsession that she and Crilly juggled with their academic jobs. They sank $1.5 million in savings into a business they named Fin, for fish innovation. Skeptics told them you couldn’t make money from U.S. surimi. Chinese carp farms, which operate with little regulatory oversight and can dump wastewater straight into sewers, had the market cornered with cheap product. The shrimpers lost interest in carp after the Gulf oil spill when BP set up a compensation fund; they worried the paid work might cut into their relief income. The couple put 110,000 miles on their Toyota Camry in one year, searching for other regional fishermen and selling fish paste in Asian supermarkets and nail salons staffed with Vietnamese immigrants. They even got an audience with then-Secretary of Commerce Gary Locke, who promised to help if they prepared an “ironclad business plan.” (They completed one, but never got a call back.)

In 2014, Baird and Village Capital organized a three-month training program for agriculture startups in Louisville. Village Capital has made investments in more than 70 companies by putting entrepreneurs through these workshops, then having them rank one another in order to decide who gets funding. Luu and Crilly pitched their idea, and it was one of two winners. Baird put in $50,000, with a plan to get $150,000 back. (The deal gives him 5 percent of Fin Gourmet’s revenue until his payout reaches that target.) “If you walk into TechCrunch Disrupt,” says Baird, referring to the prominent conference, “Lula and John don’t look or talk like your average tech entrepreneur. But they’ve identified a very specific market and know what they’re doing.”
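[ed. The deal described here is a form of revenue-based financing: rather than waiting on an equity exit, the investor collects a fixed share of top-line revenue until a capped multiple is repaid. A minimal sketch of the mechanics in Python, using the figures reported above ($50,000 in, a $150,000 cap, a 5 percent revenue share) and a flat-revenue assumption that is ours, not the article’s:]

```python
def years_to_cap(annual_revenue, share=0.05, cap=150_000):
    """Estimate years until a revenue-share deal pays out its cap,
    assuming flat annual revenue (a simplification for illustration)."""
    paid, years = 0.0, 0
    while paid < cap:
        paid += share * annual_revenue  # investor's cut for the year
        years += 1
    return years, min(paid, cap)  # payments stop once the cap is hit

# At Fin Gourmet's forecast of $1.5 million in annual revenue, a
# 5 percent share pays $75,000 a year, hitting the cap in two years.
print(years_to_cap(1_500_000))  # -> (2, 150000.0)
```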

by Peter Robison, Bloomberg |  Read more:
Image: Ross Mantle for Bloomberg Businessweek

Thursday, January 19, 2017

Who Decides Who Counts as Native American?

In the fall of 2012, a 48-year-old fisherman and carver named Terry St. Germain decided to enroll his five young children as members of the Nooksack, a federally recognized Native American tribe with some 2,000 members, centered in the northwestern corner of Washington State.

He’d enrolled his two older daughters, from a previous relationship, when they were babies, but hadn’t yet filed the paperwork to make his younger children — all of whom, including a set of twins, were under 7 — official members. He saw no reason to worry about a bureaucratic endorsement of what he knew to be true. “My kids, they love being Native,” he told me.

St. Germain was a teenager when he enrolled in the tribe. For decades, he used tribal fishing rights to harvest salmon and sea urchin and Dungeness crab alongside his cousins. He had dozens of family members who were also Nooksack. His mother, according to family lore, was directly descended from a 19th-century Nooksack chief known as Matsqui George. His brother, Rudy, was the secretary of the Nooksack tribal council, which oversaw membership decisions. The process, he figured, would be so straightforward that his kids would be certified Nooksacks in time for Christmas, when the tribe gives parents a small stipend for buying gifts: “I thought it was a cut-and-dried situation.”

But after a few months, the applications had still not gone through. When Rudy asked why at a tribal council meeting, the chairman, Bob Kelly, called in the enrollment department. Its staff told Rudy that they had found a problem with the paperwork. There were missing documents; ancestors seemed to be incorrectly identified. They didn’t think Terry’s children’s claims to tribal membership could be substantiated.

At the time, Rudy and Kelly were friends, allies on the council. At the long oval table where they met to discuss Nooksack business, Rudy always sat at Kelly’s right. But the debate over whether Rudy’s family qualified as Nooksack tore them apart. Today, more than four years later, they no longer speak. Rudy and his extended family refer to Kelly as a monster and a dictator; he calls them pond scum and con artists. They agree on almost nothing, but both remember the day when things fell apart the same way. “If my nephew isn’t Nooksack,” Rudy said in the council chambers, “then neither am I.”

To Rudy, the words were an expression of shock. “It’s fighting words,” he said, to tell someone they’re not really part of their tribe. At stake were not just his family’s jobs and homes and treaty rights but also who they were and where they belonged. “I’ll still be who I am, but I won’t have proof,” Rudy said. “I’ll be labeled a non-Indian. So yeah, I take this very personally.”

To Kelly, the words were an admission of guilt, implicating not just the St. Germains but also hundreds of tribal members to whom they were related. As chairman, he felt that he had a sacred duty: to protect the tribe from invasion by a group of people that, he would eventually argue, weren’t even Native Americans. “I’m in a war,” he told me later, sketching family trees on the back of a copy of the tribe’s constitution. “This is our culture, not a game.”

The St. Germains’ rejected application proved to be a turning point for the Nooksack. Separately, the family and the council began combing through Nooksack history, which, like that of many tribes in the United States, is complicated by government efforts to extinguish, assimilate and relocate the tribe, and by a dearth of historical documents. An international border drawn across historically Nooksack lands only adds to the confusion. There were some records and even some living memories of the ancestors whose Nooksack heritage was being called into doubt. But no one could agree on what the records meant.

In January 2013, Kelly announced that, after searching through files at the Bureau of Indian Affairs office in nearby Everett, he had reason to doubt the legitimacy of more than 300 enrolled Nooksacks related to the St. Germains, all of whom claimed to descend from a woman named Annie George, born in 1875. In February, he canceled the constitutionally required council meeting, saying it would be “improper” to convene when Rudy St. Germain and another council member, Rudy’s cousin Michelle Roberts, were not eligible to be part of the tribe they’d been elected to lead. A week later, he called an executive session of the council but demanded that St. Germain and Roberts remain outside while the rest of the council voted on whether to “initiate involuntary disenrollment” for them and 304 other Nooksacks, including 37 elders. The resolution passed unanimously. “It hurt me,” Terry St. Germain said later. Even harder was watching the effect on his brother, Rudy. “It took the wind right out of him.”

Two days after the meeting, the tribal council began sending out letters notifying affected members that unless they could provide proof of their legitimacy, they would be disenrolled in 30 days. Word and shock spread quickly through the small, tight-knit reservation. The disenrollees, now calling themselves “the Nooksack 306,” hired a lawyer and vowed to contest their expulsion. “I told ’em, ‘I know where I belong no matter what you say,’ ” said an 80-year-old woman who, in her youth, had been punished for “speaking Indian” at school. “ ‘You can’t make me believe that I’m not.’ ”

The Nooksacks who want the 306 out of the tribe say they are standing up for their very identity, fighting for the integrity of a tribe taken over by outsiders. “We’re ready to die for this,” Kelly would later say. “And I think we will, before this is over.”

Outside the lands legally known as “Indian Country,” “membership” and “enrollment” are such blandly bureaucratic words that it’s easy to lose sight of how much they matter there. To the 566 federally recognized tribal nations, the ability to determine who is and isn’t part of a tribe is an essential element of what makes tribes sovereign entities. To individuals, membership means citizenship and all the emotional ties and treaty rights that come with it. To be disenrolled is to lose that citizenship: to become stateless. It can also mean the loss of a broader identity, because recognition by a tribe is the most accepted way to prove you are Indian — not just Nooksack but Native American at all.

Efforts to define Native American identity date from the earliest days of the colonies. Before the arrival of white settlers, tribal boundaries were generally fluid; intermarriages and alliances were common. But as the new government’s desire to expand into Indian Territory grew, so, too, did the interest in defining who was and who wasn’t a “real Indian.” Those definitions shifted as the colonial government’s goals did. “Mixed blood” Indians, for example, were added to rolls in hopes that assimilated Indians would be more likely to cede their land; later, after land claims were established, more restrictive definitions were adopted. In the 19th century, the government began relying heavily on blood quantum, or “degree of Indian blood,” wagering that, over generations of intermarriage, tribes would be diluted to the point that earlier treaties would not have to be honored. “ ‘As long as grass grows or water runs’ — a phrase that was often used in treaties with American Indians — is a relatively permanent term for a contract,” the Ojibwe author David Treuer wrote in a 2011 Op-Ed for The Times. “ ‘As long as the blood flows’ seemed measurably shorter.”

by Brooke Jarvis, NY Times |  Read more:
Image: Peter van Agtmael/Magnum, for The New York Times

The New Monarchy

‘He Has This Deep Fear That He Is Not a Legitimate President’

In the days immediately after the election that shocked the world, POLITICO Magazine convened the group of people who know Donald J. Trump better than anyone outside his family. We asked his biographers the questions that were on everyone’s mind: What happens next? Will the unabashedly self-promoting and self-obsessed businessman transform himself into a selfless and dignified president of the nation he was elected to lead?

Now, after more than two months of Trump’s norm-shattering transition, we gathered Gwenda Blair, Michael D’Antonio and Tim O’Brien by conference call (Wayne Barrett, the dean of Trump reporters, could not participate because of illness) to assess whether Trump has continued to surprise them. Their collective wisdom? In a word, no.

From his pick of nominees for posts in his cabinet to his belligerent use of Twitter (our conversation was a day before he traded barbs with Congressman John Lewis) to his unwillingness to cut ties with his business to avoid conflicts of interest, they see the same person they’ve always seen—the consummate classroom troublemaker; a vain, insecure bully; and an anti-institutional schemer, as adept at “gaming the system” as he is unashamed. As they look ahead to his inauguration speech in two days, and to his administration beyond, they feel confident predicting that he will run the country much as he has run his company. For himself.

“He’s not going to be that concerned with the actual competent administration of the government,” D’Antonio said. “It’s going to be what he seems to be gaining or losing in public esteem. So almost like a monarch. The figurehead who rallies people and gets credit for things.” (...)

Kruse: Michael, in your book, and other places, too, he has talked about how much he enjoys fighting. And he certainly fought a lot of people throughout the campaign, and he hasn’t stopped fighting. From Meryl Streep to the intelligence community, he’s still picking fights. Do you think he is going to pick fights with leaders of other countries? In other words, is there any indication that he would be able to separate the interests of the country now from his own personal pique?

Blair: Zero.

O’Brien: Absolutely not. There will be no divide there. The whole thing has been a vanity show from the second he ran to the Republican Convention. I think we can expect to see the same on Inauguration Day. He’s been unable to find a clean division between his own emotional needs and his own insecurities and simply being a healthy, strategically committed leader who wants to parse through good policy options and a wide series of public statements about the direction in which he’ll take the country.

Blair: There’s a fusion, I think, of his childhood, an emphasis on being combative, being killers—as his dad famously instructed his boys to be—but also, I think, his own competitive nature, and then his grasp in early adulthood that being a bully and really putting it to other people and not backing down often works. He also had his church background telling him that being a success was the most important thing and that got fused with the sort of ‘You want a crowd to show up, start a fight,’ P.T. Barnum-type thing early on in his career. And then Roy Cohn as a mentor, a guy who stood for cold-eyed calculus about how bullying people works. And you put all of those pieces together, that he’s been doing this his whole life, and I don’t see a single reason for him to back down. He’s going to go full blast ahead with that.

O’Brien: His father and Roy Cohn, those are the two most singular influences on his whole life, and they provided him with a militarized, transactional view of human relationships, business dealings and the law. And he’s going to carry all of that stuff and all of that baggage with him into the White House.

D’Antonio: Those early influences are essential, and I also think it’s correct that he has been conducting his entire life as a vanity show, and he’s been rewarded, most recently since his reality TV show, by ever-greater public interest in him. This is a guy who is a president-elect who describes himself as a ratings machine, which is an absolutely absurd thing for a president to be reflecting on, but that matters to him.

But one thing I think that we have overlooked as we see Trump trying to delegitimize others is what I suspect is a feeling he has inside that nothing he’s ever achieved himself has ever been legitimate. This is a person who has never known whether anybody wants to be around him because he’s a person they want to be around or they want to be around his money. And since he’s promoted himself as this glamorous, incredibly wealthy person, that’s the draw he’s always given. So he doesn’t know if he has any legitimate relationships outside of his family, and that’s why he emphasizes family. … He’s always kind of gaming the system—not, in my view, winning on the merits. And even his election was with almost 3 million fewer votes than his opponent. So he has this deep fear that he is himself not a legitimate president, and I think that’s why he goes to such great lengths to delegitimize even the intelligence community, which is the president’s key resource in security, and he’s going to do this demeaning and delegitimizing behavior rather than accept what they have to tell him. (...)

D’Antonio: I think Donald Trump measures himself by the number of norms that he can violate. The more he can get away with, the more he can thumb his nose at convention, the more powerful he feels.

O’Brien: He’s a profoundly anti-institutional person, and I think that’s part of his great appeal to voters. Voters right now are sick of institutions, and he’s got no problem railing against them. I think the danger here is he’s completely ill-informed and lacks, I think, the generosity of public spirit to think about what the right replacements should be for the same institutions that he’s railing against.

by Michael Kruse, Politico |  Read more: