Saturday, January 21, 2017

The Trouble with Quantum Mechanics

The development of quantum mechanics in the first decades of the twentieth century came as a shock to many physicists. Today, despite the great successes of quantum mechanics, arguments continue about its meaning, and its future.


The first shock came as a challenge to the clear categories to which physicists by 1900 had become accustomed. There were particles—atoms, and then electrons and atomic nuclei—and there were fields—conditions of space that pervade regions in which electric, magnetic, and gravitational forces are exerted. Light waves were clearly recognized as self-sustaining oscillations of electric and magnetic fields. But in order to understand the light emitted by heated bodies, Albert Einstein in 1905 found it necessary to describe light waves as streams of massless particles, later called photons.

Then in the 1920s, according to theories of Louis de Broglie and Erwin Schrödinger, it appeared that electrons, which had always been recognized as particles, under some circumstances behaved as waves. In order to account for the energies of the stable states of atoms, physicists had to give up the notion that electrons in atoms are little Newtonian planets in orbit around the atomic nucleus. Electrons in atoms are better described as waves, fitting around the nucleus like sound waves fitting into an organ pipe.1 The world’s categories had become all muddled.

Worse yet, the electron waves are not waves of electronic matter, in the way that ocean waves are waves of water. Rather, as Max Born came to realize, the electron waves are waves of probability. That is, when a free electron collides with an atom, we cannot in principle say in what direction it will bounce off. The electron wave, after encountering the atom, spreads out in all directions, like an ocean wave after striking a reef. As Born recognized, this does not mean that the electron itself spreads out. Instead, the undivided electron goes in some one direction, but not a precisely predictable direction. It is more likely to go in a direction where the wave is more intense, but any direction is possible.

Probability was not unfamiliar to the physicists of the 1920s, but it had generally been thought to reflect an imperfect knowledge of whatever was under study, not an indeterminism in the underlying physical laws. Newton’s theories of motion and gravitation had set the standard of deterministic laws. When we have reasonably precise knowledge of the location and velocity of each body in the solar system at a given moment, Newton’s laws tell us with good accuracy where they will all be for a long time in the future. Probability enters Newtonian physics only when our knowledge is imperfect, as for example when we do not have precise knowledge of how a pair of dice is thrown. But with the new quantum mechanics, the moment-to-moment determinism of the laws of physics themselves seemed to be lost.

All very strange. In a 1926 letter to Born, Einstein complained:
Quantum mechanics is very impressive. But an inner voice tells me that it is not yet the real thing. The theory produces a good deal but hardly brings us closer to the secret of the Old One. I am at all events convinced that He does not play dice.2
As late as 1964, in his Messenger lectures at Cornell, Richard Feynman lamented, “I think I can safely say that no one understands quantum mechanics.”3 With quantum mechanics, the break with the past was so sharp that all earlier physical theories became known as “classical.”

The weirdness of quantum mechanics did not matter for most purposes. Physicists learned how to use it to do increasingly precise calculations of the energy levels of atoms, and of the probabilities that particles will scatter in one direction or another when they collide. Lawrence Krauss has labeled the quantum mechanical calculation of one effect in the spectrum of hydrogen “the best, most accurate prediction in all of science.”4 Beyond atomic physics, early applications of quantum mechanics listed by the physicist Gino Segrè included the binding of atoms in molecules, the radioactive decay of atomic nuclei, electrical conduction, magnetism, and electromagnetic radiation.5 Later applications spanned theories of semiconductivity and superconductivity, white dwarf stars and neutron stars, nuclear forces, and elementary particles. Even the most adventurous modern speculations, such as string theory, are based on the principles of quantum mechanics.

Many physicists came to think that the reaction of Einstein and Feynman and others to the unfamiliar aspects of quantum mechanics had been overblown. This used to be my view. After all, Newton’s theories too had been unpalatable to many of his contemporaries. Newton had introduced what his critics saw as an occult force, gravity, which was unrelated to any sort of tangible pushing and pulling, and which could not be explained on the basis of philosophy or pure mathematics. Also, his theories had renounced a chief aim of Ptolemy and Kepler, to calculate the sizes of planetary orbits from first principles. But in the end the opposition to Newtonianism faded away. Newton and his followers succeeded in accounting not only for the motions of planets and falling apples, but also for the movements of comets and moons and the shape of the earth and the change in direction of its axis of rotation. By the end of the eighteenth century this success had established Newton’s theories of motion and gravitation as correct, or at least as a marvelously accurate approximation. Evidently it is a mistake to demand too strictly that new physical theories should fit some preconceived philosophical standard.

In quantum mechanics the state of a system is not described by giving the position and velocity of every particle and the values and rates of change of various fields, as in classical physics. Instead, the state of any system at any moment is described by a wave function, essentially a list of numbers, one number for every possible configuration of the system.6 If the system is a single particle, then there is a number for every possible position in space that the particle may occupy. This is something like the description of a sound wave in classical physics, except that for a sound wave a number for each position in space gives the pressure of the air at that point, while for a particle in quantum mechanics the wave function’s number for a given position reflects the probability that the particle is at that position. What is so terrible about that? Certainly, it was a tragic mistake for Einstein and Schrödinger to step away from using quantum mechanics, isolating themselves in their later lives from the exciting progress made by others.


Even so, I’m not as sure as I once was about the future of quantum mechanics. It is a bad sign that those physicists today who are most comfortable with quantum mechanics do not agree with one another about what it all means. The dispute arises chiefly regarding the nature of measurement in quantum mechanics. This issue can be illustrated by considering a simple example, measurement of the spin of an electron. (A particle’s spin in any direction is a measure of the amount of rotation of matter around a line pointing in that direction.)

All theories agree, and experiment confirms, that when one measures the amount of spin of an electron in any arbitrarily chosen direction there are only two possible results. One possible result will be equal to a positive number, a universal constant of nature. (This is the constant that Max Planck originally introduced in his 1900 theory of heat radiation, denoted h, divided by 4π.) The other possible result is its opposite, the negative of the first. These positive or negative values of the spin correspond to an electron that is spinning either clockwise or counter-clockwise in the chosen direction.

But it is only when a measurement is made that these are the sole two possibilities. An electron spin that has not been measured is like a musical chord, formed from a superposition of two notes that correspond to positive or negative spins, each note with its own amplitude. Just as a chord creates a sound distinct from each of its constituent notes, the state of an electron spin that has not yet been measured is a superposition of the two possible states of definite spin, the superposition differing qualitatively from either state. In this musical analogy, the act of measuring the spin somehow shifts all the intensity of the chord to one of the notes, which we then hear on its own.

This can be put in terms of the wave function. If we disregard everything about an electron but its spin, there is not much that is wavelike about its wave function. It is just a pair of numbers, one number for each sign of the spin in some chosen direction, analogous to the amplitudes of each of the two notes in a chord.7 The wave function of an electron whose spin has not been measured generally has nonzero values for spins of both signs.

There is a rule of quantum mechanics, known as the Born rule, that tells us how to use the wave function to calculate the probabilities of getting various possible results in experiments. For example, the Born rule tells us that the probabilities of finding either a positive or a negative result when the spin in some chosen direction is measured are proportional to the squares of the numbers in the wave function for those two states of the spin.8
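The Born rule for a spin is simple enough to sketch in a few lines of code. This example is not from Weinberg's essay; the amplitude values are hypothetical, chosen only to illustrate how two wave-function numbers become two probabilities:

```python
def born_probabilities(amp_up, amp_down):
    """Apply the Born rule to a spin-1/2 wave function, which is just
    two complex amplitudes (one per sign of the spin along the chosen
    direction).  Probabilities are proportional to the squared
    magnitudes of the amplitudes, normalized to sum to one."""
    norm = abs(amp_up) ** 2 + abs(amp_down) ** 2
    return abs(amp_up) ** 2 / norm, abs(amp_down) ** 2 / norm

# Hypothetical amplitudes for illustration only.
p_up, p_down = born_probabilities(0.6, 0.8)
print(p_up, p_down)  # 0.36 and 0.64 for these amplitudes
```

Note that the amplitudes themselves may be complex numbers; only their squared magnitudes, which are real and non-negative, enter the probabilities.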

The introduction of probability into the principles of physics was disturbing to past physicists, but the trouble with quantum mechanics is not that it involves probabilities. We can live with that. The trouble is that in quantum mechanics the way that wave functions change with time is governed by an equation, the Schrödinger equation, that does not involve probabilities. It is just as deterministic as Newton’s equations of motion and gravitation. That is, given the wave function at any moment, the Schrödinger equation will tell you precisely what the wave function will be at any future time. There is not even the possibility of chaos, the extreme sensitivity to initial conditions that is possible in Newtonian mechanics. So if we regard the whole process of measurement as being governed by the equations of quantum mechanics, and these equations are perfectly deterministic, how do probabilities get into quantum mechanics?
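The determinism of the Schrödinger equation can be made concrete with a toy two-state system. The Hamiltonian below (a spin flip, with ħ set to 1) is an assumption chosen for illustration, not anything specified in the essay; the point is only that, given the amplitudes now, the amplitudes at any later time are fixed exactly, with no probabilities involved:

```python
import math

def evolve_spin(amp_up, amp_down, t):
    """Deterministic Schrodinger evolution of a two-amplitude spin
    state under an illustrative spin-flip Hamiltonian H = sigma_x
    (hbar = 1).  The evolution operator is
        U(t) = [[cos t, -i sin t], [-i sin t, cos t]],
    so the state at any time t follows exactly from the state at t = 0."""
    c, s = math.cos(t), math.sin(t)
    return (c * amp_up - 1j * s * amp_down,
            -1j * s * amp_up + c * amp_down)

# Start in the definite spin-up state; after t = pi/2 the state is
# entirely spin-down (up to an overall phase), with certainty.
up, down = evolve_spin(1.0, 0.0, math.pi / 2)
```

Because the evolution is unitary, the total probability `|amp_up|^2 + |amp_down|^2` stays exactly 1 at every time, which is why the deterministic equation by itself never tells us where the probabilities of measurement outcomes come from.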

One common answer is that, in a measurement, the spin (or whatever else is measured) is put in an interaction with a macroscopic environment that jitters in an unpredictable way. For example, the environment might be the shower of photons in a beam of light that is used to observe the system, as unpredictable in practice as a shower of raindrops. Such an environment causes the superposition of different states in the wave function to break down, leading to an unpredictable result of the measurement. (This is called decoherence.) It is as if a noisy background somehow unpredictably left only one of the notes of a chord audible. But this begs the question. If the deterministic Schrödinger equation governs the changes through time not only of the spin but also of the measuring apparatus and the physicist using it, then the results of measurement should not in principle be unpredictable. So we still have to ask, how do probabilities get into quantum mechanics?

One response to this puzzle was given in the 1920s by Niels Bohr, in what came to be called the Copenhagen interpretation of quantum mechanics. According to Bohr, in a measurement the state of a system such as a spin collapses to one result or another in a way that cannot itself be described by quantum mechanics, and is truly unpredictable. This answer is now widely felt to be unacceptable. There seems no way to locate the boundary between the realms in which, according to Bohr, quantum mechanics does or does not apply. As it happens, I was a graduate student at Bohr’s institute in Copenhagen, but he was very great and I was very young, and I never had a chance to ask him about this.

Today there are two widely followed approaches to quantum mechanics, the “realist” and “instrumentalist” approaches, which view the origin of probability in measurement in two very different ways.9 For reasons I will explain, neither approach seems to me quite satisfactory.10

by Steven Weinberg, NYRB | Read more:
Image: Eric J. Heller

John James Audubon, Louisiana Heron (1834)

Showering with Spiders

One cold morning last autumn, with the shower’s hot, steamy water pleasantly pelting my neck and shoulders, I glanced up and noticed a spider hanging in the corner above my head—a quivering, spindly, brown spider. I’m not a spider aficionado, but I do know about poisonous spiders in our area of the Pacific Northwest: the hobo spider and the black widow. My shower companion was neither. A daddy longlegs, I deduced, Pholcus phalangioides to be precise, minding its own business near the showerhead.

Daddy longlegs spiders build messy webs with no particular pattern to them, and they eat insects, mites, and other spiders, including the poisonous hobo (and, I’m sorry to say, sometimes each other). They like ceiling corners and warmer spaces, so the beige fiberglass tub/shower combination in our twenty-plus-year-old home made a comfortable spot for spider settlement.

I was in a hurry, so I finished my shower and thought no more about the long-legged wall hugger. The next morning, as I shoved back the shower curtain and stepped into the tub, there it was again. Or still. How long had this creature lived in my bathroom without my noticing? Maybe for months, possibly longer. How long is that in spider time? With a life-span of two or three years, this arachnid may have inhabited the space for a third of its life or more. In a way, the spider had greater claim to the shower than I had. In terms of the percentages of our lives spent in the place, I was the newcomer, and if I cleared the web, I’d be the one driving out the longtime inhabitant. Besides, I could shower quite comfortably with or without him. Or her.

And so began my conscious choice to shower with spiders. It’s a small thing, one might say a silly and meaningless thing. We spend maybe ten minutes together each morning, all told, more time than some busy working couples spend in conversation each day. I have found that I’m strangely appreciative of our benign interspecies companionship during my morning routine. I’m required to do nothing special, except be quietly mindful of another being inhabiting my space in an unfathomable way. If I notice my itsy-bitsy neighbor slowly lowering itself from the ceiling toward the shower stall while I’m there, I’ll shake my hand to splash a bit of water as a warning. The spider, being mindful too, will vibrate for a moment, and then either stop and crouch with its belly close to the wall, or quick-step back up toward the ceiling. We have an understanding, the spider and I: do no harm.

by Victoria Doerper, Orion |  Read more:
Image: James Wardell
[ed. I stomped on a daddy longlegs this morning while taking a shower, even though I know they're harmless. It was invading my space. Won't do that again (as long as there's some mutual accommodation).]

‘A Cat in Hell’s Chance’ – Why We’re Losing the Battle to Keep Global Warming Below 2C

It all seemed so simple in 2008. All we had was financial collapse, a cripplingly high oil price and global crop failures due to extreme weather events. In addition, my climate scientist colleague Dr Viki Johnson and I worked out that we had about 100 months before it would no longer be “likely” that global average surface temperatures could be held below a 2C rise, compared with pre-industrial times.

What’s so special about 2C? The simple answer is that it is a target that could be politically agreed on the international stage. It was first suggested in 1975 by the environmental economist William Nordhaus as an upper threshold beyond which we would arrive at a climate unrecognisable to humans. In 1990, the Stockholm Environment Institute recommended 2C as the maximum that should be tolerated, but noted: “Temperature increases beyond 1C may elicit rapid, unpredictable and non-linear responses that could lead to extensive ecosystem damage.”

To date, temperatures have risen by almost 1C since 1880. The effects of this warming are already being observed in melting ice, ocean levels rising, worse heat waves and other extreme weather events. There are negative impacts on farming, the disruption of plant and animal species on land and in the sea, extinctions, the disturbance of water supplies and food production and increased vulnerability, especially among people in poverty in low-income countries. But effects are global. So 2C was never seen as necessarily safe, just a guardrail between dangerous and very dangerous change.

To get a sense of what a 2C shift can do, just look in Earth’s rear-view mirror. When the planet was 2C colder than during the industrial revolution, we were in the grip of an ice age and a mile-thick North American ice sheet reached as far south as New York. The same warming again will intensify and accelerate human-driven changes already under way and has been described by James Hansen, one of the first scientists to call global attention to climate change, as a “prescription for long-term disaster”, including an ice-free Arctic. (...)

Is it still likely that we will stay below even 2C? In the 100 months since August 2008, I have been writing a climate-change diary for the Guardian to raise questions and monitor progress, or the lack of it, on climate action. To see how well we have fared, I asked a number of leading climate scientists and analysts for their views. The responses were as bracing as a bath in a pool of glacial meltwater.

by Andrew Simms, The Guardian |  Read more:

Humanism, Science, and the Radical Expansion of the Possible

Humanism was the particular glory of the Renaissance. The recovery, translation, and dissemination of the literatures of antiquity created a new excitement, displaying so vividly the accomplishments and therefore the capacities of humankind, with consequences for civilization that are great beyond reckoning.

The disciplines that came with this awakening, the mastery of classical languages, the reverent attention to pagan poets and philosophers, the study of ancient history, and the adaptation of ancient forms to modern purposes, all bore the mark of their origins yet served as the robust foundation of education and culture for centuries, until the fairly recent past. In muted, expanded, and adapted forms, these Renaissance passions live on among us still in the study of the humanities, which, we are told, are now diminished and threatened. Their utility is in question, it seems, despite their having been at the center of learning throughout the period of the spectacular material and intellectual flourishing of Western civilization. Now we are less interested in equipping and refining thought, more interested in creating and mastering technologies that will yield measurable enhancements of material well-being—for those who create and master them, at least. Now we are less interested in the exploration of the glorious mind, more engrossed in the drama of staying ahead of whatever it is we think is pursuing us. Or perhaps we are just bent on evading the specter of entropy. In any case, the spirit of the times is one of joyless urgency, many of us preparing ourselves and our children to be means to inscrutable ends that are utterly not our own. In such an environment, the humanities do seem to have little place. They are poor preparation for economic servitude. This spirit is not the consequence but the cause of our present state of affairs. We have as good grounds for exulting in human brilliance as any generation that has ever lived.

The antidote to our gloom is to be found in contemporary science. This may seem an improbable stance from which to defend the humanities, and I do not wish to undervalue contemporary art or literature or music or philosophy. But it is difficult to recognize the genius of a period until it has passed. Milton, Bach, Mozart all suffered long periods of eclipse, beginning before their lives had ended. Our politics may appear in the light of history to have been filled with triumphs of statecraft, unlikely as this seems to us now. Science, on the other hand, can assert credible achievements and insights, however tentative, in present time. The last century and the beginning of this one have without question transformed the understanding of Being itself. “Understanding” is not quite the right word, since this mysterious old category, Being, fundamental to all experience past, present, and to come, is by no means understood. However, the terms in which understanding may, at the moment, be attempted have changed radically, and this in itself is potent information. The phenomenon called quantum entanglement, relatively old as theory and thoroughly demonstrated as fact, raises fundamental questions about time and space, and therefore about causality.

Particles that are “entangled,” however distant from one another, undergo the same changes simultaneously. This fact challenges our most deeply embedded habits of thought. To try to imagine any event occurring outside the constraints of locality and sequence is difficult enough. Then there is the problem of conceiving of a universe in which the old rituals of cause and effect seem a gross inefficiency beside the elegance and sleight of hand that operate discreetly beyond the reach of all but the most rarefied scientific inference and observation. However pervasive and robust entanglement is or is not, it implies a cosmos that unfolds or emerges on principles that bear scant analogy to the universe of common sense. It is abetted in this by string theory, which adds seven unexpressed dimensions to our familiar four. And, of course, those four seem suddenly tenuous when the fundamental character of time and space is being called into question. Mathematics, ontology, and metaphysics have become one thing. Einstein’s universe seems mechanistic in comparison. Newton’s, the work of a tinkerer. If Galileo shocked the world by removing the sun from its place, so to speak, then this polyglot army of mathematicians and cosmologists who offer always new grounds for new conceptions of absolute reality should dazzle us all, freeing us at last from the circle of old Urizen’s compass. But we are not free.

There is no art or discipline for which the nature of reality is a matter of indifference, so one ontology or another is always being assumed if not articulated. Great questions may be as open now as they have been since Babylonians began watching the stars, but certain disciplines are still deeply invested in a model of reality that is as simple and narrow as ideological reductionism can make it. I could mention a dominant school of economics with its anthropology. But I will instead consider science of a kind. The study of brain and consciousness, mind and self—associated with so-called neuroscience—asserts a model of mental function as straightforward, cau­sally speaking, as a game of billiards, and plumes itself on just this fact. It is by no means entangled with the sciences that address ontology. The most striking and consequential changes in the second of these, ontology, bring about no change at all in the first, neuroscience, either simultaneous or delayed. The gist of neuroscience is that the adverbs “simply” and “merely” can exorcise the mystifications that have always surrounded the operations of the mind/brain, exposing the machinery that in fact produces emotion, behavior, and all the rest. So while inquiries into the substance of reality reveal further subtleties, idioms of relation that are utterly new to our understanding, neuroscience tells us that the most complex object we know of, the human brain, can be explained sufficiently in terms of the activation of “packets of neurons,” which evolution has provided the organism in service to homeostasis. The amazing complexity of the individual cell is being pored over in other regions of science, while neuroscience persists in declaring the brain, this same complexity vastly compounded, an essentially simple thing. 
If this could be true, if this most intricate and vital object could be translated into an effective simplicity for which the living world seems to provide no analogy, this indeed would be one of nature’s wonders. (...)

The real assertion being made in all this (neuroscience is remarkable among the sciences for its tendency to bypass hypothesis and even theory and go directly to assertion) is that there is no soul. Only the soul is ever claimed to be nonphysical, therefore immortal, therefore sacred and sanctifying as an aspect of human being. It is the self but stands apart from the self. It suffers injuries of a moral kind, when the self it is and is not lies or steals or murders, but it is untouched by the accidents that maim the self or kill it. Obviously, this intuition—it is much richer and deeper than anything conveyed by the word “belief”—cannot be dispelled by proving the soul’s physicality, from which it is aloof by definition. And on these same grounds, its nonphysicality is no proof of its nonexistence. This might seem a clever evasion of skepticism if the character of the soul were not established in remote antiquity, in many places and cultures, long before such a thing as science was brought to bear on the question. (...)

Is it fair to say that this school of thought is directed against humanism? This seems on its face to be true. The old humanists took the works of the human mind—literature, music, philosophy, art, and languages—as proof of what the mind is and might be. Out of this has come the great aura of brilliance and exceptionalism around our species that neuroscience would dispel. If Shakespeare had undergone an MRI, there is no reason to believe there would be any more evidence of extraordinary brilliance in him than there would be of a self or a soul. He left a formidable body of evidence that he was both brilliant and singular, but it has fallen under the rubric of Renaissance drama and is somehow not germane, perhaps because this places the mind so squarely at the center of the humanities. From the neuroscientific point of view, this only obscures the question. After all, where did our high sense of ourselves come from? From what we have done and what we do. And where is this awareness preserved and enhanced? In the arts and the humane disciplines. I am sure there are any number of neuroscientists who know and love Mozart better than I do, and who find his music uplifting. The inconsistency is for them to explain. (...)

If there is a scientific mode of thought that is crowding out and demoralizing the humanities, it is not research in the biology of the cell or the quest for life on other planets. It is this neo-Darwinism, which claims to cut through the dense miasmas of delusion to what is mere, simple, and real. Since these “miasmas” have been the main work of human consciousness for as long as the mind has left a record of itself, its devaluing is a major work of dehumanization. This is true because it is the great measure of our distinctiveness as a species. It is what we know about ourselves. It has everything in the world to do with how we think and feel, with what we value or despise or fear, all these things refracted through cultures and again through families and individuals. If the object of neuroscience or neo-Darwinism was to describe an essential human nature, it would surely seek confirmation in history and culture. But these things are endlessly complex, and they are continually open to variation and disruption. So the insistence on an essential simplicity is understandable, if it is not fruitful. If I am correct in seeing neuroscience as essentially neo-Darwinist, then it is affixed to a model of reality that has not gone through any meaningful change in a century, except in the kind of machinery it brings to bear in asserting its worldview. (...)

That said, it might be time to pause and reflect. Holding to the old faith that everything is in principle knowable or comprehensible by us is a little like assuming that every human structure or artifact must be based on yards, feet, and inches. The notion that the universe is constructed, or we are evolved, so that reality must finally answer in every case to the questions we bring to it, is entirely as anthropocentric as the notion that the universe was designed to make us possible. Indeed, the affinity between the two ideas should be acknowledged. While the assumption of the intelligibility of the universe is still useful, it is not appropriately regarded as a statement of doctrine, and should never have been. Science of the kind I criticize tends to assert that everything is explicable, that whatever has not been explained will be explained—and, furthermore, by its methods. Its practitioners have seen to the heart of it all. So mystery is banished—mystery being no more than whatever their methods cannot capture yet. Mystery being also those aspects of reality whose implications are not always factors in their worldview, for example, the human mind, the human self, history, and religion—in other words, the terrain of the humanities. Or of the human.

by Marilynne Robinson, The Nation |  Read more:
Image: Kelly Ruth Winter/ The Nation
[ed. This essay is excerpted from The Givenness of Things, © Marilynne Robinson.]

Friday, January 20, 2017

Brazilian Girls

Get Rich. Save the World. Gut Fish.

Venture capitalist Ross Baird, 32, has red hair and an open face that calls to mind Happy Days-era Ron Howard. He’s one of those preternaturally mature millennials who already has a developed philosophy, glossy academic credentials, and financial backing from important people for his fund, Village Capital. In high school at Phillips Exeter Academy, Mark Zuckerberg was the dormitory proctor who set up his e-mail. Plus, Baird wants to save the world while getting rich. All very Silicon Valley.

But the rule of Sand Hill Road (that’s shorthand for the Menlo Park, Calif., epicenter of tech VC) is to invest widely in nouvelle concepts, hoping that one will be at least a “ten-bagger” (posting a return 10 times the investment). Baird, however, typically invests in unsexy ideas that he hopes will be three-baggers, often in agriculture, energy, and health care. Venture capitalists fixated on finding the next Snapchat put 85 percent of their $50 billion in funding last year into states that voted for Hillary Clinton, most of it in California, Massachusetts, and New York. Meanwhile, for the past seven years, Baird has been doggedly finding and developing successful businesses in the downtrodden places whose economic distress ultimately helped elect Donald Trump. (...)

Baird is especially excited about Fin Gourmet Foods, a company in Paducah, Ky., that buys invasive Asian carp from local fishermen and turns it into boneless filets for gourmet restaurants and fish paste for Asian supermarkets. Asian carp is best known as the biggest threat to the ecosystem of the Great Lakes; the federal government just earmarked $42 million to combat the species. The youngest fish eat their body weight daily, outcompeting bass for plankton, leaving sport fishermen in fear of economic ruin. Asian carp grow into 70-pounders known to jump as high as 10 feet: There’s a wide selection of videos on YouTube of these leaping monsters terrifying—and occasionally injuring—boaters. And because the fish are full of bones that make them hard to eat without meticulous processing, they fetch a third the wholesale price of catfish.

Despite that, Fin Gourmet forecasts revenue will rise to more than $1.5 million this year from $320,000 in 2016. “They’re growing like crazy, the profit margins are good, and they’re taking something out of the environment that’s bad and turning it into something that people want to pay for,” Baird says. The couple who founded the company draw their workforce from the ranks of “people who need second chances from incarceration, drug courts, domestic violence,” according to the company’s website. One foundation dubbed Fin Gourmet “the future Zappos of fish processing” for its community-minded approach. Boneless filets from Asian carp have started appearing on menus in Louisville and Lexington, and even at the first farm-to-table restaurant in Paducah, where it’s branded Kentucky blue snapper and costs $21. Served with spiced yogurt, mint, or cilantro, the white fish looks and tastes like tilapia.

In December, after a warning from my wife to wear a life jacket, I set out for the waterways of Kentucky, deep in the red-state America that’s sparked no end of analysis—from best-selling memoirs such as J.D. Vance’s Hillbilly Elegy to Margaret Mead-style travelogues by coastal journalists like me—to see if it’s possible to create jobs in a place where the most plentiful resource is trash fish.

I accidentally drove past Fin Gourmet headquarters before circling back: It’s housed in a onetime barbecue joint across from an abandoned gas station. Workers in blue “American Carp” T-shirts—a joke naturalizing the foreign species—sliced fish at tables covered in guts and blood. “Seven to 9 a.m., we do bladders,” one said. Lula Luu and John Crilly, the energetic former academics who started the company, moved here from New Orleans because Paducah is near the confluence of the Ohio and Tennessee rivers, as well as Kentucky Lake, a vast reservoir created by a Tennessee Valley Authority dam, which are all rife with Asian carp.

Luu got a Ph.D. from the University of Kentucky in nutritional sciences, with a focus on health disparities in minority groups. Crilly, a former psychiatry professor at Tulane in New Orleans, has researched mental health and suicide in rural populations. In 2010 he and Luu started a New Orleans nonprofit job retraining agency. Among their clients were Vietnamese shrimpers looking for offseason fishing work. Crilly read an in-flight magazine article about some chefs’ efforts to beat back the Asian carp invasion by eating the fish, and wondered if they could be another source of income. One problem: A series of Y-shaped bones run through the filets. Crilly sliced thousands of fish himself before finding a way to remove them efficiently.

Luu and her mother had fled Vietnam in 1980. Growing up in Tennessee, Luu hated Vietnamese fish cakes, made from a paste known as surimi that’s a staple in many Asian dishes. Often loaded with MSG, the cakes upset her stomach. But when she made them from Asian carp, they were springy and fresh-tasting.

Carp became an obsession that she and Crilly juggled with their academic jobs. They sank $1.5 million in savings into a business they named Fin, for fish innovation. Skeptics told them you couldn’t make money from U.S. surimi. Chinese carp farms, which operate with little regulatory oversight and can dump wastewater straight into sewers, had the market cornered with cheap product. The shrimpers lost interest in carp after the Gulf oil spill when BP set up a compensation fund; they worried the paid work might cut into their relief income. The couple put 110,000 miles on their Toyota Camry in one year, searching for other regional fishermen and selling fish paste in Asian supermarkets and nail salons staffed with Vietnamese immigrants. They even got an audience with then-Secretary of Commerce Gary Locke, who promised to help if they prepared an “ironclad business plan.” (They completed one, but never got a call back.)

In 2014, Baird and Village Capital organized a three-month training program for agriculture startups in Louisville. Village Capital has made investments in more than 70 companies by putting entrepreneurs through these workshops, then having them rank one another in order to decide who gets funding. Luu and Crilly pitched their idea, and it was one of two winners. Baird put in $50,000, with a plan to get $150,000 back. (The deal gives him 5 percent of Fin Gourmet’s revenue until it reaches that target.) “If you walk into TechCrunch Disrupt,” Baird says, referring to the prominent conference, “Lula and John don’t look or talk like your average tech entrepreneur. But they’ve identified a very specific market and know what they’re doing.”

by Peter Robison, Bloomberg |  Read more:
Image: Ross Mantle for Bloomberg Businessweek

Thursday, January 19, 2017

Who Decides Who Counts as Native American?

In the fall of 2012, a 48-year-old fisherman and carver named Terry St. Germain decided to enroll his five young children as members of the Nooksack, a federally recognized Native American tribe with some 2,000 members, centered in the northwestern corner of Washington State.

He’d enrolled his two older daughters, from a previous relationship, when they were babies, but hadn’t yet filed the paperwork to make his younger children — all of whom, including a set of twins, were under 7 — official members. He saw no reason to worry about a bureaucratic endorsement of what he knew to be true. “My kids, they love being Native,” he told me.

St. Germain was a teenager when he enrolled in the tribe. For decades, he used tribal fishing rights to harvest salmon and sea urchin and Dungeness crab alongside his cousins. He had dozens of family members who were also Nooksack. His mother, according to family lore, was directly descended from a 19th-century Nooksack chief known as Matsqui George. His brother, Rudy, was the secretary of the Nooksack tribal council, which oversaw membership decisions. The process, he figured, would be so straightforward that his kids would be certified Nooksacks in time for Christmas, when the tribe gives parents a small stipend for buying gifts: “I thought it was a cut-and-dried situation.”

But after a few months, the applications had still not gone through. When Rudy asked why, at a tribal council meeting, the chairman, Bob Kelly, called in the enrollment department. They told Rudy that they had found a problem with the paperwork. There were missing documents; ancestors seemed to be incorrectly identified. They didn’t think Terry’s children’s claims to tribal membership could be substantiated.

At the time, Rudy and Kelly were friends, allies on the council. At the long oval table where they met to discuss Nooksack business, Rudy always sat at Kelly’s right. But the debate over whether Rudy’s family qualified as Nooksack tore them apart. Today, more than four years later, they no longer speak. Rudy and his extended family refer to Kelly as a monster and a dictator; he calls them pond scum and con artists. They agree on almost nothing, but both remember the day when things fell apart the same way. “If my nephew isn’t Nooksack,” Rudy said in the council chambers, “then neither am I.”

To Rudy, the words were an expression of shock. “It’s fighting words,” he said, to tell someone they’re not really part of their tribe. At stake were not just his family’s jobs and homes and treaty rights but also who they were and where they belonged. “I’ll still be who I am, but I won’t have proof,” Rudy said. “I’ll be labeled a non-Indian. So yeah, I take this very personally.”

To Kelly, the words were an admission of guilt, implicating not just the St. Germains but also hundreds of tribal members to whom they were related. As chairman, he felt that he had a sacred duty: to protect the tribe from invasion by a group of people that, he would eventually argue, weren’t even Native Americans. “I’m in a war,” he told me later, sketching family trees on the back of a copy of the tribe’s constitution. “This is our culture, not a game.”

The St. Germains’ rejected application proved to be a turning point for the Nooksack. Separately, the family and the council began combing through Nooksack history, which, like that of many tribes in the United States, is complicated by government efforts to extinguish, assimilate and relocate the tribe, and by a dearth of historical documents. An international border drawn across historically Nooksack lands only adds to the confusion. There were some records and even some living memories of the ancestors whose Nooksack heritage was being called into doubt. But no one could agree on what the records meant.

In January 2013, Kelly announced that, after searching through files at the Bureau of Indian Affairs office in nearby Everett, he had reason to doubt the legitimacy of more than 300 enrolled Nooksacks related to the St. Germains, all of whom claimed to descend from a woman named Annie George, born in 1875. In February, he canceled the constitutionally required council meeting, saying it would be “improper” to convene when Rudy St. Germain and another council member, Rudy’s cousin Michelle Roberts, were not eligible to be part of the tribe they’d been elected to lead. A week later, he called an executive session of the council but demanded that St. Germain and Roberts remain outside while the rest of the council voted on whether to “initiate involuntary disenrollment” for them and 304 other Nooksacks, including 37 elders. The resolution passed unanimously. “It hurt me,” Terry St. Germain said later. Even harder was watching the effect on his brother, Rudy. “It took the wind right out of him.”

Two days after the meeting, the tribal council began sending out letters notifying affected members that unless they could provide proof of their legitimacy, they would be disenrolled in 30 days. Word and shock spread quickly through the small, tight-knit reservation. The disenrollees, now calling themselves “the Nooksack 306,” hired a lawyer and vowed to contest their expulsion. “I told ’em, ‘I know where I belong no matter what you say,’ ” an 80-year-old woman who, in her youth, had been punished for “speaking Indian” at school, said. “ ‘You can’t make me believe that I’m not.’ ”

The Nooksacks who want the 306 out of the tribe say they are standing up for their very identity, fighting for the integrity of a tribe taken over by outsiders. “We’re ready to die for this,” Kelly would later say. “And I think we will, before this is over.”

Outside the lands legally known as “Indian Country,” “membership” and “enrollment” are such blandly bureaucratic words that it’s easy to lose sight of how much they matter there. To the 566 federally recognized tribal nations, the ability to determine who is and isn’t part of a tribe is an essential element of what makes tribes sovereign entities. To individuals, membership means citizenship and all the emotional ties and treaty rights that come with it. To be disenrolled is to lose that citizenship: to become stateless. It can also mean the loss of a broader identity, because recognition by a tribe is the most accepted way to prove you are Indian — not just Nooksack but Native American at all.

Efforts to define Native American identity date from the earliest days of the colonies. Before the arrival of white settlers, tribal boundaries were generally fluid; intermarriages and alliances were common. But as the new government’s desire to expand into Indian Territory grew, so, too, did the interest in defining who was and who wasn’t a “real Indian.” Those definitions shifted as the colonial government’s goals did. “Mixed blood” Indians, for example, were added to rolls in hopes that assimilated Indians would be more likely to cede their land; later, after land claims were established, more restrictive definitions were adopted. In the 19th century, the government began relying heavily on blood quantum, or “degree of Indian blood,” wagering that, over generations of intermarriage, tribes would be diluted to the point that earlier treaties would not have to be honored. “ ‘As long as grass grows or water runs’ — a phrase that was often used in treaties with American Indians — is a relatively permanent term for a contract,” the Ojibwe author David Treuer wrote in a 2011 Op-Ed for The Times. “ ‘As long as the blood flows’ seemed measurably shorter.”

by Brooke Jarvis, NY Times |  Read more:
Image: Peter van Agtmael/Magnum, for The New York Times

The New Monarchy

‘He Has This Deep Fear That He Is Not a Legitimate President’

In the days immediately after the election that shocked the world, POLITICO Magazine convened the group of people who know Donald J. Trump better than anyone outside his family. We asked his biographers the questions that were on everyone’s mind: What happens next? Will the unabashedly self-promoting and self-obsessed businessman transform himself into a selfless and dignified president of the nation he was elected to lead?

Now, after more than two months of Trump’s norm-shattering transition, we gathered Gwenda Blair, Michael D’Antonio and Tim O’Brien by conference call (Wayne Barrett, the dean of Trump reporters, could not participate because of illness) to assess whether Trump has continued to surprise them. Their collective wisdom? In a word, no.

From his pick of nominees for posts in his cabinet to his belligerent use of Twitter (our conversation was a day before he traded barbs with Congressman John Lewis) to his unwillingness to cut ties with his business to avoid conflicts of interest, they see the same person they’ve always seen—the consummate classroom troublemaker; a vain, insecure bully; and an anti-institutional schemer, as adept at “gaming the system” as he is unashamed. As they look ahead to his inauguration speech in two days, and to his administration beyond, they feel confident predicting that he will run the country much as he has run his company. For himself.

“He’s not going to be that concerned with the actual competent administration of the government,” D’Antonio said. “It’s going to be what he seems to be gaining or losing in public esteem. So almost like a monarch. The figurehead who rallies people and gets credit for things.” (...)

Kruse: Michael, in your book, and other places, too, he has talked about how much he enjoys fighting. And he certainly fought a lot of people throughout the campaign, and he hasn’t stopped fighting. From Meryl Streep to the intelligence community, he’s still picking fights. Do you think he is going to pick fights with leaders of other countries? In other words, is there any indication that he would be able to separate the interests of the country now from his own personal pique?

Blair: Zero.

O’Brien: Absolutely not. There will be no divide there. The whole thing has been a vanity show from the second he ran to the Republican Convention. I think we can expect to see the same on Inauguration Day. He’s been unable to find a clean division between his own emotional needs and his own insecurities and simply being a healthy, strategically committed leader who wants to parse through good policy options and a wide series of public statements about the direction in which he’ll take the country.

Blair: There’s a fusion, I think, of his childhood, an emphasis on being combative, being killers—as his dad famously instructed his boys to be—but also, I think, his own competitive nature, and then his grasp in early adulthood that being a bully and really putting it to other people and not backing down often works. He also had his church background telling him that being a success was the most important thing and that got fused with the sort of ‘You want a crowd to show up, start a fight,’ P.T. Barnum-type thing early on in his career. And then Roy Cohn as a mentor, a guy who stood for cold-eye calculus about how bullying people works. And you put all of those pieces together, that he’s been doing this his whole life, and I don’t see a single reason for him to back down. He’s going to go full blast ahead with that.

O’Brien: His father and Roy Cohn, those are the two most singular influences on his whole life, and they provided him with a militarized, transactional view of human relationships, business dealings and the law. And he’s going to carry all of that stuff and all of that baggage with him into the White House.

D’Antonio: Those early influences are essential, and I also think it’s correct that he has been conducting his entire life as a vanity show, and he’s been rewarded, most recently since his reality TV show, by ever-greater public interest in him. This is a guy who is a president-elect who describes himself as a ratings machine, which is an absolutely absurd thing for a president to be reflecting on, but that matters to him.

But one thing I think that we have overlooked as we see Trump trying to delegitimize others is what I suspect is a feeling he has inside that nothing he’s ever achieved himself has ever been legitimate. This is a person who has never known whether anybody wants to be around him because he’s a person they want to be around or they want to be around his money. And since he’s promoted himself as this glamorous, incredibly wealthy person, that’s the draw he’s always given. So he doesn’t know if he has any legitimate relationships outside of his family, and that’s why he emphasizes family. … He’s always kind of gaming the system—not, in my view, winning on the merits. And even his election was with almost 3 million fewer votes than his opponent. So he has this deep fear that he is himself not a legitimate president, and I think that’s why he goes to such great lengths to delegitimize even the intelligence community, which is the president’s key resource in security, and he’s going to do this demeaning and delegitimizing behavior rather than accept what they have to tell him. (...)

D’Antonio: I think Donald Trump measures himself by the number of norms that he can violate. The more he can get away with, the more he can thumb his nose at convention, the more powerful he feels.

O’Brien: He’s a profoundly anti-institutional person, and I think that’s part of his great appeal to voters. Voters right now are sick of institutions, and he’s got no problem railing against them. I think the danger here is he’s completely ill-informed and lacks, I think, the generosity of public spirit to think about what the right replacements should be for the same institutions that he’s railing against.

by Michael Kruse, Politico |  Read more:

Wednesday, January 18, 2017

The Undeniable Facts About the Safety of Diet Coke

[ed. This post from the archives must have been forwarded somewhere, because it’s been quite popular lately. Unfortunately, the original link went dead. This one should work.]

I sat down at the table with friends, enjoying our get-together at the diner. The waitress took my order for a Diet Coke. She left. A friend spoke up.

“They say that Diet Coke increases your chance of getting diabetes by a factor of seven.”

“I heard people were getting seizures from the aspartame in it.”

“Today the news said a lady died after drinking 10 liters of Coke.”

“That’s nice. Enjoy your glass of city water filled with chemicals like fluoride,” I replied.

Are you kidding me?

Not much for alcohol. Never smoked. Don’t do drugs, and barely take aspirin. I exercise at the gym three times a week. I walk to work briskly every day, which comes to around 3/4 of a mile daily. When I get home, I try to avoid sitting and work at a standing desk. I go for walks when weather allows. I don’t eat much red meat at all, mainly poultry if any. I drink plenty of water, and often it is in the form of green, white, or herbal teas. I don’t drink coffee. In other words, I’m not health-obsessed, but I do alright.

My two vices?

An occasional Diet Coke as a treat a couple of times a week (and not even full cans!) and chocolate.

There are two important facts about life:
  • I am going to die.
  • You are going to die.
Let’s just be honest: people who point out the inadequacies in my eating and health regimen are merely quibbling over the bet they’re placing that I’ll die first. You’re telling me I’m killing myself and it’s my fault. You almost hint that I can take the blame for any physical ailment coming my way. I propose that cellular degeneration and the natural order of things might get some blame, and not just that Snickers I ate yesterday.

“Oh, but it’s a quality of life thing.”


The fact that I’m not fixating on the perfect purity of my food and not doing it to those around me means I have a pretty good quality of life.

When I eat a burger, I am thankful I have food, and that I don’t have to go out and gut the cow myself.

As I’m standing in the grocery store, I think of some of the poorest people in Nicaragua I’ve seen living and scrounging for food near the garbage dump. I get a bit upset at the arrogance that says the strawberries or apples or oranges stacked in heaping piles before me are “not good enough” because they are not organic.

I am repulsed by the idolatry that my body is so precious that I must find something more healthy and pure, that these non-organic fruits lack enough nutritional value for the little god that is me.

How does it work, that having a bountiful supply of food before me is seen as the enemy instead of a blessing?

Do I think I’m better than those people in poverty, so I deserve optimal “natural” food? Or, do I think that everyone deserves it, but because not everyone is in a place to access it, rice and corn mash are good enough for their kids but definitely not mine? When you donate food to the food pantry, do you donate the expensive organic carefully-sourced food that you insist is the only acceptable thing to put in your body and that you feed yourself and your family, or do you get the cheapest canned and boxed food at the store?

If your diet requires it, great. If you prefer it, fine. If you think it’s the only way to go, have at it. But don’t lecture me especially while we’re in the process of eating. I shouldn’t have to defend my digestive history. (...)

Maybe people ought to be more concerned about what they’re allowing in their head, rather than just their mouth. Shall I get after you for what you do and don’t read? Shall I lecture you on the shallow life of pursuing bodily health and not a robust mental existence?

Turn the TV off, unplug the internet, and shut out the voices convincing you that a world of unimaginable plenty isn’t good enough, isn’t healthy enough. Eat the food you have in moderation. The quality of my life, and my health, is fine. Someday it might not be. The same is true for you. Whether I drop over dead tomorrow or live to be 104, I’m not going to enjoy it any more by skipping the Diet Coke or excessive chocolate consumption. Keep your own guilt.

by Julie R. Neidlinger, Lone Prairie |  Read more:
Photo: uncredited

Wayne Thiebaud, Candy Counter, 1962