Thursday, September 19, 2013
From Cocaine to Fascism
In 1920, at the age of just twenty-seven, a young Italian named Dino Segre, writing under the pen name Pitigrilli, achieved notoriety with a book of short stories called Luxurious Breasts, followed the next year by the novel Cocaine and a second book of stories entitled The Chastity Belt.
Behind Italy’s official façade of bourgeois morality, traditional family life, and patriotism, Pitigrilli saw a world driven by sex, power and greed, in which adultery, illegitimate children, and hypocrisy were the order of the day and husbands and wives were little more than respectable-seeming pimps and prostitutes. Born in Turin, Segre himself had been the illegitimate child of a Jewish father—also named Dino Segre—and a young Catholic mother. (His father did not marry his mother until their child was eight years old.) In his work he delighted in turning conventional morality on its head, along with most of the Ten Commandments:
Never tell the truth. A lie is a weapon. I speak of useful, necessary lies. A useless lie is as unpleasant and odious as a useless homicide… (...)

In Cocaine, perhaps his most successful effort at a sustained narrative, Pitigrilli describes a world of cocaine dens, gambling parlors, orgies, lewd entertainment, and séances. His main character Tito Arnaudi is a failed medical student who has just been hired as a journalist in Paris, where he begins to investigate cocaine dens in order to write an article for a Paris newspaper appropriately named The Fleeting Moment. In the course of his research, he indulges in the white powder, which for a time acts as a kind of welcome balm, giving one “a sense not just of euphoria, but of boundless optimism and a special kind of receptivity to insults.”
The principal occupation of the characters of Cocaine is distracting themselves from the horrors of real life. In searching for any kind of thrill or stimulation, they resort to “the fashionable poisons of the moment, the wild exaltation they produce, the craze for ether and chloroform and the white Bolivian powder that produces hallucinations.” As Tito’s lover (or one of his lovers), Kalantan, tells him:
“There’s still hope for you…. You haven’t yet got to the stage of tremendous depression, of insuperable melancholy. Now you smile when you have the powder in your blood. You’re at the early stage in which you go back to childhood.”
She spoke to him as to a child, though they were both of the same age. Cocaine achieves the cruel miracle of distorting time.

Kalantan is a wealthy Armenian woman whom Tito meets on the cusp of widowhood. A drug addict as well, she keeps a black coffin in her bedroom for making love. She explains her curious habit thus:
It’s comfortable and delightful. When I die they’ll shut me up in it forever, and all the happiest memories of my life will be in it…. It also offers another advantage. When it’s over I’m left alone, all alone; it’s the man who has to go away. Afterwards I find the man disgusting. Forgive me for saying so, but afterwards men are always disgusting. Either they follow the satisfied male’s impulse and get up as quickly as they do from a dentist’s chair, or they stay close to me out of politeness or delicacy of feeling; and that revolts me, because there’s something in them that is no longer male.

At a certain point, Tito’s two principal drugs, cocaine and sex, fuse in the figure of Maud, the main female character; Pitigrilli begins to call her Cocaine, since Tito becomes equally addicted to both at the same time. Maud too is a kind of addict, distracting herself by having sex with a procession of men, in some cases for money and in others for pleasure. She makes no effort to hide her activities from Tito, who follows her to South America in hopes of having her entirely to himself. The affair with Maud follows the course that addiction to cocaine generally follows: leading from initial euphoria to increasing desperation and psychological collapse. When Tito finally does himself in, Maud and Tito’s best friend Pietro attend to him on his deathbed. Struck by Tito’s final despair, they vow to give up their lives of excess but soon fall into bed with one another, ending the novel on a note of Pitigrillian cynicism, in which despair is leavened by bitter laughter.
by Alexander Stille, NY Review of Books | Read more:
Image: Benito Mussolini; drawing by David Levine
Seeing Straight
Look around the room you are sitting in now. How many right angles can you see? Book-spines, the ceiling, picture frames, door panels, the capital T and L at the bottom of this page, this page itself. Now spare a thought for a young domestic servant working at a Christian mission in Malawi in the late nineteenth century, whose experience was recorded by Robert Laws in Women’s Work at Livingstonia (1886):
“In laying the table there is trouble for the girl. At home her house is round; a straight line and the right angle are unknown to her . . . . Day after day therefore she will lay the cloth with the folds anything but parallel with one edge of the table. Plates, knives and forks are set down in a confusing manner, and it is only after lessons often repeated and much annoyance that she begins to see how things might be done.”
Vision is a form of cognition: the kinds of things we see shape the ways we think. That is why it is so hard to imagine the visual experience of our prehistoric ancestors, or, for that matter, the girls of nineteenth-century Malawi, who lived in a world without right angles. Inhabitants of, say, late Neolithic Orkney would only have seen a handful of perpendicular lines a day: tools, shaped stones, perhaps some simple geometric decoration on a pot. For the most part, their world was curved: circular buildings, round tombs, stone circles, rounded clay vessels.
What does a round building mean? Does it mean anything, or is the choice of one shape of house over another simply a matter of practicalities? It is, for instance, easy to build extensions on to a rectangular building, since extra rooms can simply be added onto the sides or end; if the owners of an Iron Age roundhouse want a bigger living room, they have little choice but to knock the whole thing down and start again. Roundhouses are more storm- and wind-resistant, while parts of a rectangular house can more easily be partitioned or closed off, to provide privacy or a secure storage place. But this is obviously not the whole story. None of these practical arguments applies to a burial mound, which might as well take the form of a rectangular barrow as a round tumulus. So when we find that prehistoric Europeans who lived in roundhouses also tended to build circular wall circuits around their towns, to erect round tombs to their dead, and to worship their gods in circular temples or enclosures, it becomes clear – as Richard Bradley argues in his absorbing new book – that we are dealing not solely, or even primarily, with a practical choice, but with a particular way of seeing the world: an “Idea of Order”, as his title suggests.
Circles, unlike rectangles, are common in the natural world (fungi, the moon, the pupil of the human eye), and it is probably no coincidence that, with a few exceptions, prehistoric Europeans seem to have started off as circle-people. Roundhouses have traditionally been favoured by hunter-gatherers and pastoralist societies, while farmers prefer rectilinear structures (round cattle-byres, but square barns). Conversion to the right angle came at different points in different regions. In Britain, a long local tradition of roundhouses went into a steep decline after the Roman conquest, although, as Bradley notes, the inhabitants of Roman Britain and northern Gaul retained a most un-Roman preference for circular temples right down through the Roman period. The last part of Europe to retain a strong tradition of round buildings was Ireland, where circular earthworks (“raths”) and roundhouses remained the norm well into the early medieval period. Royal centres like Tara and Uisneach continued to be dominated by great circular and figure-of-eight enclosures. It was only with the Christianization of Ireland that the right angle finally triumphed here too: the early medieval island hermitage of Illaunloughan contained four traditional roundhouses but, ominously, a square Christian church and shrine, reflecting the shape of things to come.
by Peter Thonemann, TLS | Read more:
Image: Boscawen-Un Stone Circle, St Buryan, Cornwall. Robert Harding/Getty Images

A Jewel at the Heart of Quantum Physics
Physicists have discovered a jewel-like geometric object that dramatically simplifies calculations of particle interactions and challenges the notion that space and time are fundamental components of reality.
“This is completely new and very much simpler than anything that has been done before,” said Andrew Hodges, a mathematical physicist at Oxford University who has been following the work.
The revelation that particle interactions, the most basic events in nature, may be consequences of geometry significantly advances a decades-long effort to reformulate quantum field theory, the body of laws describing elementary particles and their interactions. Interactions that were previously calculated with mathematical formulas thousands of terms long can now be described by computing the volume of the corresponding jewel-like “amplituhedron,” which yields an equivalent one-term expression.
“The degree of efficiency is mind-boggling,” said Jacob Bourjaily, a theoretical physicist at Harvard University and one of the researchers who developed the new idea. “You can easily do, on paper, computations that were infeasible even with a computer before.” (...)
The amplituhedron looks like an intricate, multifaceted jewel in higher dimensions. Encoded in its volume are the most basic features of reality that can be calculated, “scattering amplitudes,” which represent the likelihood that a certain set of particles will turn into certain other particles upon colliding. These numbers are what particle physicists calculate and test to high precision at particle accelerators like the Large Hadron Collider in Switzerland.
The 60-year-old method for calculating scattering amplitudes — a major innovation at the time — was pioneered by the Nobel Prize-winning physicist Richard Feynman. He sketched line drawings of all the ways a scattering process could occur and then summed the likelihoods of the different drawings. The simplest Feynman diagrams look like trees: The particles involved in a collision come together like roots, and the particles that result shoot out like branches. More complicated diagrams have loops, where colliding particles turn into unobservable “virtual particles” that interact with each other before branching out as real final products. There are diagrams with one loop, two loops, three loops and so on — increasingly baroque iterations of the scattering process that contribute progressively less to its total amplitude. Virtual particles are never observed in nature, but they were considered mathematically necessary for unitarity — the requirement that probabilities sum to one.
“The number of Feynman diagrams is so explosively large that even computations of really simple processes weren’t done until the age of computers,” Bourjaily said. A seemingly simple event, such as two subatomic particles called gluons colliding to produce four less energetic gluons (which happens billions of times a second during collisions at the Large Hadron Collider), involves 220 diagrams, which collectively contribute thousands of terms to the calculation of the scattering amplitude.
In 1986, it became apparent that Feynman’s apparatus was a Rube Goldberg machine.
To prepare for the construction of the Superconducting Super Collider in Texas (a project that was later canceled), theorists wanted to calculate the scattering amplitudes of known particle interactions to establish a background against which interesting or exotic signals would stand out. But even 2-gluon to 4-gluon processes were so complex, a group of physicists had written two years earlier, “that they may not be evaluated in the foreseeable future.”
Stephen Parke and Tommy Taylor, theorists at Fermi National Accelerator Laboratory in Illinois, took that statement as a challenge. Using a few mathematical tricks, they managed to simplify the 2-gluon to 4-gluon amplitude calculation from several billion terms to a 9-page-long formula, which a 1980s supercomputer could handle. Then, based on a pattern they observed in the scattering amplitudes of other gluon interactions, Parke and Taylor guessed a simple one-term expression for the amplitude. It was, the computer verified, equivalent to the 9-page formula. In other words, the traditional machinery of quantum field theory, involving hundreds of Feynman diagrams worth thousands of mathematical terms, was obfuscating something much simpler. As Bourjaily put it: “Why are you summing up millions of things when the answer is just one function?”
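The one-term expression Parke and Taylor guessed is what is now commonly called the Parke–Taylor formula. As a sketch in modern spinor-helicity notation (suppressing the coupling constant and the overall momentum-conserving delta function), the tree-level amplitude for n gluons in which gluons i and j carry negative helicity and the rest positive is:

\[
A_n^{\text{tree}}\bigl(1^+,\dots,i^-,\dots,j^-,\dots,n^+\bigr)
\;=\;
\frac{\langle i\,j\rangle^{4}}{\langle 1\,2\rangle\,\langle 2\,3\rangle\cdots\langle n\,1\rangle}
\]

Here each angle bracket is a simple product of the particles’ spinor variables, so the entire multi-page calculation collapses into one ratio, which is what made the result so startling.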
“We knew at the time that we had an important result,” Parke said. “We knew it instantly. But what to do with it?”
by Natalie Wolchover, Quanta | Read more:
Image: Andy Gilmore
Wednesday, September 18, 2013
Boz Scaggs
[ed. Still one of my favorites -- including the killer track Loan Me A Dime with Duane Allman.]
Dating App Tinder Catches Fire
Miranda Levitt was gushing about her new guy. His name was Todd, she told a girlfriend one day this summer, and he was so great—a director, older, established. The 26-year-old New York actress kept enthusing until her friend, with a dawning sense of recognition, cut her off: What’s his name again? The same “great guy” had been asking her out for a week on Tinder.
“My first reaction is like, ‘What the f--- is Tinder?’ ” Levitt says. “So of course I downloaded it and proceeded to play on it like it was a video game for weeks.”
Tinder, as Levitt learned, is not a website. It’s a pathologically addictive flirting-dating-hookup app. The first step in using it is to sign in with your Facebook ID, which gives Tinder your name, age, photos, and sexual orientation. There is no second step. You’re immediately shown the face of a person of your preferred sex, and, again, there’s only one thing to do: Swipe right if you like what you see, swipe left if you don’t. Another face instantly appears for appraisal, and then another.
Tinder feels like a game until you remember that the people behind those faces are swiping you back. If, and only if, both parties like each other, a private chat box appears. You could conceivably have a conversation. You could make a date. Or you could simply meet for sex, minutes after Tinder’s algorithms matched your profiles. One year after launching, Tinder’s hordes have swipe-rated each other 13 billion times—3 billion in August alone—and 2 million matches happen each day. It’s the fastest-growing free dating app in the U.S.
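The match mechanic described above is, at bottom, a simple reciprocity check. A minimal sketch of the idea, with entirely hypothetical names (this is an illustration, not Tinder's actual internals):

```python
# A "match" is created only when both users have swiped right
# on each other; a single right-swipe is stored and stays silent.

likes = set()  # (swiper, swipee) pairs recorded so far

def swipe_right(swiper: str, swipee: str) -> bool:
    """Record a right-swipe; return True if it completes a mutual match."""
    likes.add((swiper, swipee))
    # The swipe becomes a match only if the reverse pair already exists.
    return (swipee, swiper) in likes

# A match fires only on the second, reciprocal swipe.
assert swipe_right("alice", "bob") is False  # Bob hasn't liked Alice yet
assert swipe_right("bob", "alice") is True   # now both have liked each other
```

The asymmetry is the point: because a lone swipe is invisible, rejection costs nothing, which is much of what makes the app feel like a game.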
The average Tinderer checks the app 11 times per day, seven minutes at a time. The company says it knows of 50 marriage proposals to date. Levitt cannot escape it. “Last night I was out with a friend,” she says. At the bar there was a guy, and things were going well. “I go to the bathroom, and when I come back I look over at his phone and Tinder is up! I was like, ‘Are you kidding?!’ And he was like, ‘No, I mean, someone matched me, and I’m checking it!’ I was like, ‘OK, dude.’ ”
Levitt makes an exasperated noise. “It’s being integrated into my life as a twentysomething a lot more than I thought it would be,” she says.
Like the monster in Alien, Tinder may be a perfectly evolved organism, a predator for your attention built on the DNA of its social networking predecessors. The faces you see on Tinder seem real because they’re tied to Facebook accounts, the gold standard of authenticity. Tinder takes the gay app Grindr’s location function, which pinpoints eager men down to the foot, and tames it for a female audience, rounding distance to the nearest mile. You can chat with Tinder matches, but you can’t send photos or video, so the app avoids Chatroulette’s fate of being overrun by aspiring Anthony Weiners.
What makes Tinder truly killer, though, is that it was designed exclusively for smartphones and the hypersocial millennials who wield them. Although online dating has long since lost its stigma, OkCupid and EHarmony remain sites you browse alone at home, with a fortifying glass of wine and a spreadsheet to track interactions. Tinder is an app you pull up at a bar with friends, passing the iPhone around.
by Nick Summers, Bloomberg Businessweek | Read more:
Image: Gallery Stock
Overpopulation Is Not the Problem
Many scientists believe that by transforming the earth’s natural landscapes, we are undermining the very life support systems that sustain us. Like bacteria in a petri dish, our exploding numbers are reaching the limits of a finite planet, with dire consequences. Disaster looms as humans exceed the earth’s natural carrying capacity. Clearly, this could not be sustainable.
This is nonsense. Even today, I hear some of my scientific colleagues repeat these and similar claims — often unchallenged. And once, I too believed them. Yet these claims demonstrate a profound misunderstanding of the ecology of human systems. The conditions that sustain humanity are not natural and never have been. Since prehistory, human populations have used technologies and engineered ecosystems to sustain populations well beyond the capabilities of unaltered “natural” ecosystems.
The evidence from archaeology is clear. Our predecessors in the genus Homo used social hunting strategies and tools of stone and fire to extract more sustenance from landscapes than would otherwise be possible. And, of course, Homo sapiens went much further, learning over generations, once their preferred big game became rare or extinct, to make use of a far broader spectrum of species. They did this by extracting more nutrients from these species by cooking and grinding them, by propagating the most useful species and by burning woodlands to enhance hunting and foraging success.
Even before the last ice age had ended, thousands of years before agriculture, hunter-gatherer societies were well established across the earth and depended increasingly on sophisticated technological strategies to sustain growing populations in landscapes long ago transformed by their ancestors.
The planet’s carrying capacity for prehistoric human hunter-gatherers was probably no more than 100 million. But without their Paleolithic technologies and ways of life, the number would be far less — perhaps a few tens of millions. The rise of agriculture enabled even greater population growth requiring ever more intensive land-use practices to gain more sustenance from the same old land. At their peak, those agricultural systems might have sustained as many as three billion people in poverty on near-vegetarian diets.
The world population is now estimated at 7.2 billion. But with current industrial technologies, the Food and Agriculture Organization of the United Nations has estimated that the more than nine billion people expected by 2050 as the population nears its peak could be supported as long as necessary investments in infrastructure and conducive trade, anti-poverty and food security policies are in place. Who knows what will be possible with the technologies of the future? The important message from these rough numbers should be clear. There really is no such thing as a human carrying capacity. We are nothing at all like bacteria in a petri dish.
Why is it that highly trained natural scientists don’t understand this? My experience is likely to be illustrative. Trained as a biologist, I learned the classic mathematics of population growth — that populations must have their limits and must ultimately reach a balance with their environments. Not to think so would be to misunderstand physics: there is only one earth, of course!
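The classic mathematics referred to here is essentially the logistic model, in which a population of size N grows at an intrinsic rate r but is capped by a fixed carrying capacity K:

\[
\frac{dN}{dt} \;=\; rN\left(1 - \frac{N}{K}\right)
\]

Growth slows as N approaches K and halts at N = K. Ellis’s argument is precisely that for humans K is not a fixed environmental constant: technology and social organization keep redefining it.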
It was only after years of research into the ecology of agriculture in China that I reached the point where my observations forced me to see beyond my biologist’s blinders. Unable to explain how populations grew for millenniums while increasing the productivity of the same land, I discovered the agricultural economist Ester Boserup, the antidote to the demographer and economist Thomas Malthus and his theory that population growth tends to outrun the food supply. Her theories of population growth as a driver of land productivity explained the data I was gathering in ways that Malthus could never do. While remaining an ecologist, I became a fellow traveler with those who directly study long-term human-environment relationships — archaeologists, geographers, environmental historians and agricultural economists.
The science of human sustenance is inherently a social science. Neither physics nor chemistry nor even biology is adequate to understand how it has been possible for one species to reshape both its own future and the destiny of an entire planet. This is the science of the Anthropocene. The idea that humans must live within the natural environmental limits of our planet denies the realities of our entire history, and most likely the future. Humans are niche creators. We transform ecosystems to sustain ourselves. This is what we do and have always done. Our planet’s human-carrying capacity emerges from the capabilities of our social systems and our technologies more than from any environmental limits.
Two hundred thousand years ago we started down this path. The planet will never be the same. It is time for all of us to wake up to the limits we really face: the social and technological systems that sustain us need improvement.
by Erle C. Ellis, NY Times | Read more:
Image: Katherine Streeter
Foolproof Pan Pizza
I've got a confession to make: I love pan pizza.
I'm not talking deep-dish Chicago-style with its crisp crust and rivers of cheese and sauce, I'm talking thick-crusted, fried-on-the-bottom, puffy, cheesy, focaccia-esque pan pizza of the kind that you might remember Pizza Hut having when you were a kid, though in reality, most likely that pizza never really existed—as they say, pizzas past always look better through pepperoni-tinted glasses.
It would arrive at the table in a jet black, well-worn pan, its edges browned and crisped where the cheese had melted into the gap between the crust and the pan. You'd lift up a slice and long threads of mozzarella would pull out, stretching all the way across the table, a signpost saying "hey everyone, it's this kid's birthday!" You'd reach out your fingers—almost involuntarily—grasping at those cheese strings, plucking at them like guitar strings, wrapping them around your fingers so you could suck them off before diving into the slice itself.
That perfect pan pizza had an open, airy, chewy crumb in the center that slowly transformed into a crisp, golden-brown, fried crust at the very bottom and a soft, thin, doughy layer at the top right at the crust-sauce interface. It was thick and robust enough to support a heavy load of toppings, though even a plain cheese or pepperoni slice would do.
It's been years since I've gone to an actual Pizza Hut (they don't even exist in New York aside from those crappy "Pizza Hut Express" joints with the pre-fab, lukewarm individual pizzas), but I've spent a good deal of time working on my own pan pizza recipe to the point that it finally lives up to that perfect image of my childhood pan pizza that still lives on in my mind.
If only pizza that good were also easy to make. Well here's the good news: It is. This is the easiest pizza you will ever make. Seriously. All it takes is a few basic kitchen essentials, some simple ingredients, and a bit of patience.
The way I see it, there are three basic difficulties most folks have with pizza:
- Problem 1: Kneading. How long is enough? What motion do I use? And is it really worth the doggone effort?
- Problem 2: Stretching. Once I've got that disk of dough, how do I get it into the shape of an actual pizza, ready to be topped?
- Problem 3: Transferring. Ok, let's say I've got my dough made and perfectly stretched onto my pizza peel. How do I get it onto that stone in the oven without disturbing the toppings or having it turn into a misshapen blob?
You can jump straight into a full step-by-step slideshow of the process or find the exact measurements and instructions in the recipe here, or read on for a few more details on what to expect and how we got there.
By now, everybody and their baker's heard about no-knead dough. It's a technique that was developed by Jim Lahey of Sullivan Street Bakery and popularized by Mark Bittman of the New York Times. The basic premise is simple: mix together your dough ingredients in a bowl just until they're combined, cover it, and let time take care of the rest. That's it.
So how does it work? Well the goal of kneading in a traditional dough is to create gluten, a web-like network of interconnected proteins that forms when flour is mixed together with water. All wheat flour contains some amount of protein (usually around 10 to 15%, depending on the variety of wheat). In their normal state, these proteins resemble tiny crumpled up little balls of wire. With kneading, your goal is to first work these proteins until they untangle a bit, then to rub them against each other until they link up, forming a solid chain-link fence.
It's this gluten matrix that allows your dough to be stretched without breaking, and what allows it to hold nice big air bubbles inside. Ever have a dense under-risen pizza crust? It's because whoever made it didn't properly form their gluten in the process.
Now you can see how this can take a lot of work. Kneading, aligning, folding, linking. That's why most pizza dough recipes take a good ten to twenty minutes of elbow grease or time in a stand mixer.
But there's another way.
See, flour naturally contains enzymes that will break down large proteins into smaller ones. Imagine them as teeny-tiny wire cutters that snip those jumbled-up balls of wire into shorter pieces. The shorter the pieces are, the easier it is to untangle them, and the easier it is to then align them and link them up into a good, strong network. No-knead dough recipes take advantage of this fact.
Over the course of an overnight sit at room temperature, those enzymes get to work breaking down proteins. Meanwhile, yeast starts to consume sugars in the flour, releasing carbon dioxide gas in the process. These bubbles of gas will cause the dough to start stretching, and in the process, will jostle and align the enzyme-primed proteins, thereby creating gluten.
Simply allowing the dough to sit overnight will create a gluten network at least as strong (if not stronger!) than a dough that had been kneaded in a mixer or by hand, all with pretty much zero effort. Indeed, the flavor produced by letting yeast do its thing over the course of this night will also be superior to that of any same-day dough. Win-win!
Other than time, the only real key to a successful no-knead dough is high hydration. Specifically, the water content should be at least 60% of the weight of the flour you use. Luckily, high hydration also leads to superior hole structure upon baking. I go for about 65%.
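Baker's percentages like that 65% figure are measured against flour weight, not total dough weight. A quick sketch of the arithmetic (the 500 g of flour is just an example, not a quantity from the recipe):

```python
# Hydration in baker's percentages: water weight expressed as a
# fraction of flour weight. The flour amount below is arbitrary.

def water_for_hydration(flour_grams, hydration=0.65):
    """Return grams of water needed to hit the target hydration."""
    return flour_grams * hydration

print(water_for_hydration(500))        # 325.0 -> 325 g water per 500 g flour
print(water_for_hydration(500, 0.60))  # 300.0 -> the 60% floor for no-knead
```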
by J. Kenji López-Alt, Serious Eats | Read more:
Image: J. Kenji López-Alt
From Mars
This spring, Jenny Hollander, a twenty-three-year-old Columbia Journalism School student, sent out her résumé for summer internships. “Where didn’t I apply?” Hollander, who is from the U.K., said recently. “BuzzFeed, Mashable, the Fiscal Times, a lot of very small county papers all over the U.S.; California Watch, which is an investigative thing in California; the L.A. Times; the Huffington Post—twice.” She was either rejected or ignored by all of them. Then she came across a notice, on a Columbia Listserv, for a “writing internship” at an unnamed startup. The job paid fifty dollars a day. “It was all a little bit cloak-and-dagger,” Hollander said. She knew nothing about the company, but she applied anyway, and was delighted when she was hired.
On her first day of work, instead of going to an office, Hollander arrived at a newly renovated four-story town house in Williamsburg, Brooklyn. It had two kitchens, two living rooms, and a roof deck—all decorated in a funky flea-market style. The house was the headquarters of Bustle, a new online publication for women. There were four editors in their mid-twenties, and a gaggle of interns—college students or recent graduates, all women—sat around, typing on MacBooks. Many students have summer jobs that involve little more than fetching coffee and maintaining Twitter feeds, so Hollander was surprised when she was told to take out her laptop and start writing blog posts. “I called my housemate and was, like, ‘So I’m doing this job, and all I’m doing is sitting on sofas in this gorgeous house with a bunch of other girls, and we’re all writing together!’ ”
If you go to Bustle.com, you will find a sleekly designed Web site, with headlines that read like the result of a one-night stand between Us Weekly and U.S. News & World Report. Its loosely female-oriented articles cover topics ranging from evergreen style tips (“Eight Modern Ways to Wear a Hair Scarf”) to celebrity gossip (“Why We’re Concerned for Simon Cowell’s Unborn Son”), with a prominent dash of hard news (the top stories last week were about Syria). To a large degree, the articles consist of aggregation: a Bustle writer finds a piece of news that interests her—from the Times, or from a blog she likes—and summarizes it for Bustle’s readers, perhaps making its contents into a list, or collecting some related tweets. Bustle’s house style—to the extent that one exists—is brisk and easily digestible, if a little thin. Soon after she started writing for Bustle, Hollander developed a recurring feature called “This Week in Studies,” in which she recaps the results of scientific research, in slide-show form: “A stunning new study reveals that a quarter of people regret something that we posted on social media at some point: a drunk Tweet, a melancholy Facebook post. . . . Seriously: only a quarter regret these things?”
Bustle’s articles are modest, but the ambitions of its founder, a young Silicon Valley entrepreneur named Bryan Goldberg, are not. When I first spoke to him, early in the summer, he referred to Bustle as “the next great women’s publication.” He was in the process of raising an unusually large amount of pre-launch money—$6.5 million—from investors such as Time Warner Investments and 500 Startups. In six years, Goldberg told me, he hopes that Bustle will attract fifty million visitors each month and earn more than a hundred million dollars a year in advertising revenue, making it the “biggest and the most powerful women’s publication in the world.”
Goldberg, who is thirty, is not a traditional publisher: he speaks more admiringly of Elon Musk than of any Pulitzer Prize-winner. But he is not all bluff. Six years ago, at the age of twenty-four, he and a few friends started Bleacher Report, a sports Web site that, in 2012, they sold to Turner Broadcasting for more than two hundred million dollars. Bleacher Report’s success was a striking example of the new economics of media: when it began, its articles were written by a network of two thousand unpaid sports fans (critics have described the site as an example of “loser-generated content”), yet today it attracts twenty-two million unique visitors each month, putting it behind only Yahoo U.S. Sports and ESPN.com among non-league sports Web sites. Bleacher Report’s high traffic and low production costs have made it extremely profitable. Soon after acquiring Bleacher Report, Turner made it the source of sports news at CNN.com, where it replaced Sports Illustrated. This changing of the guard was a reminder of how quickly, in the Internet age, a cost-effective business plan can overtake one built on a reputation for quality. Goldberg points out that Bleacher Report is now likely worth more than the two hundred and fifty million dollars that Jeff Bezos recently paid for the Washington Post.
by Lizzie Widdicombe, New Yorker | Read more:
Image: Pari Dukovic
Sea Change
[ed. If you read anything this week, read this.]
Imagine every person on Earth tossing a hunk of CO2 as heavy as a bowling ball into the sea. That’s what we do to the oceans every day.
Burning fossil fuels, such as coal, oil and natural gas, belches carbon dioxide into the air. But a quarter of that CO2 then gets absorbed by the seas — eight pounds per person per day, about 20 trillion pounds a year.
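The two figures are consistent with each other; here is a back-of-the-envelope check (my arithmetic, using the 7.2 billion world population quoted later in the piece):

```python
# Sanity check: 8 lb of absorbed CO2 per person per day, times the
# world population, times 365 days, should land near the article's
# 20 trillion lb/year figure.

population = 7.2e9            # people (figure quoted in the article)
lbs_per_person_per_day = 8.0  # the "bowling ball" of CO2

annual_lbs = population * lbs_per_person_per_day * 365
print(f"{annual_lbs / 1e12:.1f} trillion lb/year")  # 21.0 trillion lb/year
```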
Scientists once considered that entirely good news, since it removed CO2 from the sky. Some even proposed piping more emissions to the sea.
But all that CO2 is changing the chemistry of the ocean faster than at any time in human history. Now the phenomenon known as ocean acidification — the lesser-known twin of climate change — is helping push the seas toward a great unraveling that threatens to scramble marine life on a scale almost too big to fathom, and far faster than first expected.
Here’s why: When CO2 mixes with water it takes on a corrosive power that erodes some animals’ shells or skeletons. It lowers the pH, making oceans more acidic and sour, and robs the water of ingredients animals use to grow shells in the first place.
Acidification wasn’t supposed to start doing its damage until much later this century.
Instead, changing sea chemistry already has killed billions of oysters along the Washington coast and at a hatchery that draws water from Hood Canal. It’s helping destroy mussels on some Northwest shores. It is a suspect in the softening of clam shells and in the death of baby scallops. It is dissolving a tiny plankton species eaten by many ocean creatures, from auklets and puffins to fish and whales — and that had not been expected for another 25 years.
And this is just the beginning.
by Craig Welch, Seattle Times | Read more:
Image: Steve Ringman
Anglo American Withdraws from Pebble Mine
[ed. Sounds like the end might be near, but big projects like this never really seem to die (in Alaska, anyway); they just cycle through a couple generations (or less) and reappear in new packaging. For additional background see: Gold Fish]
Anglo American, one of the key backers of the controversial Pebble mine in Alaska's Bristol Bay region, announced Monday that it is withdrawing from the Pebble Partnership -- and will take a $300 million hit for doing so. The London-based Anglo American has a 50 percent share of the Pebble venture, with Northern Dynasty Minerals out of Vancouver, Canada, controlling the other half. The company said that Northern Dynasty will assume sole responsibility for the project.
In a statement, Anglo American CEO Mark Cutifani said that the company was seeking other investment opportunities.
"Despite our belief that Pebble is a deposit of rare magnitude and quality, we have taken the decision to withdraw following a thorough assessment of Anglo American’s extensive pipeline of long-dated project options," Cutifani said. "Our focus has been to prioritize capital to projects with the highest value and lowest risks within our portfolio, and reduce the capital required to sustain such projects during the pre-approval phases of development as part of a more effective, value-driven capital allocation model."
John Shively, CEO of the Pebble Partnership, insisted that reports of Pebble's death are premature. “Obviously we’re disappointed, but we still have a great project,” he said. “Anglo American was reviewing all of their assets. When they got to us, we didn’t make the cut,” he said.
Shively, who learned of the pullout this weekend in phone calls from the owner companies, said he expects that Northern Dynasty will decide in the next two or three weeks what its next steps should be. He said the “partnership has to be unraveled,” and Northern Dynasty has to consider its options.
Pebble has received intense scrutiny during the exploratory phase of the project. Critics say the mine's proposed location could present a risk to the Bristol Bay watershed and salmon fishery, one of the most lucrative fisheries in the world. Supporters have accused the Environmental Protection Agency of playing politics with the project after the EPA released an assessment of the potential impacts of a large open-pit mine on Bristol Bay fisheries last year. That report said that even barring a major mishap, damage to salmon runs was a likely side effect of mine development.
Meanwhile, the Pebble Mine prospect is also a high-value proposition: Northern Dynasty estimates that the proposed mining area could contain as much as 81 billion pounds of copper, 5.6 billion pounds of molybdenum and 107 million ounces of gold. Estimates have put the value of the resources at up to $300 billion.
by Ben Anderson, Alaska Dispatch | Read more:
Image: EPA