Wednesday, January 20, 2016
The Political Scientist Who Debunked Mainstream Economics
“Picture a pasture open to all.”
For at least a generation, the very idea of the commons has been marginalized and dismissed as a misguided way to manage resources: the so-called tragedy of the commons. In a short but influential essay published in Science in 1968, ecologist Garrett Hardin gave the story a fresh formulation and a memorable tagline.
“The tragedy of the commons develops in this way,” wrote Hardin, proposing to his readers that they envision an open pasture:
It is to be expected that each herdsman will try to keep as many cattle as possible in the commons. Such an arrangement may work reasonably satisfactorily for centuries because tribal wars, poaching and disease keep the numbers of both man and beast well below the carrying capacity of the land. Finally, however, comes the day of reckoning, that is, the day when the long-desired goal of social stability becomes a reality. At this point, the inherent logic of the commons remorselessly generates tragedy. As a rational being, each herdsman seeks to maximize his gain. Explicitly or implicitly, more or less consciously, he asks, “What is the utility to me of adding one more animal to my herd?”
The rational herdsman concludes that the only sensible course for him to pursue is to add another animal to his herd. And another…. But this is the conclusion reached by each and every rational herdsman sharing a commons. Therein is the tragedy. Each man is locked into a system that compels him to increase his herd without limit—in a world that is limited. Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons. Freedom in a commons brings ruin to all.
The tragedy of the commons is one of those basic concepts that is drilled into the minds of every undergraduate, at least in economics courses. The idea is considered a basic principle of economics—a cautionary lesson about the impossibility of collective action. Once the class has been escorted through a ritual shudder, the professor whisks them along to the main attraction, the virtues of private property and free markets. Here, finally, economists reveal, we may surmount the dismal tragedy of a commons. The catechism is hammered home: individual freedom to own and trade private property in open markets is the only way to produce enduring personal satisfaction and social prosperity.
Hardin explains the logic this way: we can overcome the tragedy of the commons through a system of “mutual coercion, mutually agreed upon by the majority of the people affected.” For him, the best approach is “the institution of private property coupled with legal inheritance.” He concedes that this is not a perfectly just alternative, but he asserts that Darwinian natural selection is ultimately the best available option, saying, “those who are biologically more fit to be the custodians of property and power should legally inherit more.” We put up with this imperfect legal order, he adds, “because we are not convinced, at the moment, that anyone has invented a better system. The alternative of the commons is too horrifying to contemplate. Injustice is preferable to total ruin.”
Such musings by a libertarian-minded scientist have been catnip to conservative ideologues and economists (who are so often one and the same). They see Hardin’s essay as a gospel parable that affirms some core principles of neoliberal economic ideology. It affirms the importance of “free markets” and justifies the property rights of the wealthy. It bolsters a commitment to individual rights and private property as the cornerstone of economic thought and policy. People will supposedly have the motivation to take responsibility for resources if they are guaranteed private ownership and access to free markets. Tragic outcomes—“total ruin”—can thereby be avoided. The failure of the commons, in this telling, is conflated with government itself, if only to suggest that one of the few recognized vehicles for advancing collective interests, government, will also succumb to the “tragedy” paradigm. (That is the gist of Public Choice theory, which applies standard economic logic to problems in political science.)
Over the past several decades, the tragedy of the commons has taken root as an economic truism. The Hardin essay has become a staple of undergraduate education in the US, taught not just in economics courses but in political science, sociology and other fields. It is no wonder that so many people consider the commons with such glib condescension. The commons = chaos, ruin and failure.
There is just one significant flaw in the tragedy parable. It does not accurately describe a commons. Hardin’s fictional scenario sets forth a system that has no boundaries around the pasture, no rules for managing it, no punishments for over-use and no distinct community of users. But that is not a commons. It is an open-access regime, or a free-for-all. A commons has boundaries, rules, social norms and sanctions against free riders. A commons requires that there be a community willing to act as a conscientious steward of a resource. Hardin was confusing a commons with “no-man’s-land”—and in the process, he smeared the commons as a failed paradigm for managing resources.
To be fair, Hardin was following a long line of polemicists who projected their unexamined commitments to market individualism onto the world. As we will see later, the theories of philosopher John Locke have been widely used to justify treating the New World as terra nullius—open, unowned land—even though it was populated by millions of Native Americans who managed their natural resources as beloved commons with unwritten but highly sophisticated rules.
Hardin’s essay was inspired by his reading of an 1832 talk by William Forster Lloyd, an English lecturer who, like Hardin, was worried about overpopulation in a period of intense enclosures of land. Lloyd’s talk is notable because it rehearses the same line of argument and makes the same fanciful error—that people are incapable of negotiating a solution to the “tragedy.” Instead of a shared pasture, Lloyd’s metaphor was a joint pool of money that could be accessed by every contributor. Lloyd asserted that each individual would quickly deplete more than his share of the pool while a private purse of money would be frugally managed.
I mention Lloyd’s essay to illustrate how ridiculous yet persistent the misconceptions about the “tragedy” dynamic truly are. Commons scholar Lewis Hyde dryly notes, “Just as Hardin proposes a herdsman whose reason is unable to encompass the common good, so Lloyd supposes persons who have no way to speak with each other or make joint decisions. Both writers inject laissez-faire individualism into an old agrarian village and then gravely announce that the commons is dead. From the point of view of such a village, Lloyd’s assumptions are as crazy as asking us to ‘suppose a man to have a purse to which his left and right hand may freely resort, each unaware of the other’.” (...)
Paradoxically enough, the heedless quest for selfish gain— “rationally” pursued, of course, yet indifferent toward the collective good—is a better description of the conventional market economy than a commons. In the run-up to the 2008 financial crisis, such a mindset propelled the wizards of Wall Street to maximize private gains without regard for the systemic risks or local impacts. The real tragedy precipitated by “rational” individualism is not the tragedy of the commons, but the tragedy of the market.
by David Bollier, Evonomics | Read more:
Image: via:
Thursday, January 14, 2016
[ed. Sorry, have to take another short break. Enjoy the archives and see you next week.]
[ed. Well, that took a little longer than expected but nice to have an internet break for a week.]
Wednesday, January 13, 2016
Meat Market
Chef-turned media personality Anthony Bourdain has made a career of bringing far-flung food culture onto our most closely held screens. He has built his brand by articulating the anti–Olive Garden for viewers anxious about the authenticity of their culinary practices. The hidden treasures he reveals on his shows—the best Vietnamese street vendor’s pho or delicate Colombian arepas—are difficult to access, the menus of the restaurants to which he treks look intimidating to his anglophone audience, and the food itself often doesn’t even seem appetizing—all the better to foment his brand’s air of exclusivity, always the handmaiden to authenticity.
Now, as Stephen Werther, Bourdain’s business partner, told the New York Times, “people want Tony’s show to come to life.” Never mind the fact that Bourdain does go to real places with live people in his show. What Werther is describing as “coming to life” is not any single thing from a Bourdain show but the relationship between subjects of the show and Bourdain’s seal of approval that holds it all together. Bourdain Market, set to open in about two years on Pier 57 in Manhattan’s Meatpacking District, purports to deliver exclusivity and democracy at the same time by putting remarkable food vendors all under one roof, thus consolidating all the hard work of curation and discovery and saving consumers from having to do any of it.
Up until his decision to open a market, Bourdain’s entire business had been capturing exotic dining experiences for television. Bourdain Market, like the World’s Fairs of the 19th and 20th centuries, will invert this business model by bringing people from around the world to a humongous food court so that they may “do” culture. It will provide what Bourdain calls a “democratic space open to and used by all,” a place where “wealthy and working class alike” can congregate in what promises to be the largest food hall in the city. Patrons will munch on prepared foods from both world-renowned and obscure restaurateurs at common tables and select the finest meats from butchers and fishmongers. “Think of an Asian night market,” Bourdain tells New York Eater, as if that is a stable and widely understood reference for Americans, before clarifying that it means “eating and drinking at midnight”—something that could just as easily be said about a TGI Fridays. It will be a place that is “transparent and authentic”—unlike, presumably, the nearby Chelsea Market, once a public market by and for New Yorkers, now mainly a tourist destination. (...)
Authenticity is, for marketers and some cultural commentators, what objectivity is for scientists. It masquerades as an absolute, ascertainable quality inherent in situations when in fact it is a function of many contingencies, including subject position, social structure, historical happenstance, economic forces, and cultural norms. While objectivity relies on the expertise and training of scientists who follow certain procedures, authenticity is a product of cultural expertise with its own set of semi-arbitrary rules. Cultural experts are ordained with the power of finding and selling authenticity on the assumption that it exists somewhere, outside the self, and with the right training it can be discovered.
Just as adherence to objectivity is a necessary prerequisite for scientific “truth,” authenticity can seem to anchor taste judgments in some pure transcendent realm beyond the influence of social strategy or economic expediency. Though the aura of authenticity may seem like a matter of the aggressively unique thing in its “real” place, as when Bourdain boasts of tasting exotic foods that “you can’t find anywhere else in the world,” it is actually created in the space between the consumer and the consumed. For Walter Benjamin, aura is born of our desire to bring things closer, to experience the original outside the bounds of technological reproducibility. This desired closeness is two-fold: spatial and emotional, measured in distance and human connection. (...)
Pier 57, the future home of Bourdain Market, is a strange place to anchor a multimillion-dollar argument for the absolute existence of authenticity. The market will be connected to the High Line, a park built on the raised railroad tracks that once carried freight around the docks and shipping piers of the Meatpacking District. The High Line represents a new kind of fun complex that preserves past industrial history as a quaint tourist destination, in which the pieces of decommissioned track complement the native flora. Bourdain himself, in his latest show The Layover, gives the High Line a minute-long commercial where he calls it “distinctly strange and beautiful,” but he says it with none of the passion and romance he reserves for a well-constructed hot dog. The High Line, through plaques and tour guides, informs visitors that what they are seeing is simultaneously a conscious selection of flora that was endemic to Manhattan Island prior to urbanization and the nostalgic preservation of an industrial infrastructure prior to New York’s latest wave of gentrification. Both of these combine in a mise-en-scène of New York City through different scales of time. There is even a small amphitheater suspended above the street, so that visitors can stare at unfolding city life as if it were theater.
This is all antithetical to Bourdainian authenticity, which he frames as a matter of direct accessibility and individuated distinction. In his shows Bourdain has nothing but disdain for the carefully posed and self-consciously displayed. Everything that, for him, is contained in “the hipster” is a profane act of showmanship, not craftsmanship. To have a truly authentic experience one must identify something as authentic and then take the leap of faith to literally consume it. You put your trust in a local with whom you can imagine you have some sort of noncontractual relationship. Those relations are authentic; tour guides are irredeemable.
But what is Bourdain to his shows’ audiences and the patrons of his future food market, if not their contractually hired tour guide? If Bourdain Market is supposed to make the content of his show—the authenticity of hard-to-access food—“come alive,” then what it will sell is more about the proximity between products (I get to sample elk meat right before finding out what a papaya tastes like) than the food itself. Yet if this is the case, then anything in Bourdain Market must lose a portion of its aura, as papaya and elk are not endemic to the same region, let alone to the hinterlands of Manhattan. The authenticity of any particular product is negated in favor of sustaining the authority of Bourdain as judge, jury, and executioner of authenticity.
By making it physically possible to access foods from around the world, Bourdain Market will let you choose the scenarios for your own food-centered reality TV show. And just like a reality TV show, Bourdain Market will run roughshod over particulars in its restaging of the real. Nowhere is this more obvious than in the New York Times article that announced the project, a short writeup that required three corrections, including one for the artist’s rendering of the future market that contained fake Chinese characters.
As Benjamin and Baudrillard warn, it is impossible to consciously create an authentic experience. The friction between Chelsea Market, the High Line, the conceit of Bourdain’s own shows, and his new market reveals the hypocrisy of the entire project: Bourdain Market is as authentic, transparent, democratic, and open as basic cable TV.
by David A. Banks and Britney Summit-Gill, The New Inquiry | Read more:
Image: uncredited
Addressing 4 billion People In Three Words
Last week in New York, at the Next Billion conference organized by Quartz, Chris Sheldrick, the CEO of What3Words, captured his audience with strong arguments: 75 percent of the earth’s population, i.e. four billion people, “don’t exist” because they have no physical address. This cohort of “unaddressed” can’t open a bank account, can’t deal properly with a hospital or a government administration, let alone get a delivery. This is a major impediment to global development.
Governments, the World Bank and various NGOs have poured millions of dollars into addressing programs. A country like Ghana has tried four times without success. In Brazil, this portion of Rio de Janeiro, with its sparse network of roads and streets, looks like empty land:
Adding the satellite layer, you discover this:
This is one of the largest slums in the world, the Rocinha favela: 355 acres (143 hectares) of intertwined sheds hosting 70,000 people. Translated into density, this amounts to a staggering 120,000 people per square mile (48,000 per km2). Go figure how to deliver a package there, or simply how to provide the most basic administrative assistance such as monitoring health or education.
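Those density figures check out. A quick back-of-the-envelope conversion, using nothing but standard unit constants (the article's own rounding is a bit generous on the per-square-mile number):

```python
# Back-of-the-envelope check of the Rocinha density figures.
ACRES_PER_HECTARE = 2.47105
ACRES_PER_SQ_MILE = 640

people = 70_000
acres = 355

hectares = acres / ACRES_PER_HECTARE   # ≈ 143.7 ha, matching "(143 hectares)"
sq_km = hectares / 100                 # 1 km² = 100 ha
sq_miles = acres / ACRES_PER_SQ_MILE

print(f"{people / sq_miles:,.0f} per sq mile")  # ≈ 126,000 (article rounds to 120,000)
print(f"{people / sq_km:,.0f} per km²")         # ≈ 48,700 (article: 48,000)
```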
The developing world is not the only one to suffer from poor addressing.
Decades of urbanization have not necessarily been accompanied by discipline when it comes to building a reliable address system. This blog, maintained by a British computer scientist named Michael Tandy, compiles an outstanding series of absurd occurrences in global addressing systems. Here is just one example, an address in Tokyo:
〒100-8994 (zip code), 東京都 (Tokyo-to, i.e. Tokyo prefecture or state) 中央区 (Chuo-ku, i.e. Chuo Ward) 八重洲一丁目 (Yaesu 1-chome, i.e. Yaesu district 1st subdistrict) 5番3号 (block 5 lot 3), 東京中央郵便局 (Tokyo Central Post Office).
Messy addressing systems have measurable consequences. UPS, the world’s largest parcel delivery provider, calculated that if its trucks merely drove one mile less per day, the company would save $50m a year. In the United Kingdom, bad addressing costs the Royal Mail £775m per year.
One might say latitude and longitude can solve this. Sure thing. Except that GPS coordinates require 16 digits, two signs or cardinal letters (+/-/N/S/E/W), two decimal points, a space and a comma to specify a location the size of a housing block. Not helpful for a densely populated African village, or a Mumbai slum.
In his previous job, Chris Sheldrick (now 33) had his epiphany while organizing large musical events around the world. Tons of material had to be shipped to a specific location at a specific date and time. After several mishaps, he too tried using GPS coordinates to make dozens of flight cases converge at the right time and place. But people got confused with lat/long, sometimes mixing up ones and sevens. After a dramatic mistake that almost ruined a large wedding party in the Italian countryside, he vented his frustration to a mathematician friend, who suggested the following: why not replace GPS coordinates with actual words that anyone can understand and memorize?

Sheldrick’s mathematician pal came up with a simple idea: a combination of three words, in any language, could specify every 3-meter-by-3-meter square in the world. More than enough to designate a hut in Siberia or a building doorway in Tokyo. Altogether, 40,000 words combined in triplets label 57 trillion squares. Thus far, the system has been built in 10 languages: English, Spanish, French, German, Italian, Swahili, Portuguese, Swedish, Turkish and, starting next month, Arabic. This entire lingua franca requires only 5 megabytes of data, small enough to reside in any smartphone and work offline.

Each square has its own identity in each language, not a translation from another. The dictionaries have been refined to avoid homophones and offensive terms, with short words reserved for the most populated areas. And, unlike the GPS lat/long system, What3Words has an autocorrect feature that proposes the right terms if words are misspelled, or even mispronounced, since the system is meant to be used in voice-recognition navigation systems.
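The arithmetic behind the three-word scheme is easy to sketch: number every grid cell on the globe, then write that number in base 40,000 so it becomes three word indices. The toy encoder below illustrates only this idea. It is not What3Words' actual algorithm (their word lists and cell assignments are proprietary), and the placeholder dictionary and the slightly larger ~3.6 m cells are my own assumptions, needed so a naive equal-angle grid fits inside 40,000³ combinations; the real system keeps 3 m cells by shrinking longitude spacing toward the poles.

```python
# Toy sketch of a What3Words-style encoder. NOT the real algorithm:
# the word list is a placeholder, and the grid is a naive equal-angle one.

WORDS = [f"word{i:05d}" for i in range(40_000)]  # stand-in 40k-word dictionary

CELL_DEG = 3.6 / 111_320          # ~3.6 m expressed in degrees of latitude
LON_CELLS = round(360 / CELL_DEG)  # cells per ring of longitude
# Capacity check: (180/CELL_DEG) * LON_CELLS ~ 6.2e13 <= 40,000^3 = 6.4e13

def encode(lat: float, lon: float) -> str:
    """Map a lat/lon to a dotted three-word address."""
    row = int((lat + 90) / CELL_DEG)
    col = int((lon + 180) / CELL_DEG)
    idx = row * LON_CELLS + col          # linear cell number
    w1, rest = divmod(idx, len(WORDS) ** 2)  # write idx in base 40,000
    w2, w3 = divmod(rest, len(WORDS))
    return ".".join((WORDS[w1], WORDS[w2], WORDS[w3]))

def decode(addr: str) -> tuple[float, float]:
    """Map a three-word address back to the center of its grid cell."""
    w1, w2, w3 = (WORDS.index(w) for w in addr.split("."))
    idx = (w1 * len(WORDS) + w2) * len(WORDS) + w3
    row, col = divmod(idx, LON_CELLS)
    return (row + 0.5) * CELL_DEG - 90, (col + 0.5) * CELL_DEG - 180
```

A round trip through `encode` and `decode` recovers any point to within half a cell, which is the whole appeal: three short words instead of sixteen-digit coordinates.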
by Frédéric Filloux, Monday Note | Read more:
Images: uncredited
Snapchat 101: Learn to Love the World’s Most Confusing Social Network
[ed. This was very helpful. Now I know I will never use Snapchat.]

How old? Well, when a 20-something tried to explain to me how to add a friend in the app, he began talking loudly and slowly. “YOU…PRESS…HERE…OK?”
I’m 31 and a professional technology reviewer. Not exactly Betty White.
Attention everyone born before 1986: It’s not you, it’s Snapchat. The app, now used daily by 100 million people, requires the same initial concentration as assembling IKEA furniture. There are mysterious icons that look like ancient hieroglyphs, a maze of menus not even Pac-Man could maneuver, secret finger presses. And I haven’t even gotten to the fact that many of the messages on the service self-destruct after you look at them.
But we can’t keep shooing Snapchat off our lawns. It’s about to have its Facebook moment. Most of the leading 2016 candidates are posting Snapchat videos and photos from the campaign trail, and the White House just got on board. Celebrities and news outlets are sharing up-to-the-minute updates. The Wall Street Journal launched its own Snapchat Discover channel last week.
So why Snapchat, when there are already three massive social networks to choose from? Because awesome. Well, that’s how millennials would answer. Facebook is for major life updates. (Your friend from third grade just had her 10th baby!) Twitter is for keeping up with news and live events. (Taylor Swift released a new video…again.) Instagram is for jealousy-inducing photos. (Bora Bora is beautiful; your cubicle is not.)
Snapchat is for bearing witness—telling stories in raw, often humorous, behind-the-scenes clips or messages. If an 80-year-old can climb Everest, we can conquer—and even learn to love—Snapchat. Here’s how I did.
How to Understand Snapchat
Part of what makes understanding Snapchat so difficult are the many different ways of communicating inside the vertical screen. But remember these three elements:
- Snaps—Snaps are self-destructing photos or quick videos (up to 10 seconds) you send to one or multiple friends. Users send these expiring messages because they allow for more intimate and personal conversations. You’ll find sent and received snaps hiding to the left of the home screen.
- Story—Increasingly, people are broadcasting their snaps for everyone to see in what’s called a story—a series of moments that won’t self-destruct for 24 hours. You’ll find your friends’ stories to the right of the home screen.
- Chat—Snapchat also has one-to-one text chatting. You chat back and forth in the typical way, but when you navigate away from the chat screen, you lose the thread forever. Chats, like snaps, appear to the left of the home screen.
by Joanna Stern, WSJ | Read more:
Image: Carlo Giambarresi
Tuesday, January 12, 2016
David Bowie, Gail Ann Dorsey, Reeves Gabrels
[ed. See also: The invention of David Bowie]
Complex Systems, Feedback Loops, and the Bubble-Crash Cycle
Our expectations for a global economic downturn, including a U.S. recession, have hardened considerably in the past few weeks, with a continued expectation of a retreat in equity prices on the order of 40-55% over the completion of the current cycle as a base case. The immediacy of both concerns would be significantly reduced if we were to observe a shift to uniformly favorable market internals. Last week, market conditions moved further away from that supportive possibility. As I’ve regularly emphasized since mid-2014, market internals are the hinge that separates an overvalued market that tends to continue higher from an overvalued market that collapses; the hinge between Fed easing that supports the market and Fed easing that does nothing to stem a market plunge; and the hinge between weak leading economic data that subsequently recovers and weak leading economic data that devolves into a recession.
We continue to observe deterioration in what I call the “order surplus” (new orders + order backlogs - inventories) that typically leads economic activity. Indeed, across a variety of national and regional economic surveys, as well as international data, order backlogs have dried up while inventories have expanded. Understand that recessions are not primarily driven by weakness in consumer spending. Year-over-year real personal consumption has only declined in the worst recessions, and year-over-year nominal consumption only declined in 2009, 1938 and 1932. Rather, what collapses in a recession is the inventory component of gross private investment, and as a result of scale-backs in production, real GDP falls relative to real final sales.
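The "order surplus" gauge above is plain arithmetic on survey components. The sketch below only illustrates that arithmetic; the survey names and diffusion-index readings are hypothetical, not actual data from the surveys Hussman tracks.

```python
# Toy illustration of the "order surplus" gauge:
#   order surplus = new orders + order backlogs - inventories
# In a diffusion index, readings above 50 indicate expansion, so soft
# orders/backlogs combined with rising inventories pull the surplus down.

def order_surplus(new_orders: float, backlogs: float, inventories: float) -> float:
    return new_orders + backlogs - inventories

# Hypothetical regional survey readings: (new orders, backlogs, inventories)
surveys = {
    "region_a": (48.0, 44.5, 53.0),  # backlogs drying up, inventories building
    "region_b": (51.2, 47.0, 49.5),
}

for name, (orders, backlog, inv) in surveys.items():
    print(f"{name}: order surplus = {order_surplus(orders, backlog, inv):.1f}")
```

The pattern described in the text (order backlogs drying up while inventories expand) shows up directly as a falling surplus, which is why the measure tends to lead production cutbacks.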
Emphatically, recessions are primarily points where the mix of goods and services demanded by the economy becomes misaligned with the mix of goods and services being produced. As consumer preferences shift, technology introduces new products that dominate old ones, or market signals are distorted by policy, the effects always take time to be observed and fully appreciated by all economic participants. Mismatches between demand and production build in the interim, and at the extreme, new industries can entirely replace the need for old ones. Recessions represent the adjustment to those mismatches. Push reasonable adjustments off with policy distortions (like easy credit) for too long, and the underlying mismatches become larger and ultimately more damaging. (...)
While a weak equity market, in and of itself, is not tightly correlated with subsequent economic weakness, equity market weakness combined with weak leading economic data is associated with an enormous jump in the probability of an economic recession. See in particular From Risk to Guarded Expectation of Recession, and When Market Trends Break, Even Borderline Data is Recessionary.
The chart below presents the same data as the one above, except that it shows only values corresponding to periods where the S&P 500 was below its level of 6-months prior (as it is at present). Other values are set to zero. Again, while a uniform improvement in market internals would relieve the immediacy of our present economic and market concerns, we already observe deterioration in leading economic measures that - coupled with financial market behavior - has always been associated with U.S. recessions.
The immediate conclusion that one might draw is that the Federal Reserve made a “policy mistake” by raising interest rates in December. But that would far understate the actual damage contributed by the Fed. No, the real policy mistake was to provoke years of yield-seeking speculation through Ben Bernanke’s deranged policy of quantitative easing, which propagated like a virus to central banks across the globe. The extreme and extended nature of the recent speculative episode means that we do not simply have to worry about a run-of-the-mill recession or an ordinary bear market. We instead have to be concerned about the potential for another global financial crisis, born of years of capital misallocation and expansion of low-quality debt both here in the U.S. and in the emerging economies. For a review of these concerns, see The Next Big Short: The Third Crest of a Rolling Tsunami.
by John P. Hussman, Ph.D., Hussman Funds | Read more:
Image: Hussman Funds
Why Americans Dress So Casually
As you look at that hoodie you got as a Christmas or Hanukkah present, you may wonder why you didn't get something a little more fancy as a gift. Don't take it personally. It turns out that Americans are a decidedly casual society when it comes to fashion. In this conversation, originally published in September, we explore what happened in America that made us dress so casually.
Look around you, and you'll likely notice a sea of different outfits. You might see similar articles of clothing — even the same ones — worn by different people, but rarely do you find two pairings of tops, bottoms, shoes, and accessories that are exactly alike.
That wasn't always the case, said Deirdre Clemente, a historian of 20th century American culture at the University of Nevada, Las Vegas, whose research focuses on fashion and clothing. Americans were far more formal, and formulaic dressers, not all that long ago. Men wore suits, almost without fail — not just to work, but also at school. And women, for the most part, wore long dresses.
Clemente has written extensively about the evolution of American dress in the 1900s, a period that, she said, was marked, maybe more than anything else, by a single but powerful trend: As everyday fashion broke from tradition, it shed much of its socioeconomic implications — people no longer dress to feign wealth like they once did — and took on a new meaning.
The shift has, above all, led toward casualness in the way we dress. It can be seen on college campuses, in classrooms, where students attend in sweatpants, and in the workplace, where Silicon Valley busybodies are outfitted with hoodies and T-shirts. That change, the change in how we dress here in America, has been brewing since the 1920s, and owes itself to the rise of specific articles of clothing. What's more, it underscores important shifts in the way we use and understand the shirts and pants we wear.
I spoke with Clemente to learn more about the origins of casual dress, and the staying power of the trend. The interview has been edited for length and clarity.
Let’s start by talking a bit about what you study. You’re a historian, and you focus on American culture as it pertains to fashion. Is that right?
I'm a cultural historian. I’m a 20th century expert, so don’t ask me anything about the Civil War. And my focus is clothing in fashion. So I’m a little bit of a business historian, a little bit of a historian of marketing, and a little bit of a historian of gender. When you kind of mix all of those things together, all those subsections of history, you get what I study.
So that scene from "The Devil Wears Prada," when Meryl Streep criticizes Anne Hathaway for believing she isn’t affected by fashion, it must resonate with you.
Well you know, it’s just so true. People say, "Oh well, you know, I don’t care about fashion." They go to the Gap, they go to Old Navy, and they all dress alike, they wear these uniforms. The thing that I really harp on is that, that in and of itself is a choice, it’s a personal choice, because there are many people who don’t do that. In buying those uniforms, you’re saying something about yourself, and about how you feel about clothing and culture. There is no such thing as an unaffected fashion choice. Anti-fashion is fashion, because it’s a reaction to the current visual culture, a negation of it.
by Roberto A. Ferdman, Washington Post | Read more:
Image: Barry Wetcher
When Philosophy Lost Its Way
[ed. I can think of one exception - Eric Hoffer.]
Yet despite the richness and variety of these accounts, all of them pass over a momentous turning point: the locating of philosophy within a modern institution (the research university) in the late 19th century. This institutionalization of philosophy made it into a discipline that could be seriously pursued only in an academic setting. This fact represents one of the enduring failures of contemporary philosophy.

Take this simple detail: Before its migration to the university, philosophy had never had a central home. Philosophers could be found anywhere — serving as diplomats, living off pensions, grinding lenses, as well as within a university. Afterward, if they were “serious” thinkers, the expectation was that philosophers would inhabit the research university. Against the inclinations of Socrates, philosophers became experts like other disciplinary specialists. This occurred even as they taught their students the virtues of Socratic wisdom, which highlights the role of the philosopher as the non-expert, the questioner, the gadfly.
Philosophy, then, as the French thinker Bruno Latour would have it, was “purified” — separated from society in the process of modernization. This purification occurred in response to at least two events. The first was the development of the natural sciences, as a field of study clearly distinct from philosophy, circa 1870, and the appearance of the social sciences in the decade thereafter. Before then, scientists were comfortable thinking of themselves as “natural philosophers” — philosophers who studied nature; and the predecessors of social scientists had thought of themselves as “moral philosophers.”
The second event was the placing of philosophy as one more discipline alongside these sciences within the modern research university. A result was that philosophy, previously the queen of the disciplines, was displaced, as the natural and social sciences divided the world between them.
This is not to claim that philosophy had reigned unchallenged before the 19th century. The role of philosophy had shifted across the centuries and in different countries. But philosophy in the sense of a concern about who we are and how we should live had formed the core of the university since the church schools of the 11th century. Before the development of a scientific research culture, conflicts among philosophy, medicine, theology and law consisted of internecine battles rather than clashes across yawning cultural divides. Indeed, these older fields were widely believed to hang together in a grand unity of knowledge — a unity directed toward the goal of the good life. But this unity shattered under the weight of increasing specialization by the turn of the 20th century.
Monday, January 11, 2016
A Simple Blood Test For Every Form of Cancer
Catching cancer early is an incredible challenge, but a new way to detect it in the blood would have the potential to totally revolutionize cancer treatment in just a few years.
Illumina, the $25 billion maker of gene sequencing technology, has created a new company that's trying to invent a blood test to identify all cancers in their early stages, something that would be a tremendous help in diagnosing the illness before it is too difficult to treat effectively.
The team behind the announcement is not the first to attempt something of this nature, and previous efforts by other companies have been criticized for having too little research behind them or focusing too much on detection rather than treatment. Crucially, this blood test does not exist yet, and while scientists will be working furiously to try to make it happen, it doesn't mean they will succeed.
But this latest bet is one of the best-funded, with a number of illustrious scientists already involved — and Illumina's backing may give it a critical boost.
The new company, called Grail (as in, it's trying to achieve something considered the Holy Grail for cancer researchers) hopes to have a pan-cancer blood test by 2019, an extremely ambitious goal.
That would mean that anyone could add such a test onto their annual physical — no need for separate tests for different types of lung cancer, prostate cancer, or any other form of the illness.
Grail is launching with $100 million in Series A financing, backed by Illumina, Bill Gates, Sutter Hill Ventures, and Jeff Bezos' Bezos Expeditions. Illumina and Memorial Sloan Kettering Cancer Center are partnering to help launch a study to see if Grail's test can actually do what they hope it will do.
"We look forward to a day in the not too distant future where there would be a simple blood test for every form of cancer," Dr. Richard Klausner, former director of the National Cancer Institute and a board member of Grail, said on a press call on Sunday.
The key to this effort is the ability to detect what's known as circulating tumor DNA, or ctDNA. In recent years, doctors have discovered that the genetic material from cancerous tumors starts circulating in our bodies.
"It's abundantly clear that these molecules are in the blood," Illumina CEO Jay Flatley said on the call. (...)
Flatley is well aware of the minefield Grail is entering. "If you look at this business, it’s littered with failures. With a few exceptions, screening tests have been invariably horrible," he told the MIT Technology Review. "It’s a big challenge."
by Kevin Loria, Tech Insider | Read more:
Image: Shutterstock