Sunday, August 30, 2015
The Life and Death of the American Lawn
The hashtag #droughtshaming—which primarily exists, as its name suggests, to publicly decry people who have failed to do their part to conserve water during California’s latest drought—has claimed many victims. Anonymous lawn-waterers. Anonymous sidewalk-washers. The city of Beverly Hills. The tag’s most high-profile shamee thus far, however, has been the actor Tom Selleck, who was sued earlier this summer by Ventura County’s Calleguas Municipal Water District for the alleged theft of hydrant water, supposedly used to nourish his 60-acre ranch—which includes, this being California, an avocado farm, and also an expansive lawn.
The case was settled out of court on terms that remain undisclosed, and everyone has since moved on with their lives. What’s remarkable about the whole thing, though—well, besides the fact that Magnum P.I. has apparently become, in his semi-retirement, a gentleman farmer—is how much of a shift all the Selleck-shaming represents, as a civic impulse. For much of American history, the healthy lawn—green, lush, neatly shorn—has been a symbol not just of prosperity, individual and communal, but of something deeper: shared ideals, collective responsibility, the assorted conveniences of conformity. Lawns, originally designed to connect homes even as they enforced the distance between them, are shared domestic spaces. They are also socially regulated spaces. “When smiling lawns and tasteful cottages begin to embellish a country,” Andrew Jackson Downing, one of the nation’s first landscaper-philosophers, put it, “we know that order and culture are established.”
That idea remains, and it means that, even today, the failure to maintain a “smiling lawn” can have decidedly unhappy consequences. Section 119-3 of the county code of Fairfax County, Virginia—a section representative of similar ones on the books in jurisdictions across the country—stipulates that “it is unlawful for any owner of any occupied residential lot or parcel which is less than one-half acre (21,780 square feet) to permit the growth of any grass or lawn area to reach more than twelve (12) inches in height/length.” And while Fairfax County sensibly advises that matters of grass length are best adjudicated among neighbors, it adds, rather sternly, that if the property in question “is vacant or the resident doesn’t seem to care, you can report the property to the county.”
That kind of reporting can result in much more than fines. In 2008, Joe Prudente—a retiree in Florida whose lawn, despite several re-soddings and waterings and weedings, contained some unsightly brown patches—was jailed for “failing to properly maintain his lawn to community standards.” Earlier this year, Rick Yoes, a resident of Grand Prairie, Texas, also spent time behind bars—for the crime, in this case, of the ownership of an overgrown yard. Gerry Suttle, a woman in her mid-70s, recently had a warrant issued for her arrest—she had failed to mow the grass on a lot she owned across the street from her house—until four boys living in her Texas neighborhood heard of her plight in a news report, came over, and mowed the thing themselves.(...)
Which is all to say that lawns, long before Tom Selleck came along, have doubled as sweeping, sodded outgrowths of the Protestant ethic. The tapis vert, or “green carpet”—a concept Americans borrowed not just from French gardens and English estates, but also from the fantastical Italian paintings that imagined modern lawns into existence—became, as installed in the not-yet-united states, a signal that the American colonies aspired to match Europe in, among other things, elitism. (Lawns, in Europe, were an early form of conspicuous consumption, signs that their owners could afford to dedicate grounds to aesthetic, rather than agricultural, purposes—and signs, too, that their owners, in the days before lawnmowers lessened the burden of grass-shearing, could afford to pay scythe-wielding servants to do that labor.) Thomas Jefferson, being Thomas Jefferson, surrounded Monticello not just with neatly rowed crops, but with expanses of sheared grass that served no purpose but to send a message—about Jefferson himself, and about the ambitions of a newly formed nation.
As that country developed, its landscape architects would sharpen lawns’ symbolism: of collectivity, of interlocking destinies, of democracy itself. “It is unchristian,” the landscaper Frank J. Scott wrote in The Art of Beautifying Suburban Home Grounds, “to hedge from the sight of others the beauties of nature which it has been our good fortune to create or secure.” He added, magnanimously, that “the beauty obtained by throwing front grounds open together, is of that excellent quality which enriches all who take part in the exchange, and makes no man poorer.” Lawns became aesthetic extensions of Manifest Destiny, symbols of American entitlement and triumph, of the soft and verdant rewards that result when man’s ongoing battles against nature are finally won. A well-maintained lawn—luxurious in its lushness, implying leisure even as its upkeep had a stubborn way of preventing it—came, too, to represent a triumph of another kind: the order of suburbia over the squalor of the city. A neat expanse of green, blades clipped to uniform length and flowing from home to home, became, as Roman Mars notes, the “anti-broken window.”

by Megan Garber, The Atlantic | Read more:
Image: Robert Couse-Baker / Flickr
Denali. Finally.
[ed. About time. Actually, way past time.]
It’s official: Denali is now the mountain formerly known as Mount McKinley.
With the approval of President Barack Obama, Interior Secretary Sally Jewell has signed a “secretarial order” to officially change the name, the White House and Interior Department announced Sunday. The announcement comes roughly 24 hours before Obama touches down in Anchorage for a whirlwind tour of Alaska.
Talk of the name change has swirled in Alaska this year since the National Park Service officially registered no objection in a congressional hearing in Washington, D.C.
The tallest mountain in North America has long been known to Alaskans as Denali, its Koyukon Athabascan name, but its official name was not changed with the creation of Denali National Park and Preserve in 1980, 6 million acres carved out for federal protection under the Alaska National Interest Lands Conservation Act. The state changed the name of the park’s tallest mountain to Denali at that time, but the federal government did not.
Jewell’s authority stems from a 1947 federal law that allows her to make changes to geographic names through the U.S. Board on Geographic Names, according to the department.
“I think for people like myself that have known the mountain as Denali for years and certainly for Alaskans, it's something that's been a long time coming,” Jewell told Alaska Dispatch News Sunday.
Every year, the same story plays out in Washington, D.C.: Alaska legislators file bills to change the name from Mount McKinley to Denali, and every year, someone in the Ohio congressional delegation -- the home state of William McKinley, the 25th president -- files legislation to block a name change. (...)
According to the order Jewell signed, there is a policy of deferring action while a matter is under consideration by Congress. So the Ohio delegation’s annual legislative efforts have stalled any federal movement. But the law does allow the interior secretary to take action when the naming board doesn’t act “within a reasonable amount of time,” the order said.
“It's something (former Alaska Gov. Jay Hammond) pushed for back in 1975, and because of an effort to stop it in legislation that has not actually gone anywhere in the last 40 years, the Board of Geographic Names did not take it up,” Jewell said.
As interior secretary, she has authority to make a unilateral decision after a “reasonable time has passed,” Jewell said.
“And I think any of us would think that 40 years is an unreasonable amount of time. So we're delighted to make the name change now, and frankly I'm delighted that President Obama has encouraged the name change consistent with his trip,” Jewell said.
Jewell said the “overwhelming support for many years from the citizens of Alaska is more robust than anything that we have heard from the citizens of Ohio,” and that filing the same legislation year after year has not been accompanied by any “grass roots support” in Ohio. (...)
“I think most of us have always called it Denali. I know that's true in the climbing community and I suspect it has been true in Alaska for a very long time. So it'll just be great to formalize that with our friends at the U.S. Geological Survey and the Board of Geographic Names,” Jewell said.
The name “Denali” is derived from the Koyukon name and is based on a verb theme meaning “high” or “tall,” according to linguist James Kari of the Alaska Native Language Center at the University of Alaska Fairbanks, in the book “Shem Pete’s Alaska.” It doesn't mean "the great one," as is commonly believed, Kari wrote.
The mountain was named for McKinley before he became president, by gold prospector William A. Dickey, who had just received word of McKinley’s nomination as a candidate in 1897. McKinley died without ever setting foot in Alaska, assassinated at the start of his second term in office.
The Perfect Poop
It’s the middle of the day for Eric, a 24-year-old research assistant at the Massachusetts Institute of Technology, and nature is calling.
Eric leaves his job and hops a train. Then a bus. Then he walks some more. He passes countless toilets, and he needs to use them, but he doesn’t.
Eventually, Eric arrives at a nondescript men’s room 30 minutes away from MIT. A partition separates two toilets. There’s a square-tiled floor like in any public restroom. It’s unremarkable in every way, with one exception: A pit stop here can save lives.
Eric hangs a plastic collection bucket down inside the toilet bowl and does his business. When he’s finished, he puts a lid on the container, bags it up and walks his stool a few doors down the hall to OpenBiome, a small laboratory northwest of Boston that has developed a way to turn poop from extremely healthy people into medicine for really sick patients.
A lab technician weighs Eric’s “sample.” Over the past 2½ months, Eric has generated 10.6 pounds of poop over 29 visits, enough feces to produce 133 treatments for patients suffering from Clostridium difficile, an infection that kills 15,000 Americans a year and sickens half a million.
To donate, Eric had to pass a 109-point clinical assessment. There is a laundry list of factors that would disqualify a donor: obesity, illicit drug use, antibiotic use, travel to regions with high risk of contracting diseases, even recent tattoos. His stools and blood also had to clear a battery of laboratory screenings to make sure he didn’t have any infections.
After all that screening, only 3% of prospective donors are healthy enough to give. “I had no idea,” he says about his poop. “It turns out that it’s fairly close to perfect.”
And that, unlike most people’s poop, makes Eric’s worth money. OpenBiome pays its 22 active donors $40 per sample. They’re encouraged to donate often, every day if they can. Eric has earned about $1,000.
“It takes us a lot of time and effort to find these donors,” says OpenBiome’s research director, Mark Smith. “When we do find them, we want to keep them as engaged as possible and really want to compensate them for their time.”
Why is Eric’s poop so valuable?
A hundred trillion bacteria live inside your gut, some good, some bad. When patients take antibiotics for infections, the drugs sometimes fail to work: good bacteria get killed off while the bad bacteria — C. difficile — grow unchecked.
The life-saving bacteria from the guts of people like Eric can help. When their healthy microbes are placed inside the intestines of a sick person they can chase out harmful C. difficile bacteria. It’s called a fecal transplant. The treatments are administered bottom-up, through a colonoscopy, or top-down, through a tube in the nose.
OpenBiome’s poop donors have created about 5,000 treatments, and the organization says the results have been stunning. Stinky human waste is an astonishingly simple cure: 90% of the patients get better.
“They’ll actually have this really transformational experience where they’ll be going to the bathroom 20 times a day and then have normal bowel movements sort of immediately or the next day,” Smith says.
The organization’s fecal transplants cost $385 apiece and are supplied to more than 350 hospitals in 47 states.

by CNN Wire | Read more:
Image: Andrea Levy, The Plain Dealer
Saturday, August 29, 2015
Friday, August 28, 2015
Cancer Cells Programmed Back to Normal
Scientists have turned cancerous cells back to normal by switching back on the process which stops normal cells from replicating too quickly
Image: Wellcome Collection
Thursday, August 27, 2015
Keith Richards on ‘Crosseyed Heart’
[ed. I want that guitar.]
He’s the archetypal rock guitarist: the genius wastrel, the unimpeachable riff-maker, the architect of a band sound emulated worldwide, the survivor of every excess. Onstage, he is at once a flamboyant figure and a private one, locked in a one-on-one dance with his guitar, working new variations into every song. “I never play the same thing twice,” he said. “I can’t remember what I played before anyway.”
With the Stones in “hibernation” after a tour that ended in 2007, Mr. Richards took two and a half years “immersed in my life twice” to write (with James Fox) a best-selling memoir, “Life,” that re-examined his many sessions, tours, trysts, addictions, mishaps, arrests and accomplishments. After “Life” was published in 2010, he was enjoying being a family man and a grandfather. Retirement was a real possibility.
“I thought, that’s the craziest thing I ever heard,” said Steve Jordan, Mr. Richards’s longtime co-producer and drummer on his solo projects. “He felt comfortable with where he was and what he had done and what he had achieved. But knowing Keith, to not have him pick up an instrument and play, it was weird. When you’re a musician, you don’t retire. You play up until you can’t breathe.”
Mr. Jordan nudged Mr. Richards in a different direction: back into the recording studio to make his first solo album in 23 years, “Crosseyed Heart” (Republic), to be released Sept. 18. “I realized I hadn’t been in the studio since 2004 with the Stones,” Mr. Richards said. “I thought: ‘This is a bit strange. Something in my life is missing.’”
It’s a straightforwardly old-fashioned, rootsy album that could have appeared 20 years ago. The instruments are hand-played, the vocals are scratchy growls, and the songs revisit Mr. Richards’s favorite idioms — blues, country, reggae, Stonesy rock — for some scrappy storytelling. The album was recorded on analog tape. “I love to see those little wheels go around,” Mr. Richards said.
Eased onto a couch at his manager’s downtown Manhattan office, surrounded by merchandise from this year’s Rolling Stones tour and memorabilia dating back decades, Mr. Richards, 71, alternated between a Marlboro and a drink. He was wearing an ensemble only he could pull off: a striped seersucker jacket over a black T-shirt decorated with a Captain America shield, black corduroy jeans and silvery-patterned running shoes. A woven headband in Rastafarian red, gold and green held back his luxuriantly unkempt gray hair. A silver skull ring was, as usual, on his right hand as a reminder, he has said, that “beauty’s only skin deep.”
In a conversation punctuated by his wheezy, conspiratorial growl of a laugh, he was a man at ease with himself as a rock elder. “It’s all a matter of perspective and which end of the telescope you’re looking at,” he said.
“Nobody wants to croak, but nobody wants to get old,” he said. “When the Stones started, we were 18, 19, 20, and the idea of being 30 was absolutely horrendous. Forget about it! And then suddenly you’re 40, and oh, they’re in it for the long haul. So you need to readjust, and of course kids happen and grandchildren, and then you start to see the pattern unfolding. If you make it, it’s fantastic.”
by Jon Pareles, NY Times | Read more:
Image: J. Rose/Netflix
How Cities Can Beat the Heat
The greenhouses that sprawl across the coastline of southeastern Spain are so bright that they gleam in satellite photos. Since the 1970s, farmers have been expanding this patchwork of buildings in Almería province to grow produce such as tomatoes, peppers and watermelons for export. To keep the plants from overheating in the summer, they paint the roofs with white lime to reflect the sunlight.
That does more than just cool the crops. Over the past 30 years, the surrounding region has warmed by 1 °C, but the average air temperature in the greenhouse area has dropped by 0.7 °C.
It's an effect that cities around the world would like to mimic. As Earth's climate changes over the coming decades, global warming will hit metropolitan areas especially hard because their buildings and pavements readily absorb sunlight and raise local temperatures, a phenomenon known as the urban heat island effect. Cities, as a result, stand a greater chance of extreme hot spells that can kill. “Heat-related deaths in the United States outpace—over the last 30 years—all other types of mortality from extreme weather causes,” says Kim Knowlton, a health scientist at Columbia University in New York. “This is not an issue that is going away.”
Some cities hope to stave off that sizzling future. Many are planting trees and building parks, but they have focused the most attention on rooftops—vast areas of unused space that absorb heat from the Sun. In 2009, Toronto, Canada, became the first city in North America to adopt a green-roof policy. It requires new buildings above a certain size to be topped with plants in the hope that they will retain storm water and keep temperatures down. Los Angeles, California, mandated in 2014 that new and renovated homes install 'cool roofs' made of light-coloured materials that reflect sunlight. A French law approved in March calls for the rooftops of new buildings in commercial zones to be partially covered in plants or solar panels.
But the rush to act is speeding ahead of the science. Although cool roofs and green roofs can strongly curb temperatures at the tops of buildings, they do not always yield benefits at the street level, and they may trigger unwanted effects, such as reducing rainfall in some places. “There was a notion that the community had reached a conclusion and there was a one-size-fits-all solution,” says Matei Georgescu, a sustainability scientist at Arizona State University in Tempe. “But that is not the case.”
On top of that, it is unclear whether the limited programmes currently in place will have a measurable effect on temperature—and citizen health—and whether cities will expand their efforts enough to produce results. “If you're just putting green roofs on city hall and schools, it's not going to move the needle,” says Brian Stone Jr, an urban scientist at the Georgia Institute of Technology in Atlanta.

by Hannah Hoag, Nature/Scientific American | Read more:
Image: Goldmund Lukic ©iStock.com
Wednesday, August 26, 2015
Americans Are "Fired Up" About First Commercially Available Flamethrowers
Facing possible ban, more Americans are buying new—and legal—$900 flamethrowers
And why would anyone need a handheld flamethrower, you ask? Here are some "ideas" from the Ion Productions' official XM42 website:
- start your bonfire from across the yard
- kill the weeds between your cracks in style
- clearing snow/ice
- controlled burns/ground-clearing of foliage/agricultural
- insect control
[ed. Is this a great country or what?]
Tuesday, August 25, 2015
As the Dow Jones Drops
Let’s review: First, various “emerging economy” exchanges lost value, then China, then Wall Street.
The actual economic contagion started with resource prices. That was driven by reduced demand, primarily from China. Oil prices (only one commodity), already under pressure from moderately increased supply (it was less than boosters make out), were crashed by Saudi Arabia’s decision to increase production rather than cutting it back. There’s plenty of speculation as to why, but the practical result was to trash multiple exchange rates and economies and to encourage everyone to overproduce, breaking OPEC solidarity. I don’t think Saudi Arabia is going to win this bet, whether it was to crush specific countries (Russia, Iran) or to crush American high-cost oil production.
During this period we saw repeated currency devaluations in an attempt to increase the competitiveness of exports. These devaluations had marginal effect at best; at worst, they simply didn’t work.
China’s growth had been slowing (hence the reduction in its demand for commodities), so its government encouraged a stock-market bubble as consumers proved reluctant to keep piling into real estate. China printed vast amounts of money, at least twenty times as much as Europe, Japan, and the US combined, but exports were no longer leading growth. Ordinary Chinese citizens and private firms carry massive amounts of debt.
To put it simply, China had reached the point where export-led mercantilism was no longer working. They needed to shift to domestic consumer demand. They chose to try and inflate bubbles instead.
Virtually every country in the world was either rolling off a cliff or struggling to keep its head above water. Most of southern Europe had never really recovered (Ireland is a partial exception). Latin America was diving, Turkey’s real-estate-driven, neo-liberal growth was stalling, and India’s “miracle” was always more of a paper tiger than most made out, its gains concentrated in a minority even as the average number of calories consumed in the country declined.
But this started in China, which is important.
We are now in a situation analogous to the late 19th and early 20th century. America is the global hegemon (as Britain was then), and China is the world’s most important economy (as America was then). China is the global manufacturer. It buys the most resources, which is what most of the world sells, since most countries have given up manufacturing most goods for themselves. It prints the most money, dwarfing America and Europe. Its rich people are driving up real-estate prices all over the globe.
Yes, yes, by some measures the US economy is still “bigger,” but those measures are even more inflated than the inflated and bogus Chinese ones. China is the key maker of goods. There are a few other countries where making goods is also the most important (not largest, most important) part of the economy. Everyone else is a commodity producer, a financier, or trying to sell intangibles (intellectual property, whether inventions or fiction or branding).
So what China does now, and how it does it, matters most economically. The contagion started in China, spread to emerging economies, money fled to the US and a few other safe havens, China’s economy continued to stall, its stock market fell despite radical attempts to keep it inflated, and that has now come home to New York.
Some are worried this is 1929, but in China. I have been stating for years that the big one would start in China. Whether this is it, we won’t know for a while (just as they did not know in 1929 that it was 1929).
Welcome to the new world. The US and Europe put a LOT of effort into moving as much industrial production as possible to China. China just promised that a very few people would get very rich doing it, and those people made sure it happened. (Look up the profit margins on iPhones.)
I will note that there are still bubbles. Real-estate bubbles (Canada, Britain, a few important US cities, Australia, etc.) and a vast amount of highly leveraged derivatives have been pumped back out since the 2008 crash, since no one actually bothered to regulate or forbid them. And banks and financial companies are now larger and fewer, making the economy and financial markets both more subject to contagion.
The elites learned from 2008 that the important thing to do in a financial crisis is to just print enough money and relax enough accounting rules–extend and pretend. That will be the play again this time if this contagion turns truly serious. I would guess that it will work, sort of: More zombies will be created, they will need higher profits, the real economy will be even more stagnant.
by Ian Welsh | Read more:
Image: gavatron
Monday, August 24, 2015
Whales Found Dead In ‘Mortality Event’ In Alaska
Image: Dr. Bree Witteveen/Alaska Sea Grant Marine Advisory Program