Thursday, April 25, 2013

20 Pounds? Not Too Bad, for an Extinct Fish


For most fishermen a 20-pound trout is a trophy, but for Paiute tribe members and fish biologists here the one Matt Ceccarelli caught was a victory.

That Lahontan cutthroat trout he caught last year, a remnant of a strain that is possibly the largest native trout in North America, is the first confirmed catch of a fish that was once believed to have gone extinct. The fish has been the focus of an intense and improbable federal and tribal effort to restore it to its home waters.

“I was in awe,” said Mr. Ceccarelli, 32, an engineer from Sparks, Nev., of the speckled trout with hues of olive and rose.

Early settlers told stories of Pyramid Lake Lahontan cutthroats that weighed more than 60 pounds, though the official world record was a 41-pounder caught by a Paiute man in 1925. The explorer who discovered this electric-blue oasis in 1844, John Fremont, called them “salmon trout.” Mark Twain raved about their flavor. Clark Gable, the actor, chased them. President Bill Clinton and tribe members called for their restoration. (...)

In the late 19th and early 20th centuries, fishermen netted scores of Lahontan cutthroats to feed miners and loggers gnawing at the Sierra Nevada Mountains. But the Truckee River, where the fish spawned, was dammed, and its level dropped as water was taken for irrigation. It was also polluted with chemicals and sawdust. And Lake Tahoe was stocked with a nonnative char called lake trout, which gobble baby cutthroat. By the mid-1940s, all the native trout in Pyramid Lake and Lake Tahoe were dead and the strain was declared extinct. (...)

In the late 1970s, a fish biologist identified what he thought were surviving specimens of the vanished Pyramid Lake strain of Lahontan cutthroat in a small creek near a 10,000-foot mountain on the border of Nevada and Utah called Pilot Peak. A Utah man used buckets to stock the rugged stream with trout in the early 1900s, but made no record, federal biologists say. Geneticists recently compared cutthroats from the Pilot Peak stream with mounts of giant Pyramid Lake trout and discovered an exact DNA match.

“They are the originals,” said Corene Jones, 39, the broodstock coordinator for the Lahontan National Fish Hatchery in Gardnerville, Nev.

In 1995, United States Fish and Wildlife Service biologists harvested cutthroat eggs from Pilot Peak and brought them to the Gardnerville hatchery, just a few years before a devastating wildfire scorched the mountain and killed off the creek. In 2006 federal officials, in cooperation with the tribe, began stocking Pyramid Lake with what many now call Pilot Peak cutthroats. They waited to see how the fish might readapt to its ancestral home.

The answer came from ecstatic anglers. Late last year, a Reno man caught and released a 24-pounder. David Hamel, 27, of Reno, just did the same with a pair of 20-pound cutthroats.

“Biggest fish of my life,” he said. “Amazing.”

by Nate Schweber, NY Times |  Read more:
Image: Winslow Homer: Two Trout (1891) via:

How Not to Die


Unwanted treatment is American medicine’s dark continent. No one knows its extent, and few people want to talk about it. The U.S. medical system was built to treat anything that might be treatable, at any stage of life—even near the end, when there is no hope of a cure, and when the patient, if fully informed, might prefer quality time and relative normalcy to all-out intervention.

In 2009, my father was suffering from an advanced and untreatable neurological condition that would soon kill him. (I wrote about his decline in an article for this magazine in April 2010.) Eating, drinking, and walking were all difficult and dangerous for him. He ate, drank, and walked anyway, because doing his best to lead a normal life sustained his morale and slowed his decline. “Use it or lose it,” he often said. His strategy broke down calamitously when he agreed to be hospitalized for an MRI test. I can only liken his experience to an alien abduction. He was bundled into a bed, tied to tubes, and banned from walking without help or taking anything by mouth. No one asked him about what he wanted. After a few days, and a test that turned up nothing, he left the hospital no longer able to walk. Some weeks later, he managed to get back on his feet; unfortunately, by then he was only a few weeks from death. The episode had only one positive result. Disgusted and angry after his discharge from the hospital, my father turned to me and said, “I am never going back there.” (He never did.)

What should have taken place was what is known in the medical profession as The Conversation. The momentum of medical maximalism should have slowed long enough for a doctor or a social worker to sit down with him and me to explain, patiently and in plain English, his condition and his treatment options, to learn what his goals were for the time he had left, and to establish how much and what kind of treatment he really desired. Alas, evidence shows that The Conversation happens much less regularly than it should, and that, when it does happen, information is typically presented in a brisk, jargony way that patients and families don’t really understand. Many doctors don’t make time for The Conversation, or aren’t good at conducting it (they’re not trained or rewarded for doing so), or worry their patients can’t handle it.

This is a problem, because the assumption that doctors know what their patients want turns out to be wrong: when doctors try to predict the goals and preferences of their patients, they are “highly inaccurate,” according to one summary of the research, published by Benjamin Moulton and Jaime S. King in The Journal of Law, Medicine & Ethics. Patients are “routinely asked to make decisions about treatment choices in the face of what can only be described as avoidable ignorance,” Moulton and King write. “In the absence of complete information, individuals frequently opt for procedures they would not otherwise choose.” (...)

Angelo Volandes was born in 1971, in Brooklyn, to Greek immigrants. His father owned a diner. He and his older sister were the first in their family to go to college—Harvard, in his case. In Cambridge, he got a part-time job cooking for an elderly, childless couple, who became second parents to him. He watched as the wife got mortally sick, he listened to her labored breathing, he talked with her and her husband about pain, death, the end of life. Those conversations led him to courses in medical ethics, which he told me he found abstract and out of touch with “the clinical reality of being short of breath; of fear; of anxiety and suffering; of medications and interventions.” He decided to go to medical school, not just to cure people but “to learn how people suffer and what the implications of dying and suffering and understanding that experience are like.” Halfway through med school at Yale, on the recommendation of a doctor he met one day at the gym, he took a year off to study documentary filmmaking, another of his interests. At the time, it seemed a digression.

On the very first night of his postgraduate medical internship, when he was working the graveyard shift at a hospital in Philadelphia, he found himself examining a woman dying of cancer. She was a bright woman, a retired English professor, but she seemed bewildered when he asked whether she wanted cardiopulmonary resuscitation if her heart stopped beating. So, on an impulse, he invited her to visit the intensive-care unit. By coincidence, she witnessed a “code blue,” an emergency administration of CPR. “When we got back to the room,” Volandes remembered, “she said, ‘I understood what you told me. I am a professor of English—I understood the words. I just didn’t know what you meant. It’s not what I had imagined. It’s not what I saw on TV.’ ” She decided to go home on hospice. Volandes realized that he could make a stronger, clearer impression on patients by showing them treatments than by trying to describe them.

He spent the next few years punching all the tickets he could: mastering the technical arts of doctoring, credentialing himself in medical ethics, learning statistical techniques to perform peer-reviewed clinical trials, joining the Harvard faculty and the clinical and research staff of Massachusetts General Hospital. He held on to his passion, though. During a fellowship at Harvard in 2004, he visited Dr. Muriel Gillick, a Harvard Medical School professor and an authority on late-life care. Volandes “was very distressed by what he saw clinically being done to people with advanced dementia,” Gillick recalls. “He was interested in writing an article about how treatment of patients with advanced dementia was a form of abuse.” Gillick talked him down. Some of what’s done is wrong, she agreed, but raging against it would not help. The following year, with her support, Volandes began his video project.

The first film he made featured a patient with advanced dementia. It showed her inability to converse, move about, or feed herself. When Volandes finished the film, he ran a randomized clinical trial with a group of nine other doctors. All of their patients listened to a verbal description of advanced dementia, and some of them also watched the video. All were then asked whether they preferred life-prolonging care (which does everything possible to keep patients alive), limited care (an intermediate option), or comfort care (which aims to maximize comfort and relieve pain). The results were striking: patients who had seen the video were significantly more likely to choose comfort care than those who hadn’t seen it (86 percent versus 64 percent). Volandes published that study in 2009, following it a year later with an even more striking trial, this one showing a video to patients dying of cancer. Of those who saw it, more than 90 percent chose comfort care—versus 22 percent of those who received only verbal descriptions. The implications, to Volandes, were clear: “Videos communicate better than just a stand-alone conversation. And when people get good communication and understand what’s involved, many, if not most, tend not to want a lot of the aggressive stuff that they’re getting.”

by Jonathan Rauch, The Atlantic |  Read more:
Image: Eric Ogden

Facebook Home Propaganda Makes Selfishness Contagious


The new ads for Facebook Home are propaganda clips. Transforming vice into virtue, they’re social engineering spectacles that use aesthetic tricks to disguise the profound ethical issues at stake. This isn’t an academic concern: Zuckerberg’s vision (as portrayed by the ads) is being widely embraced — if the very recent milestone of half a million installations is anything to go by.

Critics have already commented on how the ads exploit our weakness for escapist fantasy so we can feel good about avoiding conversation and losing touch with our physical surroundings. And they’ve called out Zuckerberg’s hypocrisy: “Isn’t the whole point of Facebook supposed to be that it’s a place to keep up with, you know, family members? So much for all that high-minded talk about connecting people.”

However, the dismissive reviews miss an even deeper and more consequential point about the messages conveyed by the ads: that to be cool, worthy of admiration and emulation, we need to be egocentric. We need to care more about our own happiness than our responsibilities towards others.

Let’s examine the most egregious Facebook ad of them all: “Dinner” (in the video above). On the surface, it portrays an intergenerational family meal where a young woman escapes from the dreariness of her older relative’s boring cat talk by surreptitiously turning away from the feast and instead feasting her eyes on Facebook Home. With a digital nod to the analog “Calgon, Take Me Away” commercials, the young woman is automatically, frictionlessly transported to a better place: full of enchanting rock music, ballerinas, and snowball fights.

But let’s break Zuckerberg’s spell and shift our focus away from Selfish Girl. Think off-camera and outside the egocentric perspective framed by the ad. Reflect instead on the people surrounding her.

Ignored Aunt will soon question why she’s bothering to put in effort with her distant younger niece. Eventually, she’ll adapt to the Facebook Home-idealized situation and stop caring. In a scene that Facebook won’t run, Selfish Girl will come to Ignored Aunt for something and be ignored herself: Selfishness is contagious, after all. Once it spreads to a future scene where everyone behaves like Selfish Girl, with their eyes glued to their own Home screens, the Facebook ads portend the death of family gatherings.

More specifically, they depict the end of connecting through effort. Because unlike the entertaining and lively Chatheads the ads recommend we put on our personalized network interfaces and Home screens, we don’t get to choose floating family members. It’s a dystopian situation when everyone matches our interests and we don’t feel obliged to try to connect with those folks: people with whom it’s initially difficult to find common ground.

So why doesn’t the “Dinner” ad depress us? Well that’s where the clever propaganda comes in — the ads give Selfish Girl special license: Everyone else behaves responsibly except for her. Moreover, her irresponsible behavior doesn’t affect what others do. (...)

So what, big deal, some argue about these ads. Unfortunately, the message of technological efficiency and frictionless sharing is increasingly being depicted as an appropriate social ethic beyond Silicon Valley.

by Evan Selinger, Wired |  Read more:
Video: theofficialfacebook

Wednesday, April 24, 2013


Downtown LA
via:

Jon Spencer Blues Explosion



Nestow Sakaczbia. Men stumble on stones, not mountains.
via:

The Big One?

We are at a mysterious fork in the road. One path leads to years, perhaps decades, of spread of a new type of influenza, occasionally making people sick and killing about 18 percent of them. It's not a pleasant route, strewn as it is with uncertainties, but no terror seems to lurk on its horizon. The other path, however, wrenches the gut with fear, as it brings worldwide transmission of a dangerous new form of flu that could spread unchecked throughout humanity, testing global solidarity, vaccine production, hospital systems and humanity's most basic family and community instincts.

There may be some minor footpaths along the way, heading to other alternatives, but they can't be discerned at this moment. At this writing, 108 cases of H7N9 flu, as the new virus has been dubbed, have been confirmed, and one asymptomatic carrier of the virus has been identified. Twenty-two of the cases have proven fatal, and nine people have been cured of the new flu. The remainder are still hospitalized, many in severe condition suffering multiple organ failures. As the flu czar of the World Health Organization (WHO), Dr. Keiji Fukuda, tersely put it to reporters last week, "Anything can happen. We just don't know."

On this tenth anniversary of China's April 2003 admission that the SARS virus had spread across that country -- under cloak of official secrecy, spawning a pandemic of a previously unknown, often lethal disease -- Beijing finds itself once again in a terrible position vis-à-vis the microbial and geopolitical worlds. In both the SARS and current H7N9 influenza cases, China watched the microbe's historic path unfold during a period of enormous political change. And the politics got in the way of appropriate threat assessment. (...)

I covered the SARS epidemic in Hong Kong and throughout mainland China, and there are more than a few aspects of the current H7N9 situation that provoke feelings of déjà vu. As was the case in 2003, Beijing now has new leaders, President Xi Jinping and Premier Li Keqiang, who assumed office on March 14, 2013. As was the case with SARS in 2003, information regarding the new H7N9 flu did not start to flow publicly until after safe installation of the new leadership. And during the months between the Communist Party's closed meetings that selected Xi and Li and March 14th, the country was rocked by scandals, including murder and billions of dollars' worth of financial shenanigans, pitting one Communist Party faction against another. Both the SARS and H7N9 outbreaks unfolded in atmospheres of political intrigue and secrecy.

Today, with the future path of the new influenza still uncertain, Beijing faces conundrums similar to those it confronted after publicly admitting to SARS. May Day, one of China's biggest travel holidays, is approaching. Travel restrictions might be warranted to prevent nationwide spread if the virus is now thought to be geographically confined, and if further evidence shows that people can act as carriers and transmitters of H7N9. But the economic and geopolitical consequences of clamping down on social mobility are profound, particularly now that China's economic growth is slowing.

In 2003, Beijing warned the public to limit travel, but did not actually barricade the capital and set up health checkpoints in all of the nation's train, bus, shipping, and air travel stations until it was too late. I watched tens of thousands of fearful migrant workers and students -- impelled by rumors of forced quarantines targeting those without permanent Beijing residency papers -- flee the capital by trains over the days between the April 20 admission and May Day holiday, taking the SARS virus to every region of the country. Having lost control of geographic spread, China had no choice but to assume the entire country was infected, and create an extraordinarily expensive, nationwide response. I witnessed construction of Xiaotangshan SARS Hospital, a 1,500-bed quarantine facility erected in only eight days, complete with isolation rooms, dedicated sewer and water filtration systems, negative air pressure flow, and state-of-the-art nursing stations. That astounding feat was repeated all over the country, with quarantine hospitals built in five to 10 days in every region. As I traveled around China by car, I was stopped roughly every 50 miles by police and subjected to thermometer checks. Any individual anywhere in the country that evidenced a fever was immediately placed in one of the newly erected quarantine facilities, and would remain there indefinitely -- no visitors allowed. In Beijing, such fever stations were ubiquitous: Anybody with an abnormal temperature was immediately packed off to a military-run quarantine site or Ditan Hospital for Infectious Diseases, where even the doctors and nurses were on lockdown, forbidden to see their families for weeks. Knowing that the virus was spreading inside of hospitals, terrified physicians and nurses jumped out of windows and patients hid in their homes until May 15, when the central government declared it a high crime, punishable even by death, to hide or spread SARS cases.

That is how by July 5, 2003, China stopped SARS -- with a nationwide find-the-fever campaign that could not possibly be executed in a country that places civil liberties above the rights of the state. I have often thought about the fever stations I encountered in the mountains of Shanxi, where coal truck drivers were compelled to submit to fever checks while people in bio-containment space suits sprayed antimicrobials all over their vehicles' cabs. I've tried to imagine such fever stations positioned along America's superhighways: Visions of angry drivers pulling shotguns on public health nurses and highway patrol officers always dance through my head. Few countries could today manage a nationwide fever/quarantine campaign akin to China's SARS effort.

Indeed, I'm not sure the China of 2013 could pull off the feat it executed in 2003. Thanks to Weibo, China's equivalent of Twitter, and dozens of other Internet-posting possibilities, very little about this flu outbreak has remained secret for long. Any perceived violation of patients' rights or individual dignity is getting a virtual shout-out. And though President Xi and top health officials have already noted that travel over May Day might be unwise, and Hong Kong has signaled anxiety about the pending tsunami of mainland visitors, possibly bringing H7N9 their way, it seems unimaginable that today's government could close the perimeter of any major city, let alone Shanghai, the epicenter of H7N9, with a population of some 23 million people.

by Laurie Garrett, Foreign Policy |  Read more:
Image: STR/AFP/Getty Images

Fabio Hurtado, The Letter
via:

Manolo Millares (Spanish, 1926–1972), Untitled, 1961
via:

Microbes: The Trillions of Creatures Governing Your Health



Of all the cases Barbara Warner has faced as a pediatrician specializing in newborns, the one that sticks hardest in her mind involved a couple who had been trying for years to have children. Finally, in 1997, the woman was pregnant. She was in her mid-40s. “This was her last chance,” says Warner. Then, too soon, she gave birth to twins. The first child died at two weeks of respiratory failure, at the time the most common killer of preterm babies.

A week later—it happened to be Thanksgiving Day—Warner folded down the blanket on the surviving twin, and even now she draws in her breath at the memory. The baby’s belly was reddened, shining and so swollen “you could have bounced a nickel off it.”

It was necrotizing enterocolitis, or NEC, little known outside neonatal intensive care units, but dreaded there as a sudden, fast-moving bacterial inflammation of the gut. On the operating table, a surgeon opened the baby boy’s abdomen and immediately closed it again. The intestinal tract from stomach to rectum was already dead. Warner, in tears, returned the child to die in the arms of his shattered parents.

“It’s 15 years later, and there’s nothing new,” Warner says bleakly as she moves among her tiny patients, each one covered in tubes and bathed in soft violet light, in a clear plastic incubator. NEC is still one of the leading killers of preterm babies. But that may soon change, thanks to a startling new way of looking at who we are and how we live.

Over the past few years, advances in genetic technology have opened a window into the amazingly populous and powerful world of microbial life in and around the human body—the normal community of bacteria, fungi and viruses that makes up what scientists call the microbiome. It’s Big Science, involving vast international research partnerships, leading edge DNA sequencing technology and datasets on a scale to make supercomputers cringe. It also promises the biggest turnaround in medical thinking in 150 years, replacing the single-minded focus on microbes as the enemy with a broader view that they are also our essential allies.

The subject matter is both humble and intimate. In Warner’s neonatal care unit at St. Louis Children’s Hospital, researchers studying NEC have analyzed every diaper of almost every very low-weight baby delivered there over the past three years. They don’t expect to find a single pathogen, some killer virus or bacteria, the way medical discovery typically happened in the past. Instead, says Phillip Tarr, a Washington University pediatric gastroenterologist who collaborates with Warner, they want to understand the back-and-forth among hundreds of microbial types in the newborn’s gut—to recognize when things go out of balance. Their goal is to identify the precise changes that put a baby on track to developing NEC and, for the first time, give neonatal care units crucial advance warning.

A separate research group demonstrated early this year that secretions from certain beneficial microbes seem to relieve the deadly inflammation characteristic of NEC. So doctors may soon see into life-or-death processes that until now have been hidden, and take action to address them.

The new insights into NEC suggest why the microbiome suddenly seems so important to almost everything in the medical and biological worlds, even our understanding of what it means to be human. We tend to think that we are exclusively a product of our own cells, upwards of ten trillion of them. But the microbes we harbor add another 100 trillion cells into the mix. The creature we admire in the mirror every morning is thus about 10 percent human by cell count. By weight, the picture looks prettier (for once): Altogether an average adult’s commensal microbes weigh about three pounds, roughly as much as the human brain. And while our 21,000 or so human genes help make us who we are, our resident microbes possess another eight million or so genes, many of which collaborate behind the scenes handling food, tinkering with the immune system, turning human genes on and off, and otherwise helping us function. John Donne said “no man is an island,” and Jefferson Airplane said “He’s a peninsula,” but it now looks like he’s actually a metropolis.

by Richard Conniff, Smithsonian |  Read more:
Image: Stephanie Dalton Cowan

David Byrne & Fatboy Slim Feat. Florence Welch


[ed. From the new "poperetta" by David Byrne. See also: A Rise to Power, Disco Round Included]

Probably the first thing you need to know about “Here Lies Love,” the musical conceived by David Byrne and running at the Public Theater through May 19, is that although it is about Imelda Marcos, the former first lady of the Philippines, her famous collection of shoes is neither mentioned nor shown.

That said, shoes are something audience members should consider: the Public’s LuEsther Hall has been transformed into an ’80s-style disco, and the audience is meant to stand, mill around or, if the spirit moves, dance through the entire 85-minute show. (There are a few seats for those who cannot.)

For Mr. Byrne, disco — both the form and the atmosphere it evokes — is a more vivid symbol of Mrs. Marcos than footwear; her infatuation with that music drew him to her as a potential subject. Having read “The Emperor,” Ryszard Kapuscinski’s biography of Haile Selassie, he became fascinated with autocrats who lived in a kind of surreal, theatrical bubble they create for themselves.

“I read that Imelda Marcos loved going to discos and that she had a mirror ball in her New York apartment and turned the roof of the palace in Manila into a disco,” Mr. Byrne said. “Here’s a kind of music that’s hedonistic and transcendent, that transports you to another world, and to me that captures some of what a powerful person is feeling. So it seemed like a natural soundtrack to this particular megalomaniac’s story.”

by Allan Kozinn, NY Times |  Read more:

Tuesday, April 23, 2013


Elizaveta Porodina, Plug In Babies
via:

How the Trailer Park Could Save Us All


Residents call life at Pismo Dunes Senior Park “Pismodise.” Park manager Louise Payne calls it “a holding tank for the great beyond.” Louise has short hair and blunt bleached bangs that give her the air of a preteen skateboarder, but at 72 she’s often found rolling by the park’s 333 trailers in her electric golf cart, alternating between her roles as mother hen and whip-cracker. California is a notoriously youthful culture, but eventually the perpetually young get very old. If they’re lucky enough to live in Pismodise, which is on the Central Coast, they can exit its palm-lined entrance, cross the road, amble across the capacious sand of Pismo State Beach, and dip their toes in the Pacific Ocean while contemplating eternity (or a cocktail).

To move into Pismodise you must meet four conditions: Be 55 or older, keep your dog under 20 pounds, be present when guests stay at your home, and be comfortable with what most Americans consider a very small house. “If you need more than 800 square feet I can’t help you,” says Louise with a shrug. There seems to be some leeway on the dog’s weight. The unofficial rules are no less definite: If you are attending the late-afternoon cocktail session on the porch of Space 329, bring your own can, bottle, or box to drink. If you are fighting with other residents, you still have to greet them when you run into them. Make your peace with the word “trailer trash.”

No one in California aspires to be old or to live in a trailer, but we need to be more open to the possibilities inherent in both. Every day since January 1, 2011, some 10,000 American baby boomers have retired, and that will continue until 2030, when people over 65 will make up 19 percent of the population (up from 13 percent today). Old is the new boom and it is changing the culture and the conversation. (Have you seen all the sexy talk in Betty White’s reality show?) In Washington, D.C., anxiety about the decreasing proportion of workers to retirees underlies the frenzied discussion of “entitlement reform.”

Baby boomers aren’t going to retire the way their parents did. They are poorer and more likely to live alone. They can’t depend on pensions, and the real-estate bubble destroyed almost 50 percent of their wealth. Today one in six seniors lives in poverty, and that proportion is rising; the generation of Americans now facing retirement is so financially ill prepared that half of them have less than $10,000 in the bank. The coming swell of retirees will strain our current system to its limits—in terms of not only health care, but also incidental things like road signs, which are hard for drivers over 65 to read in a majority of American cities and towns.

Emily Greenfield, an assistant professor at the Rutgers School of Social Work, who researches elder-care networks, says a change is occurring under our feet, whether we see it or not: “Baby boomers have critical mass—they’re covertly revolutionizing society again” as they retire.

One of the biggest questions facing the nation with regard to aging boomers is: Where are they going to live? The options amount to a tangle of euphemisms and politically correct titles: independent living, nursing homes, aging-in-place, naturally occurring retirement communities (NORCs), retirement village, memory-care units, age-restricted communities. All this complexity disguises a simple fact about money, happiness, and aging: Seniors who can live on their own cost the country relatively little—they even contribute to the economy. But those who move into nursing homes start to run up a significant tab—starting at $52,000 a year. People who are isolated and lonely end up in nursing homes sooner. Hence, finding ways to keep people living on their own, socially engaged, healthy, happy, and out of care isn’t just a personal or family goal—it’s a national priority. Among seniors’ living options, there is one we overlook: mobile homes. Time-tested, inhabited by no fewer than three million seniors already, but notoriously underloved, manufactured homes can provide organic communities and a lifestyle that is healthy, affordable, and green, and not incidentally, fun. But in order to really see their charms, we need to change a mix of bad policies and prejudice.

by Lisa Margonelli, Pacific Standard | Read more:
Photo: Arnaldo Abba

The Rise of Big Data

Everyone knows that the Internet has changed how businesses operate, governments function, and people live. But a new, less visible technological trend is just as transformative: “big data.” Big data starts with the fact that there is a lot more information floating around these days than ever before, and it is being put to extraordinary new uses. Big data is distinct from the Internet, although the Web makes it much easier to collect and share data. Big data is about more than just communication: the idea is that we can learn from a large body of information things that we could not comprehend when we used only smaller amounts.

In the third century BC, the Library of Alexandria was believed to house the sum of human knowledge. Today, there is enough information in the world to give every person alive 320 times as much of it as historians think was stored in Alexandria’s entire collection -- an estimated 1,200 exabytes’ worth. If all this information were placed on CDs and they were stacked up, the CDs would form five separate piles that would all reach to the moon.
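
[ed. For the curious, here is a rough back-of-the-envelope check of that CD image, sketched in Python. The disc capacity (~700 MB), disc thickness (~1.2 mm), and average Earth-moon distance (~384,000 km) are our assumed figures, not the authors':]

# Back-of-the-envelope check of the "five CD piles to the moon" image.
# All physical figures below are assumptions, not numbers from the article.
TOTAL_BYTES = 1200 * 10**18      # 1,200 exabytes of stored information
CD_BYTES = 700 * 10**6           # assumed capacity of one CD (~700 MB)
CD_THICKNESS_M = 1.2e-3          # assumed thickness of one disc (~1.2 mm)
EARTH_MOON_M = 3.84e8            # assumed average Earth-moon distance (~384,000 km)

num_cds = TOTAL_BYTES / CD_BYTES
stack_height_m = num_cds * CD_THICKNESS_M
piles_to_moon = stack_height_m / EARTH_MOON_M

print(f"{num_cds:.2e} CDs in total")                     # ~1.71e+12 discs
print(f"stack height: {stack_height_m / 1000:,.0f} km")  # ~2,057,143 km
print(f"about {piles_to_moon:.1f} moon-high piles")      # ~5.4, close to the article's five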

This explosion of data is relatively new. As recently as the year 2000, only one-quarter of all the world’s stored information was digital. The rest was preserved on paper, film, and other analog media. But because the amount of digital data expands so quickly -- doubling around every three years -- that situation was swiftly inverted. Today, less than two percent of all stored information is nondigital.

Given this massive scale, it is tempting to understand big data solely in terms of size. But that would be misleading. Big data is also characterized by the ability to render into data many aspects of the world that have never been quantified before; call it “datafication.” For example, location has been datafied, first with the invention of longitude and latitude, and more recently with GPS satellite systems. Words are treated as data when computers mine centuries’ worth of books. Even friendships and “likes” are datafied, via Facebook.

This kind of data is being put to incredible new uses with the assistance of inexpensive computer memory, powerful processors, smart algorithms, clever software, and math that borrows from basic statistics. Instead of trying to “teach” a computer how to do things, such as drive a car or translate between languages, which artificial-intelligence experts have tried unsuccessfully to do for decades, the new approach is to feed enough data into a computer so that it can infer the probability that, say, a traffic light is green and not red or that, in a certain context, lumière is a more appropriate substitute for “light” than léger.
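
[ed. A minimal sketch, in Python, of the data-driven approach described above: rather than hand-coding a translation rule, count how often each French candidate for "light" appears alongside a given English context word and pick the more probable one. The tiny "corpus" here is invented purely for illustration:]

from collections import Counter, defaultdict

# Toy observations: (English context word, French word a translator chose for "light").
# Invented data, purely to illustrate estimating probabilities from examples.
observations = [
    ("lamp", "lumière"), ("sun", "lumière"), ("bright", "lumière"),
    ("lamp", "lumière"), ("weight", "léger"), ("luggage", "léger"),
    ("meal", "léger"), ("sun", "lumière"),
]

counts = defaultdict(Counter)            # context word -> counts of French candidates
for context, french in observations:
    counts[context][french] += 1

def translate_light(context_word):
    """Pick the candidate with the highest observed relative frequency."""
    candidates = counts.get(context_word)
    if not candidates:
        return "lumière"                 # fall back to the overall most common choice
    best, n = candidates.most_common(1)[0]
    print(f"P({best} | {context_word}) = {n / sum(candidates.values()):.2f}")
    return best

print(translate_light("lamp"))           # -> lumière
print(translate_light("weight"))         # -> léger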

Using great volumes of information in this way requires three profound changes in how we approach data. The first is to collect and use a lot of data rather than settle for small amounts or samples, as statisticians have done for well over a century. The second is to shed our preference for highly curated and pristine data and instead accept messiness: in an increasing number of situations, a bit of inaccuracy can be tolerated, because the benefits of using vastly more data of variable quality outweigh the costs of using smaller amounts of very exact data. Third, in many instances, we will need to give up our quest to discover the cause of things, in return for accepting correlations. With big data, instead of trying to understand precisely why an engine breaks down or why a drug’s side effect disappears, researchers can instead collect and analyze massive quantities of information about events and everything that is associated with them, looking for patterns that might help predict future occurrences. Big data helps answer what, not why, and often that’s good enough.
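
[ed. And a tiny illustration of that last shift, settling for correlation: using invented engine-sensor readings, the sketch below measures how strongly vibration tracks subsequent failures and turns that bare association into a maintenance alert, with no causal model anywhere. The data and the 0.7 threshold are arbitrary assumptions:]

# "What, not why": flag engines whose vibration readings correlate with failures,
# without modeling the underlying cause. All data below is invented.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

vibration = [0.2, 0.3, 0.25, 0.9, 1.1, 0.28, 1.3, 0.35]   # toy sensor readings
failed    = [0,   0,   0,    1,   1,   0,    1,   0]      # 1 = engine failed soon after

r = pearson(vibration, failed)
print(f"correlation = {r:.2f}")
if r > 0.7:                              # arbitrary threshold for this sketch
    print("high vibration predicts failure here; schedule maintenance (no causal claim)")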

The Internet has reshaped how humanity communicates. Big data is different: it marks a transformation in how society processes information. In time, big data might change our way of thinking about the world. As we tap ever more data to understand events and make decisions, we are likely to discover that many aspects of life are probabilistic, rather than certain.

by Kenneth Neil Cukier and Viktor Mayer-Schoenberger, Foreign Affairs | Read more:
Image: John Elk/ Getty Images

The Mistress and the Narcotraficante

Elena first met Hernán at a bar. She was in her early twenties, hanging out in a Juárez club frequented by people involved in the drug world, people who partied hard and were always flush with cash. Elena spotted Hernán across the room and asked a friend to introduce them. She was aggressive that way. She was also strikingly attractive and had a wild streak that made her uninterested in stable men with stable careers.

Elena and Hernán (all the names in this piece are pseudonyms) soon became a couple, of sorts—he already had a wife and children, and other mistresses. But Elena was different than the docile women he was accustomed to. If he pushed her she pushed back. She was not afraid of his violent character—her father was abusive, as were many of the men she’d been with since her adolescence, when she’d discovered her sexuality. That discovery had given her a power she’d never before experienced, as if something unknown and unanticipated had opened up within her. She had felt no fear that night at the bar when she walked across the room to meet Hernán, only a sense of opportunity.

In the cartel culture, braggadocio is the lingua franca, and flash and pretense often mask substance. Elena figured out quickly that Hernán was the real deal. For all his dime-a-dozen narco posturing—the abundance of cash, the ever-present gun, the gold jewelry—there was plenty of evidence pointing to his status as a midlevel narco within the Juárez cartel. Hernán was, in fact, one of many operators who helped the Juárez cartel move product across the border. He was something of an entrepreneur who ran his own crew, recruited his own mules, and sometimes invested his own money in his deals. He operated as a franchise of sorts, although he was under the control of the cartel.

Elena saw one of the first signs of Hernán's status during an encounter with the municipal police. One afternoon she and Hernán were in his new pickup truck speeding down the Avenida de las Americas, one of Juárez’s main boulevards. The windows were down and the sound system was blasting narcocorridos. Elena and Hernán were having a grand time. They’d been on a partying spree that had lasted several days. Suddenly, a police patrol car was in pursuit, lights flashing. Hernán cursed, but pulled over. When the officer approached the truck and recognized Hernán, his entire demeanor changed. “I’m sorry, sir,” Elena remembered the officer saying. “Can we escort you anywhere?” The Juárez cartel owned the police.

As a child in Juárez, Elena had grown up in roiling poverty, but she was outgoing and spunky and for a long time there was an inner optimism that transcended the reality of her family’s economic circumstances. In elementary school she’d even imagined herself becoming an archaeologist or an astronaut.

Elena’s father, though, was gruff with the children and abused their mother. He drank and partied with his friends, and they never knew if he would come home at night. He barely provided for the family; Elena’s older brothers helped support the household even though they were only adolescents and had to drop out of school to do so.

As a teenager, Elena was always out and about. The boys and men who wanted her were legion. There were nights when Elena didn’t return home, and she would walk in the door when she damned well felt like it. Her mother deemed her incorrigible. When she was 14, she ran away. For almost nine months her family didn’t know where she was. She slept at her girlfriends’ houses or stayed with men in the cheap motel rooms where they spent the night. Elena felt no fear in this abandon; she was full of the self-confidence that comes with commanding beauty.

by Ricardo C. Ainslie, Texas Monthly |  Read more:
Image: AP

cjeremyprice, December Duluth, 12x12
via:

January Full Moon - George Ault
via:

Averageness


In attractiveness studies, averageness is one of the characteristics of physical beauty in which the average phenotype, i.e. outward appearance, of the individual theoretically characterizes averaged genotypes, thus indicating health and fertility. The majority of averageness studies and theories have to do with photographic overlay studies, in which images are morphed together. Other factors involved in measuring attractiveness are symmetry, youthfulness and similarity (like-attracts-like).

In 1883, Francis Galton, cousin of Charles Darwin, devised a technique called composite photography, described in detail in Inquiries into Human Faculty and Its Development, which he believed could be used to identify 'types' by appearance, which he hoped would aid medical diagnosis, and even criminology through the identification of typical criminal faces. In short, he wondered if certain groups of people had certain facial characteristics. To find this answer, he created photographic composite images of the faces of vegetarians and criminals to see if there was a typical facial appearance for each. Galton overlaid multiple images of faces onto a single photographic plate so that each individual face contributed roughly equally to a final composite face. While the resultant “averaged” faces did little to allow the a priori identification of either criminals or vegetarians, Galton observed that the composite image was more attractive than the component faces. Similar observations were made in 1886 by Stoddard, who created composite faces of members of the National Academy of Sciences and graduating seniors of Smith College. This phenomenon is now known as the "averageness effect": highly physically attractive faces tend to be indicative of the average traits of the population.

In 1990, one of the first computer-based photographic attractiveness rating studies was conducted. That year, psychologists Langlois and Roggman set out to examine systematically whether mathematical averageness is linked with facial attractiveness. To test this, they selected photographs of 192 male and female Caucasian faces, each of which was computer scanned and digitized. They then made computer-processed composites of each image, as 2-, 4-, 8-, 16-, and 32-face composites, averaged by pixel. These faces, as well as the component faces, were rated for attractiveness by 300 judges on a 5-point Likert scale (1 = very unattractive, 5 = very attractive). The results showed that the 32-face composite was the most visually attractive of all the faces.
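
[ed. For the technically inclined, here is a minimal sketch of that pixel-averaging step, assuming the Pillow and NumPy libraries and a set of same-sized, roughly aligned grayscale face photos. The file names are placeholders, and real studies also align facial landmarks before averaging:]

import numpy as np
from PIL import Image

def composite(paths):
    """Return the pixel-wise average of equally sized grayscale images."""
    stack = np.stack([
        np.asarray(Image.open(p).convert("L"), dtype=np.float64)
        for p in paths
    ])
    return Image.fromarray(stack.mean(axis=0).astype(np.uint8))

if __name__ == "__main__":
    faces = [f"face_{i:02d}.png" for i in range(32)]   # hypothetical input files
    composite(faces).save("composite_32.png")          # the 32-face composite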

Work on isolated populations suggests that preferences for averageness appear to be universal. In addition, while isolated people prefer average faces from their own race, they do not show any preference for average faces of other races to which they are not exposed. This makes sense, since they should have no knowledge of what an average face of another race looks like. It also suggests that it is averageness itself that makes a face attractive, rather than some artifact of the averaging techniques.

via: Wikipedia |  Read more:
Image via: