Thursday, February 21, 2013

We Couldn't Stop Looking

Aperture magazine, which just celebrated its sixtieth anniversary, has published Aperture Magazine Anthology: The Minor White Years, 1952-1976, a marvelous volume devoted to Minor White, its founder and long-time editor, that collects the best of the critical writings on photography that appeared in the magazine in those years. Leafing through its pages and seeing the familiar cover photographs brought back many memories, since I worked for Aperture between 1967 and 1970 and played a small part in the production of eleven issues of the magazine and a few books. (...)

Though I was listed on the masthead of the magazine as “business manager,” I did everything that needed to be done around the office. I answered the rare phone calls, opened and answered mail from subscribers, swept the floor, paid the bills, delivered proofs to the printers, and was on call for various emergencies when an issue was in production, since there were no other full-time employees. These visits to the printers, engravers, and compositors—who in dimly-lit lofts with grimy skylights and windows in lower Manhattan and Brooklyn plied their trade using old-fashioned letterpresses with the skill to achieve the range and depth of tone in the black-and-white photographs we published—were especially memorable. One entered a world that had hardly changed in a century to be met by some gaunt, ghostly-pale old man who looked like a character out of Dickens, who would then be joined, as he squinted over the image I had brought, by two equally venerable fellow workers. They would study it a long time, hardly exchanging a word, until one of them would indicate an area of the photograph with his finger and another one of them would either shake his head or nod in agreement. (...)

In one of the older issues, Minor White had an essay called “What is Meant by ‘Reading’ Photographs” that made a big impression on me. He writes in it about hearing photographers often say that if they could write they would not take pictures. With me, I realized, it was the other way around. If I could take pictures, I would not write poems—or at least, this is what I thought every time I fell in love with some photograph in the office, in many cases with one that I had already seen, but somehow, to my surprise, failed to properly notice before. There is a wonderful moment when we realize that the picture we’ve been looking at for a long time has become a part of us as much as some childhood memory or some dream we once had. The attentive eye makes the world interesting. A good photograph, like a good poem, is a self-contained little universe inexhaustible to scrutiny.

by Charles Simic, NY Review of Books | Read more:
Photo: George Peet

Lily Furedi, Subway, 1934
via:

Jacques Henri Lartigue (1894 – 1986)

Max Dalton, Guitar Lessons
via:

Eat Bray Love


The corruption of Anthony Bourdain, the return of Emeril Lagasse, and the state of food television.

A decade ago Emeril Lagasse was omnipresent, sprinkling catchphrases and cayenne nightly in front of a live studio audience. A lumbering, rump roast of a man who cooked like Paul Prudhomme but talked like the Gorton's Fisherman, Emeril was the unlikely poster boy of the transformation of what had once quaintly been known as "cooking shows" into a more unwieldy behemoth: "food TV." Fueled by the insatiable advertising needs of the Food Network and a viewing public suddenly interested in distinguishing deglazing from deveining, the staid format established by Julia Child and Jacques Pepin was chucked into the garbage like spoiled milk. It was no longer enough to stand behind a stove top and instruct. The new goal was to entertain. Chefs were required to prep themselves right alongside their mise en place, to garnish their dishes not with parsley but with personality.

And so, beginning in 1997, Emeril applied essence and kicked things up to varying notches. He employed a soft jazz band and cooked with Pat Benatar. He made garlic an applause line and convinced untold millions of Americans to try their hand at something called Urky Lurky. The goal remained ostensibly the same, despite the extra volume: to make home cooking appear doable and fun. But the extra noise soon began to drown out the message. Emeril endorsed toothpaste and floor mats and allowed someone to talk him into starring in an NBC sitcom. Eventually, the demands of celebrity scraped Emeril's plate clean, and by the time Emeril Live's goose was finally cooked in 2007, he'd unwittingly set the table for an entire generation of cheesy blasters still to come.

With no way to serve us an actual steak, the Food Network rebranded itself in desperate search of sizzle. The hyperactive appetite of television — for youth, for spark, for drama more genetically modified than a tomato in December — is far more demanding than any mere gastronome. And so the TV part of the equation began to outweigh the food. Legit cooks like Mario Batali and Michael Chiarello also went out the door. The rise of the hubris-devouring succubus that is The Next Food Network Star — and the long-term cheap replacement labor it provided — meant that their expertise was expendable, easily sacrificed on the altar of accessibility. Cooking isn't all that difficult, but cooking well absolutely is. And so the second generation of Food Network shows focused on making everything as easy as humanly possible, an interchangeable cavalcade of shortcuts and time-savers and "healthy alternatives," an endless slate of chipper idiots demonstrating idiotproof ways to successfully make sandwiches. The rest of the schedule was given over to a series of increasingly ludicrous competitions: The honorable Japanese Iron Chef begat a tarnished American version. Cupcake Battles escalated into Halloween Wars. Newer shows promised to reveal — and humiliate — the Worst Cooks in America. There's the even more execrable Rachael vs. Guy Celebrity Cook-Off, which revels in subhuman incompetence.

The schizophrenic network seemed committed to the idea of separating its viewership into either cartoony warriors or overmatched civilians, presenting the kitchen as either a battleground or a ticking time bomb. Food itself was either impossibly out of reach or beside the point, like fat floating on the surface of a broken sauce.

Fermenting just beneath Emeril's rise, the sourdough to his bubbly yeast, was Anthony Bourdain. The acerbic former junkie turned professional raconteur talked more smack than he'd ever injected, particularly on the subject of chefs on TV. (Emeril, for example, was both an "Ewok" and a hack.) Bourdain was a proud and snarly outsider, a thoroughly undistinguished line cook lifer suddenly handed a bullhorn on the back of a surprise bestseller. The chip on his shoulder was the size of a Yukon Gold. But, first on the Food Network and then on the Travel Channel, Bourdain proved himself to be a peerless ambassador for the extremes of cooking, high and low.

He was never half the chef Emeril was — something he'd be the first to admit — but he was twice as good on camera. No Reservations, which recently ended a triumphant nine-year run, was consistently one of the best things on television, a gorgeously shot valentine to global food culture. Bourdain's snark was always as much of an affectation as the earring and cigarettes — both now mercifully discarded — and so I never found him off-putting. Rather, I found him brilliantly and persuasively respectful, making the case that eating a raw seal eyeball or a bowl full of deep-fried crickets isn't an isolated act of gross-out machismo but a way to connect with people and traditions that existed long before Cool Ranch Doritos Locos Tacos — and will hopefully survive long after that abomination is wiped from the earth.

At its foul-mouthed best, Tony Bourdain's shtick is absolutely empowering, but not in the faux-populist manner of a Sandra Lee or Guy Fieri. What's made his voice so important is his steadfast refusal to coddle anything but eggs. Unlike most food shows, the central message of No Reservations was actually, no, you can't do this; you can't cook it, you can't re-create it, you can't dumb it down. Bourdain was a knight-errant of good taste, a champion of expertise and authenticity. Real food experiences, he argued, whether at a sushi counter in Tokyo or a hot dog stand in Chicago, are worth seeking out. Appreciation is just as important as enthusiasm.

by Andy Greenwald, Grantland |  Read more:
Photo: ABC

The Robot Will See You Now


Harley Lukov didn’t need a miracle. He just needed the right diagnosis. Lukov, a 62-year-old from central New Jersey, had stopped smoking 10 years earlier—fulfilling a promise he’d made to his daughter, after she gave birth to his first grandchild. But decades of cigarettes had taken their toll. Lukov had adenocarcinoma, a common cancer of the lung, and it had spread to his liver. The oncologist ordered a biopsy, testing a surgically removed sample of the tumor to search for particular “driver” mutations. A driver mutation is a specific genetic defect that causes cells to reproduce uncontrollably, interfering with bodily functions and devouring organs. Think of an on/off switch stuck in the “on” direction. With lung cancer, doctors typically test for mutations called EGFR and ALK, in part because those two respond well to specially targeted treatments. But the tests are a long shot: although EGFR and ALK are the two driver mutations doctors typically see with lung cancer, even they are relatively uncommon. When Lukov’s cancer tested negative for both, the oncologist prepared to start a standard chemotherapy regimen—even though it meant the side effects would be worse and the prospects of success slimmer than might be expected using a targeted agent.

But Lukov’s true medical condition wasn’t quite so grim. The tumor did have a driver—a third mutation few oncologists test for in this type of case. It’s called KRAS. Researchers have known about KRAS for a long time, but only recently have they realized that it can be the driver mutation in metastatic lung cancer—and that, in those cases, it responds to the same drugs that turn it off in other tumors. A doctor familiar with both Lukov’s specific medical history and the very latest research might know to make the connection—to add one more biomarker test, for KRAS, and then to find a clinical trial testing the efficacy of KRAS treatments on lung cancer. But the national treatment guidelines for lung cancer don’t recommend such action, and few physicians, however conscientious, would think to do these things.

Did Lukov ultimately get the right treatment? Did his oncologist make the connection between KRAS and his condition, and order the test? He might have, if Lukov were a real patient and the oncologist were a real doctor. They’re not. They are fictional composites developed by researchers at the Memorial Sloan-Kettering Cancer Center in New York, in order to help train—and demonstrate the skills of—IBM’s Watson supercomputer. Yes, this is the same Watson that famously went on Jeopardy and beat two previous human champions. But IBM didn’t build Watson to win game shows. The company is developing Watson to help professionals with complex decision making, like the kind that occurs in oncologists’ offices—and to point out clinical nuances that health professionals might miss on their own.

Information technology that helps doctors and patients make decisions has been around for a long time. Crude online tools like WebMD get millions of visitors a day. But Watson is a different beast. According to IBM, it can digest information and make recommendations much more quickly, and more intelligently, than perhaps any machine before it—processing up to 60 million pages of text per second, even when that text is in the form of plain old prose, or what scientists call “natural language.”

That’s no small thing, because something like 80 percent of all information is “unstructured.” In medicine, it consists of physician notes dictated into medical records, long-winded sentences published in academic journals, and raw numbers stored online by public-health departments. At least in theory, Watson can make sense of it all. It can sit in on patient examinations, silently listening. And over time, it can learn. Just as Watson got better at Jeopardy the longer it played, so it gets better at figuring out medical problems and ways of treating them the more it interacts with real cases. Watson even has the ability to convey doubt. When it makes diagnoses and recommends treatments, it usually issues a series of possibilities, each with its own level of confidence attached.
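[ed. The "series of possibilities, each with its own level of confidence" is easier to picture with a toy sketch. The Python below is an editorial illustration only: the Hypothesis class, the rank_hypotheses function, and every number in it are assumptions invented for this note, not IBM's actual Watson interface or real clinical output.]

# A minimal sketch of confidence-ranked output, as the article describes it.
# Nothing here is IBM code; names and values are made up for illustration.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    diagnosis: str      # candidate condition
    confidence: float   # the system's own likelihood estimate, 0.0 to 1.0

def rank_hypotheses(hypotheses, threshold=0.10):
    """Return candidates at or above a minimum confidence, strongest first."""
    viable = [h for h in hypotheses if h.confidence >= threshold]
    return sorted(viable, key=lambda h: h.confidence, reverse=True)

if __name__ == "__main__":
    # Illustrative values only; not drawn from any real case.
    candidates = [
        Hypothesis("EGFR-driven adenocarcinoma", 0.08),
        Hypothesis("ALK-driven adenocarcinoma", 0.05),
        Hypothesis("KRAS-driven adenocarcinoma", 0.62),
        Hypothesis("No identifiable driver mutation", 0.25),
    ]
    for h in rank_hypotheses(candidates):
        print(f"{h.diagnosis}: {h.confidence:.0%} confidence")

# Returning a ranked list rather than a single answer leaves the final call
# to the clinician; low-confidence candidates are surfaced, not silently
# discarded.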

Medicine has never before had a tool quite like this. And at an unofficial coming-out party in Las Vegas last year, during the annual meeting of the Healthcare Information and Management Systems Society, more than 1,000 professionals packed a large hotel conference hall, and an overflow room nearby, to hear a presentation by Marty Kohn, an emergency-room physician and a clinical leader of the IBM team training Watson for health care. Standing before a video screen that dwarfed his large frame, Kohn described in his husky voice how Watson could be a game changer—not just in highly specialized fields like oncology but also in primary care, given that all doctors can make mistakes that lead to costly, sometimes dangerous, treatment errors.

Drawing on his own clinical experience and on academic studies, Kohn explained that about one-third of these errors appear to be products of misdiagnosis, one cause of which is “anchoring bias”: human beings’ tendency to rely too heavily on a single piece of information. This happens all the time in doctors’ offices, clinics, and emergency rooms. A physician hears about two or three symptoms, seizes on a diagnosis consistent with those, and subconsciously discounts evidence that points to something else. Or a physician hits upon the right diagnosis, but fails to realize that it’s incomplete, and ends up treating just one condition when the patient is, in fact, suffering from several. Tools like Watson are less prone to those failings. As such, Kohn believes, they may eventually become as ubiquitous in doctors’ offices as the stethoscope.

“Watson fills in for some human limitations,” Kohn told me in an interview. “Studies show that humans are good at taking a relatively limited list of possibilities and using that list, but are far less adept at using huge volumes of information. That’s where Watson shines: taking a huge list of information and winnowing it down.”

by Jonathan Cohn, The Atlantic |  Read more:
Illustration: Bart Cooke

Bitter Pill: Why Medical Bills Are Killing Us


When Sean Recchi, a 42-year-old from Lancaster, Ohio, was told last March that he had non-Hodgkin’s lymphoma, his wife Stephanie knew she had to get him to MD Anderson Cancer Center in Houston. Stephanie’s father had been treated there 10 years earlier, and she and her family credited the doctors and nurses at MD Anderson with extending his life by at least eight years.

Because Stephanie and her husband had recently started their own small technology business, they were unable to buy comprehensive health insurance. For $469 a month, or about 20% of their income, they had been able to get only a policy that covered just $2,000 per day of any hospital costs. “We don’t take that kind of discount insurance,” said the woman at MD Anderson when Stephanie called to make an appointment for Sean.

Stephanie was then told by a billing clerk that the estimated cost of Sean’s visit — just to be examined for six days so a treatment plan could be devised — would be $48,900, due in advance. Stephanie got her mother to write her a check. “You do anything you can in a situation like that,” she says. The Recchis flew to Houston, leaving Stephanie’s mother to care for their two teenage children.

About a week later, Stephanie had to ask her mother for $35,000 more so Sean could begin the treatment the doctors had decided was urgent. His condition had worsened rapidly since he had arrived in Houston. He was “sweating and shaking with chills and pains,” Stephanie recalls. “He had a large mass in his chest that was … growing. He was panicked.”

Nonetheless, Sean was held for about 90 minutes in a reception area, she says, because the hospital could not confirm that the check had cleared. Sean was allowed to see the doctor only after he advanced MD Anderson $7,500 from his credit card. The hospital says there was nothing unusual about how Sean was kept waiting. According to MD Anderson communications manager Julie Penne, “Asking for advance payment for services is a common, if unfortunate, situation that confronts hospitals all over the United States.”

[Caption: Diagnosed with non-Hodgkin’s lymphoma at age 42. Total cost, in advance, for Sean’s treatment plan and initial doses of chemotherapy: $83,900. Charges for blood and lab tests amounted to more than $15,000; with Medicare, they would have cost a few hundred dollars.]

The total cost, in advance, for Sean to get his treatment plan and initial doses of chemotherapy was $83,900.

Why? (...)

When I asked MD Anderson to comment on the charges on Recchi’s bill, the cancer center released a written statement that said in part, “The issues related to health care finance are complex for patients, health care providers, payers and government entities alike … MD Anderson’s clinical billing and collection practices are similar to those of other major hospitals and academic medical centers.”

The hospital’s hard-nosed approach pays off. Although it is officially a nonprofit unit of the University of Texas, MD Anderson has revenue that exceeds the cost of the world-class care it provides by so much that its operating profit for the fiscal year 2010, the most recent annual report it filed with the U.S. Department of Health and Human Services, was $531 million. That’s a profit margin of 26% on revenue of $2.05 billion, an astounding result for such a service-intensive enterprise.

The president of MD Anderson is paid like someone running a prosperous business. Ronald DePinho’s total compensation last year was $1,845,000. That does not count outside earnings derived from a much publicized waiver he received from the university that, according to the Houston Chronicle, allows him to maintain unspecified “financial ties with his three principal pharmaceutical companies.”

DePinho’s salary is nearly triple the $674,350 paid to William Powers Jr., the president of the entire University of Texas system, of which MD Anderson is a part. This pay structure is emblematic of American medical economics and is reflected on campuses across the U.S., where the president of a hospital or hospital system associated with a university — whether it’s Texas, Stanford, Duke or Yale — is invariably paid much more than the person in charge of the university.  (...)

Recchi’s bill and six others examined line by line for this article offer a closeup window into what happens when powerless buyers — whether they are people like Recchi or big health-insurance companies — meet sellers in what is the ultimate seller’s market.

The result is a uniquely American gold rush for those who provide everything from wonder drugs to canes to high-tech implants to CT scans to hospital bill-coding and collection services. In hundreds of small and midsize cities across the country — from Stamford, Conn., to Marlton, N.J., to Oklahoma City — the American health care market has transformed tax-exempt “nonprofit” hospitals into the towns’ most profitable businesses and largest employers, often presided over by the regions’ most richly compensated executives. And in our largest cities, the system offers lavish paychecks even to midlevel hospital managers, like the 14 administrators at New York City’s Memorial Sloan-Kettering Cancer Center who are paid over $500,000 a year, including six who make over $1 million.

Taken as a whole, these powerful institutions and the bills they churn out dominate the nation’s economy and put demands on taxpayers to a degree unequaled anywhere else on earth. In the U.S., people spend almost 20% of the gross domestic product on health care, compared with about half that in most developed countries. Yet in every measurable way, the results our health care system produces are no better and often worse than the outcomes in those countries.

by Steven Brill, Time |  Read more:
Photo: uncredited

That’s Not A Droid, That’s My Girlfriend

Osamu Kozaki’s life in Tokyo is, by his own admission, often a lonely one. The 35-year-old, an engineer who designs industrial robots, has had few relationships with women in his life. Those few have almost always gone badly.

So when Kozaki’s girlfriend, Rinko Kobayakawa, sends him a message, his day brightens up. The relationship started more than three years ago, when Kobayakawa was a prickly 16-year-old working in her school library, a quiet girl who shut out the world with a pair of earphones that blasted punk music.

Kozaki sums up Kobayakawa’s personality with one word: tsundere – a popular term in Japan’s otaku geek culture, which describes a certain feminine ideal. It refers to the kind of girl who starts out hostile but whose heart gradually grows warmer. And that’s what has happened; over time, Kobayakawa has changed. These days, she spends much of her day sending affectionate missives to her boyfriend, inviting him on dates, or seeking his opinion when she wants to buy a new dress or try a new hairstyle.

But while Kozaki has aged, Kobayakawa has not. After three years, she’s still 16. She always will be. That’s because she is a simulation; Kobayakawa only exists inside a computer.

Kozaki’s girlfriend has never been born. She will never die. Technically, she has never lived. She may be deleted, but Kozaki would never let that happen.

Because he’s in love.

Kozaki is one of hundreds of thousands of Japanese who have bought Love Plus, a game released on the Nintendo DS in 2009, which is intended to simulate the experience of high-school romance with one of three pre-programmed teen girl characters. For a sizable number of loyal male gamers, it has become something more: a relationship that, if not entirely like dating a real woman, comes close as a source of affection.

“I really do love her,” Kozaki explains, when he and two of his friends meet with me in a coffee shop in Akihabara, the Tokyo neighbourhood at the centre of Japan’s otaku culture. Kozaki fully expects the game to be a lifelong commitment. “If someone were to ask me to stop, I don’t think I could do it,” he says.

Kozaki recounts what happened when an updated version of the game came out, which meant he had to move his saved data onto a new program. Kozaki couldn’t come at the idea of having two simultaneous versions of his virtual girlfriend in existence, so he asked a friend to delete the old saved data for him. It was — almost — as if he had arranged for someone to be murdered, he says. “I cried when he pushed that delete button,” he says, acknowledging that it sounds strange. “It was as if I crossed a border line from reality.”

by Aubrey Belford, The Global Mail |  Read more:
Photo: Aubrey Belford

Wednesday, February 20, 2013


Banksy (see also: Banksy Mural Mystery Deepens)

Social Work in the Tenderloin Will Kill Something Inside of You


The Tenderloin is widely acknowledged as the most hellish neighborhood in San Francisco. Out of the city's ten most violent crime plots, the Tenderloin is home to seven. Recent stats estimate the neighborhood has an average of three major crimes per hour, including one-third of the city’s drug offenses, with a yearly mean of two crimes per resident. The population is made up of more than 6,000 homeless people and contains one-fourth of the city’s HIV-positive drug users. Filthy sidewalks and vacant buildings peppered with single-occupancy hotel rooms provide a home to all levels of drugs and prostitution.

My friend Lorian has been employed as a social worker in the Tenderloin for several years now. Her tweets about it (things like: “today: 4 dead clients, 1 murdered provider, 1 client defecated in the lobby, 1 dead dog, & 1 facebook friend posted pictures of nachos.”) got me curious as to what her job is like. She was kind enough to answer some of my questions.


VICE: I imagine it varies greatly, but can you describe your average workday?

Lorian: The first thing is getting through the door at 9 AM. We usually have to step over clients or random strangers passed out on the benches from drinking and/or using since God knows when. The smell is the first thing that hits you—a stench of urine, feces, poor hygiene—it's really at its strongest in the morning, but you get used to it throughout the day. Then we check our voicemail. Twenty messages from the same two or three clients who either scream their financial requests over and over, simply sit there and breathe, or tell you that witches are under their beds waiting for the next blood sacrifice. Paranoid clients like to fixate on witches, Satan, etc. Anyway, we get ready to open and hand out checks to the clients who are either on daily budgets, or who make random check requests. The budgeted clients are the most low-functioning, as they can be restricted to as little as $7 per day in order to curb their harm reduction. They'll go and spend that $7 on whatever piece of crack they can find, and then two hours later they're back, begging for more money. Clients will find some really brilliant ways to beg. When we're not dealing with clients out in the lobby, which can involve anything from handing out checks to cleaning up blood to clearing the floor for folks having seizures, we're usually dealing with the government agency assholes over at Social Security. I personally work with around 200 clients, so the paperwork and filing can be extraordinary. My “average day” starts at 9 AM and lasts until 7 or 8 PM.

by Blake Butler, Vice |  Read more:
Images: uncredited

Sehriyar Cem, Untitled 92, 2011. Acrylic on canvas, 39.4 x 35.4 in
via:

P.G. Wodehouse
via:

The Extraordinary Science of Addictive Junk Food


On the evening of April 8, 1999, a long line of Town Cars and taxis pulled up to the Minneapolis headquarters of Pillsbury and discharged 11 men who controlled America’s largest food companies. Nestlé was in attendance, as were Kraft and Nabisco, General Mills and Procter & Gamble, Coca-Cola and Mars. Rivals any other day, the C.E.O.’s and company presidents had come together for a rare, private meeting. On the agenda was one item: the emerging obesity epidemic and how to deal with it. While the atmosphere was cordial, the men assembled were hardly friends. Their stature was defined by their skill in fighting one another for what they called “stomach share” — the amount of digestive space that any one company’s brand can grab from the competition.

James Behnke, a 55-year-old executive at Pillsbury, greeted the men as they arrived. He was anxious but also hopeful about the plan that he and a few other food-company executives had devised to engage the C.E.O.’s on America’s growing weight problem. “We were very concerned, and rightfully so, that obesity was becoming a major issue,” Behnke recalled. “People were starting to talk about sugar taxes, and there was a lot of pressure on food companies.” Getting the company chiefs in the same room to talk about anything, much less a sensitive issue like this, was a tricky business, so Behnke and his fellow organizers had scripted the meeting carefully, honing the message to its barest essentials. “C.E.O.’s in the food industry are typically not technical guys, and they’re uncomfortable going to meetings where technical people talk in technical terms about technical things,” Behnke said. “They don’t want to be embarrassed. They don’t want to make commitments. They want to maintain their aloofness and autonomy.”

A chemist by training with a doctoral degree in food science, Behnke became Pillsbury’s chief technical officer in 1979 and was instrumental in creating a long line of hit products, including microwaveable popcorn. He deeply admired Pillsbury but in recent years had grown troubled by pictures of obese children suffering from diabetes and the earliest signs of hypertension and heart disease. In the months leading up to the C.E.O. meeting, he was engaged in conversation with a group of food-science experts who were painting an increasingly grim picture of the public’s ability to cope with the industry’s formulations — from the body’s fragile controls on overeating to the hidden power of some processed foods to make people feel hungrier still. It was time, he and a handful of others felt, to warn the C.E.O.’s that their companies may have gone too far in creating and marketing products that posed the greatest health concerns.

The discussion took place in Pillsbury’s auditorium. The first speaker was a vice president of Kraft named Michael Mudd. “I very much appreciate this opportunity to talk to you about childhood obesity and the growing challenge it presents for us all,” Mudd began. “Let me say right at the start, this is not an easy subject. There are no easy answers — for what the public health community must do to bring this problem under control or for what the industry should do as others seek to hold it accountable for what has happened. But this much is clear: For those of us who’ve looked hard at this issue, whether they’re public health professionals or staff specialists in your own companies, we feel sure that the one thing we shouldn’t do is nothing.”

As he spoke, Mudd clicked through a deck of slides — 114 in all — projected on a large screen behind him. The figures were staggering. More than half of American adults were now considered overweight, with nearly one-quarter of the adult population — 40 million people — clinically defined as obese. Among children, the rates had more than doubled since 1980, and the number of kids considered obese had shot past 12 million. (This was still only 1999; the nation’s obesity rates would climb much higher.) Food manufacturers were now being blamed for the problem from all sides — academia, the Centers for Disease Control and Prevention, the American Heart Association and the American Cancer Society. The secretary of agriculture, over whom the industry had long held sway, had recently called obesity a “national epidemic.”

Mudd then did the unthinkable. He drew a connection to the last thing in the world the C.E.O.’s wanted linked to their products: cigarettes. First came a quote from a Yale University professor of psychology and public health, Kelly Brownell, who was an especially vocal proponent of the view that the processed-food industry should be seen as a public health menace: “As a culture, we’ve become upset by the tobacco companies advertising to children, but we sit idly by while the food companies do the very same thing. And we could make a claim that the toll taken on the public health by a poor diet rivals that taken by tobacco.”

“If anyone in the food industry ever doubted there was a slippery slope out there,” Mudd said, “I imagine they are beginning to experience a distinct sliding sensation right about now.” (...)

What happened next was not written down. But according to three participants, when Mudd stopped talking, the one C.E.O. whose recent exploits in the grocery store had awed the rest of the industry stood up to speak. His name was Stephen Sanger, and he was also the person — as head of General Mills — who had the most to lose when it came to dealing with obesity. Under his leadership, General Mills had overtaken not just the cereal aisle but other sections of the grocery store. The company’s Yoplait brand had transformed traditional unsweetened breakfast yogurt into a veritable dessert. It now had twice as much sugar per serving as General Mills’ marshmallow cereal Lucky Charms. And yet, because of yogurt’s well-tended image as a wholesome snack, sales of Yoplait were soaring, with annual revenue topping $500 million. Emboldened by the success, the company’s development wing pushed even harder, inventing a Yoplait variation that came in a squeezable tube — perfect for kids. They called it Go-Gurt and rolled it out nationally in the weeks before the C.E.O. meeting. (By year’s end, it would hit $100 million in sales.)

According to the sources I spoke with, Sanger began by reminding the group that consumers were “fickle.” (Sanger declined to be interviewed.) Sometimes they worried about sugar, other times fat. General Mills, he said, acted responsibly to both the public and shareholders by offering products to satisfy dieters and other concerned shoppers, from low sugar to added whole grains. But most often, he said, people bought what they liked, and they liked what tasted good. “Don’t talk to me about nutrition,” he reportedly said, taking on the voice of the typical consumer. “Talk to me about taste, and if this stuff tastes better, don’t run around trying to sell stuff that doesn’t taste good.”

To react to the critics, Sanger said, would jeopardize the sanctity of the recipes that had made his products so successful. General Mills would not pull back. He would push his people onward, and he urged his peers to do the same. Sanger’s response effectively ended the meeting.

“What can I say?” James Behnke told me years later. “It didn’t work. These guys weren’t as receptive as we thought they would be.” Behnke chose his words deliberately. He wanted to be fair. “Sanger was trying to say, ‘Look, we’re not going to screw around with the company jewels here and change the formulations because a bunch of guys in white coats are worried about obesity.’ ”

The meeting was remarkable, first, for the insider admissions of guilt. But I was also struck by how prescient the organizers of the sit-down had been. Today, one in three adults is considered clinically obese, along with one in five kids, and 24 million Americans are afflicted by type 2 diabetes, often caused by poor diet, with another 79 million people having pre-diabetes. Even gout, a painful form of arthritis once known as “the rich man’s disease” for its associations with gluttony, now afflicts eight million Americans.

The public and the food companies have known for decades now — or at the very least since this meeting — that sugary, salty, fatty foods are not good for us in the quantities that we consume them. So why are the diabetes and obesity and hypertension numbers still spiraling out of control? It’s not just a matter of poor willpower on the part of the consumer and a give-the-people-what-they-want attitude on the part of the food manufacturers. What I found, over four years of research and reporting, was a conscious effort — taking place in labs and marketing meetings and grocery-store aisles — to get people hooked on foods that are convenient and inexpensive. I talked to more than 300 people in or formerly employed by the processed-food industry, from scientists to marketers to C.E.O.’s. Some were willing whistle-blowers, while others spoke reluctantly when presented with some of the thousands of pages of secret memos that I obtained from inside the food industry’s operations. What follows is a series of small case studies of a handful of characters whose work then, and perspective now, sheds light on how the foods are created and sold to people who, while not powerless, are extremely vulnerable to the intensity of these companies’ industrial formulations and selling campaigns.

by Michael Moss, NY Times |  Read more:
Image: Grant Cornett

Tuesday, February 19, 2013

Zero-Days

Welcome to the Malware-Industrial Complex

Every summer, computer security experts get together in Las Vegas for Black Hat and DEFCON, conferences that have earned notoriety for presentations demonstrating critical security holes discovered in widely used software. But while the conferences continue to draw big crowds, regular attendees say the bugs unveiled haven’t been quite so dramatic in recent years.

One reason is that a freshly discovered weakness in a popular piece of software, known in the trade as a “zero-day” vulnerability because the software makers have had no time to develop a fix, can be cashed in for much more than a reputation boost and some free drinks at the bar. Information about such flaws can command prices in the hundreds of thousands of dollars from defense contractors, security agencies and governments.

This trade in zero-day exploits is poorly documented, but it is perhaps the most visible part of a new industry that in the years to come is likely to swallow growing portions of the U.S. national defense budget, reshape international relations, and perhaps make the Web less safe for everyone.

Zero-day exploits are valuable because they can be used to sneak software onto a computer system without detection by conventional computer security measures, such as antivirus packages or firewalls. Criminals might do that to intercept credit card numbers. An intelligence agency or military force might steal diplomatic communications or even shut down a power plant.

It became clear that this type of assault would define a new era in warfare in 2010, when security researchers discovered a piece of malicious software, or malware, known as Stuxnet. Now widely believed to have been a project of U.S. and Israeli intelligence (U.S. officials have yet to publicly acknowledge a role but have done so anonymously to the New York Times and NPR), Stuxnet was carefully designed to infect multiple systems needed to access and control industrial equipment used in Iran’s nuclear program. The payload was clearly the work of a group with access to government-scale resources and intelligence, but it was made possible by four zero-day exploits for Windows that allowed it to silently infect target computers. That so many precious zero-days were used at once was just one of Stuxnet’s many striking features.

Since then, more Stuxnet-like malware has been uncovered, and it’s involved even more complex techniques (see “The Antivirus Era Is Over”). It is likely that even more have been deployed but escaped public notice. Meanwhile, governments and companies in the United States and around the world have begun paying more and more for the exploits needed to make such weapons work, says Christopher Soghoian, a principal technologist at the American Civil Liberties Union.

“On the one hand the government is freaking out about cyber-security, and on the other the U.S. is participating in a global market in vulnerabilities and pushing up the prices,” says Soghoian, who says he has spoken with people involved in the trade and that prices range from the thousands to the hundreds of thousands. Even civilian law-enforcement agencies pay for zero-days, Soghoian says, in order to sneak spy software onto suspects’ computers or mobile phones.

by Tom Simonite, MIT Technology Review |  Read more:
Image: Dan Page

Steve Javiel, Inspiration
via:

No Comments

[ed. Excerpted from a recent post on The Big Picture (one of the best financial blogs on the internet) titled "Why I Am Considering Getting Rid of Comments". Duck Soup gets hardly a fraction of the traffic TBP does, but Mr. Ritholtz's post goes a long way in explaining why I've been reluctant to implement a comments section here. If you're interested in this type of issue, he provides a couple of links that are well worth reading, including: How to Spot - and Defeat - Disruption on the Internet and COINTELPRO Techniques for Dilution, Misdirection and Control of an Internet Forum.]

Since I began this humble blog almost 11 years, 25,000 posts and 110 million page views ago, it has managed (despite my best efforts) to accumulate half a million comments.

This was never my intention.

I created this blog, in the words of Daniel Boorstin, to figure out what I think. It is where I gather my favorite charts, quotes, links and assorted ideas. The blog is simply a diary of random thoughts of a person working in finance. Think of it as the musings of an intelligent investor who, despite studying his subject for decades, still puzzles over many aspects of it.

Overall, the goal with this blog has been an attempt to discern the objective “Truth” (whatever that means) in an industry that does its best to hide that truth from public view. When I do uncover a small measure of truth, I enjoy sharing the discovery here. (...)

Managing blog comments has become an increasingly time-consuming job. Policing the spammers, trolls, haters, and other purveyors of falsehoods has become a larger time suck than I am willing to accept. Dealing with such cretins hardens your outlook and shortens your temper more than I care for. Perhaps this is the reason so many high-profile blogs have closed down their comments altogether.

Were I to shut down my comments, it would be for a reason I have not seen enumerated elsewhere: The intellectually disingenuous rhetorical sleight of hand that has become a substitute for legitimate debate. (See this and this). I simply do not have the time nor the interest in correcting every half-truth and lie. But I have even less interest in polluting the blog with this sort of nonsense.

Therein lies my quandary. A harsh solution beckons.

by Barry Ritholtz, The Big Picture |  Read more:

Karen Tusinski
via:

Michelle Blade, Day 359. 366 Days of the Apocalypse. 2012.
Acrylic ink on paper, 8 x 10”
via: