Saturday, November 9, 2013

Why the Truth About MSG is So Easy to Swallow


In 1908, over a bowl of seaweed soup, Japanese scientist Kikunae Ikeda asked a question that would change the food industry forever: what gave dashi, a ubiquitous Japanese soup base, its meaty flavor? In Japanese cuisine, dashi, a fermented base made from boiled seaweed and dried fish, was widely used by chefs to add extra oomph to meals–pairing well with other savory but meatless foods like vegetables and soy. For some reason that was generally accepted but inexplicable, dashi made these meatless foods meaty–and Ikeda was determined to find out why.

Ikeda was able to isolate the main substance of dashi–the seaweed Laminaria japonica. He then took the seaweed and ran it through a series of chemical experiments, using evaporation to isolate a specific compound within the seaweed. After days of evaporating and treating the seaweed, he saw the development of a crystalline form. When he tasted the crystals, he recognized the distinct savory taste that dashi lent to other foods, a taste that he deemed umami, from the Japanese umai (delicious). It was a breakthrough that challenged a cornerstone of culinary thinking: instead of four tastes—sweet, salty, bitter and sour—there were now five. A new frontier of taste had been discovered, and Ikeda wasted no time capitalizing on his discovery.

He determined the molecular formula of the crystals: C5H9NO4, the same as glutamic acid, an amino acid designated as non-essential because the human body, as well as a large smattering of other plants and animals, is able to produce it on its own. In the body, glutamic acid is often found as glutamate, a different compound that has one less hydrogen atom. Glutamate is one of the most abundant excitatory neurotransmitters in the brain, playing a crucial role in memory and learning. The FDA estimates that the average adult consumes 13 grams of it a day from the protein in food. Non-meat food sources like tomatoes and Parmesan cheese have high levels of glutamic acid.

In 1909, Ikeda began mass-producing Ajinomoto (meaning “essence of taste”), an additive that came out of his creation of the first method of industrially producing glutamate by way of fermented vegetable proteins. The resulting sodium salt form of glutamic acid (the acid with just a single sodium ion attached) became famous for its ability to imbue a meaty flavor into dishes, or just naturally enhance the flavor of food. It was touted as a nutritional wonder, helping bland but nutritious food become delicious. A growing number of Japanese housewives used the product, and by the 1930s, recipes included Ajinomoto in their directions. The sodium salt of glutamic acid remains prevalent today–anyone who has eaten KFC or Doritos has ingested it; it’s just known by a different name: monosodium glutamate, or MSG.

Few sets of letters have more power to stop a conversation in its tracks than MSG, one of the most infamous additives in the food industry. The three little letters carry so much negative weight that they’re often whispered sheepishly or, more often, decidedly preceded by the modifier “NO,” which seems to make everyone breathe a collective sigh of relief when they go out to eat. Nobody wants MSG in their food—the protest goes—it causes headaches, stomachaches, dizziness and general malaise. It’s unhealthy and, maybe even worse, unsexy, used by lazy chefs as an excuse for flavor, not an enhancement.

On the other side of the spectrum lies umami: few foodie buzzwords pop off the lips with such entertaining ease. Enterprising young chefs like David Chang (of Momofuku fame) and Adam Fleischman, of the LA-based chain Umami Burger, have built their culinary careers on the basis of the fifth taste, revitalizing interest in the meaty depth of umami. It’s difficult to watch the Food Network or Travel Channel or any food-based program without hearing mention of the taste wunderkind, a host or chef cooing over the deep umami flavors of a Portobello mushroom. Where MSG is scary, umami is exciting.

What few people understand is that the hated MSG and the adored umami are chemically related: umami is tasted by the very receptors that MSG targets. At a MAD Symposium in Denmark, a TED-like conference for the food industry, Chang spoke about MSG and umami: “For me, the way that I’m looking at umami, it’s the same way I look at MSG. It’s one in the same.” But if chefs like Chang (neither inept nor lazy when it comes to flavor, as his Michelin stars attest) are down with MSG, why does the additive retain such a bad reputation?

by Natasha Geiling, Smithsonian | Read more:
Image: Wikipedia

Friday, November 8, 2013


CL4v.4 by studio Judith
via:

America the Possible

When it comes to social conditions, it’s important to recognize that nearly 50 million Americans now live in poverty—one in six. If you’re in poverty in America, you’re living on less than $400 per week for a family of four. Poverty is the bleeding edge of a more pervasive American shortcoming—massive economic insecurity. About half of American families now live paycheck to paycheck, are financially fragile, and earn less than needed to cover basic living expenses, let alone save for the future.

Back in 1928, right before the Great Depression, the richest 1 percent of Americans received 24 percent of the country’s total income. Starting with the New Deal, public policy favored greater equality and a strong middle class, so that by 1976, the share of the richest 1 percent of households had dropped to 9 percent. But then the great re-redistribution began in the 1980s, so that by 2007, right before the Great Recession, the richest 1 percent had regained its 1928 position—with 24 percent of income.

As for national security, the U.S. now spends almost as much on the military as the rest of the world combined. If one totals military and other U.S. security spending, the total easily climbs to over $1 trillion annually, about two-thirds of all discretionary federal spending. In what has been called a key feature of the American Empire, America now garrisons the world. Although the Pentagon officially reports that we maintain a mere 660 military bases in 38 countries, if one adds the unreported bases in Afghanistan, Saudi Arabia, and elsewhere, there are likely as many as 1,000 U.S. military sites around the world. By 2010, we had covert operations deployed in an estimated 40 percent of the world’s 192 nations. On the home front, in 2010, the Washington Post reported that the top-secret world the government created in response to 9/11 now contains some 1,300 government entities and 1,900 private companies all working on programs related to counterterrorism, homeland security, and intelligence in some 10,000 locations across the United States.

When you’ve got an armful of hammers, every problem looks like a nail, and the U.S. has tended to seek military solutions to problems that might be addressed otherwise. The costs have been phenomenally high. All told, our wars since 9/11 will cost us over $4 trillion and more than 8,000 American lives, with another 99,000 U.S. troops already wounded in action or evacuated for serious illness.

Another sorrow is the huge, draining psychological burden that U.S. actions place on its citizens. We see our own military, the CIA, and U.S. contractors engaged in torture and prisoner abuse, large-scale killings of innocent civilians, murders and the taking of body parts as souvenirs, renditions, drone assassinations, military detention without trial, collaboration with unsavory regimes, and more.

Meanwhile, outside our borders, a world of wounds has festered without much help, and often with harm, from the United States. We are neglecting so many problems—from world poverty, underdevelopment, and climate change to emerging shortages of food and water and energy, biological impoverishment, and transnational organized crime.

The following are among the many treaties ratified by all nations, except for a few rogue states—and the United States: the Convention on the Rights of the Child, the Convention on the Elimination of All Forms of Discrimination Against Women, the Land Mine Convention, the International Criminal Court convention, the Biodiversity Convention, the Law of the Sea, the Kyoto Protocol of the Climate Convention, and the Convention on Persistent Organic Pollutants. The U.S. is the main reason we do not now have a World Environment Organization.

In these respects and in many others, the U.S. posture in the world reflects a radical imbalance: a hugely disproportionate focus on the military and on economic issues and a tragic neglect of some of the most serious challenges we and the world now confront.

These many challenges require farsighted, strong, and effective government leadership and action. Inevitably, then, the path to responding to these challenges leads to the political arena, where a vital, muscular democracy steered by an informed and engaged citizenry is needed. That’s the democracy we need, but, unfortunately, it is not the democracy we have. Right now, Washington isn’t even trying to seriously address most of these challenges. Neglect, stalemate, and denial rule the day. It is estimated that American politics is more polarized today than at any time since Reconstruction. Polarization, of course, is father to gridlock. Gridlock and stalemate are the last thing our country needs now.

The American political system is in deep trouble for another reason—it is moving from democracy to plutocracy and corporatocracy, supported by the ascendancy of market fundamentalism and a strident antiregulation, antigovernment, antitax ideology. The hard truth is that our political system today is simply incapable of meeting the great challenges described here. What we have is third-rate governance at a time when the challenges we face require first-rate governance.

America thus confronts a daunting array of challenges in the maintenance of our people’s well-being, in the conduct of our international affairs, in the management of our planet’s natural assets, and in the workings of our politics. Taken together, these challenges place in grave peril much that we hold dear.

The America we must seek for our children and grandchildren is not the America we have today. If we are going to change things for the better, we must first understand the forces that led us to this sea of troubles. When big problems emerge across the entire spectrum of national life, it cannot be due to small reasons. We have encompassing problems because of fundamental flaws in our economic and political system. By understanding these flaws, we can end them and move forward in a very different direction.

I think America got off course for two primary reasons. In recent decades we failed to build consistently on the foundations laid by the New Deal, by Franklin Roosevelt’s Four Freedoms and his Second Bill of Rights, and Eleanor Roosevelt’s Universal Declaration of Human Rights. Instead, we unleashed a virulent, fast-growing strain of corporate-consumerist capitalism. “Ours is the Ruthless Economy,” say Paul Samuelson and William Nordhaus in their influential textbook, Macroeconomics. And indeed it is. In its ruthlessness at home and abroad, it creates a world of wounds. As it strengthens and grows, those wounds deepen and multiply.

Such an economy begs for restraint and guidance in the public interest—control that can only be provided by government. Yet, at this point, the captains of our economic life and those who have benefited disproportionately from it have largely taken over our political life. Corporations, long identified as our principal economic actors, are now also our principal political actors. The result is a combined economic and political system—the operating system upon which our society runs—of great power and voraciousness, pursuing its own economic interests without serious concern for the values of fairness, justice, or sustainability that democratic government might have provided.

Our political economy has evolved and gathered force in parallel with the course of the Cold War and the growth of the American Security State. The Cold War and the rise of the American Empire have powerfully affected the nature of the political-economic system—strengthening the already existing prioritization of economic growth, giving rise to the military-industrial complex, and draining time, attention, and money away from domestic needs and emerging international challenges. This diversion of attention and resources continues with our response to international terrorism.

So what are this operating system’s key features, which have been given such free rein by these developments? First, ours is an economy that prioritizes economic growth above all else. We think of growth as an unalloyed good, but this growth fetish is a big source of our problems. We’ve had plenty of growth in recent decades—growth while wages stagnated, jobs fled our borders, life satisfaction flat-lined, social capital eroded, poverty and inequality mounted, and the environment declined. Today, U.S. GDP has regained its prerecession level, but 15 percent of American workers still can’t find full-time jobs.

Another key feature of today’s dysfunctional operating system is how powerfully the profit motive affects corporate behavior. Today’s corporations have been called “externalizing machines,” so committed are they to keeping the real costs of their activities off their books. Profit can be increased by keeping wages low and real social, environmental, and economic costs externalized—borne by society at large and not by the firm. One can get some measure of these external costs from a recent analysis of three thousand of the world’s biggest companies. It concluded that paying for their external environmental costs would erase at least a third of their profits. Profits can also be increased through subsidies, tax breaks, regulatory loopholes, and other gifts from government. Together, these external costs and subsidies lead to dishonest prices, which in turn lead consumers to spur on businesses that do serious damage to people and planet.

Given such emphasis on inexorable growth and profit, the constant spread of the market into new areas can be very costly environmentally and socially. As Karl Polanyi described in his 1944 book, The Great Transformation, “To allow the market mechanism to be sole director of the fate of human beings and their natural environment . . . would result in the demolition of society. . . . Nature would be reduced to its elements, neighborhoods and landscapes defiled, rivers polluted, military safety jeopardized, the power to produce food and raw materials destroyed.” With its emphasis on privatization, commercialization, and commodification, American capitalism has carried this demolition forward with a vengeance.

But the system that drives the capitalism we have today includes other elements.

by James Gustave Speth, Orion |  Read more:
Image: Gary Waters

Paulina


[ed. Not many 10 yr. olds have a set up like this but she seems to make good use of it.] 

Are Computers Making Society More Unequal?


Ever since inequality began rising in the U.S., in the nineteen-seventies, people have debated its causes. Some argue that rising inequality is mainly the result of specific policy choices—cuts to education, say, or tax breaks for the wealthy; others argue that it’s an expression of larger, structural forces. For the last few years, Tyler Cowen, an economist at George Mason University and a widely read blogger, has been one of the most important voices on the latter side. In 2011, in an influential book called “The Great Stagnation,” Cowen argued that the American economy had exhausted the “low-hanging fruit”—cheap land, new technology, and high marginal returns on education—that had powered its earlier growth; the real story wasn’t inequality per se, but rather a general and inevitable economic slowdown from which only a few sectors of the economy were exempt. It was not a comforting story.

“Average Is Over,” Cowen’s new book, is a sequel to, and elaboration upon, “The Great Stagnation.” In many ways, it’s even less comforting. It’s not just, Cowen writes, that the old economy, built on factory work and mid-level office jobs, has stagnated. It’s that the nature of work itself is changing, largely because of the increasing power of intelligent machines. Smart software, Cowen argues, is transforming almost everything about work, and ushering in an era of “hyper-meritocracy.” It makes workers redundant, by doing their work for them. It makes work more unforgiving, by tracking our mistakes. And it creates an entirely new class of workers: people who know how to manage and interpret computer systems, and whose work, instead of competing with the software, augments and extends it. Over the next several decades, Cowen predicts, wages for that new class of workers will grow rapidly, while the rest will be left behind. Inequality will be here to stay, and that will affect not only how we work, but where and how we live.

If we want a preview of work in the twenty-twenties and twenty-thirties, Cowen writes, we should look to the areas where computer intelligence is already making a big difference: areas like dating, medicine, and even chess. This interview with Cowen has been edited and condensed from two conversations.

In “Average Is Over,” you argue that inequality will grow in the U.S. for the next several decades. Why?

There are three main reasons inequality is here to stay, and will likely grow. The first is just measurement of worker value. We’re doing a lot to measure what workers are contributing to businesses, and, when you do that, very often you end up paying some people less and other people more. The second is automation—especially in terms of smart software. Today’s workplaces are often more complicated than, say, a factory for General Motors was in 1962. They require higher skills. People who have those skills are very often doing extremely well, but a lot of people don’t have them, and that increases inequality. And the third point is globalization. There’s a lot more unskilled labor in the world, and that creates downward pressure on unskilled labor in the United States. On the global level, inequality is down dramatically—we shouldn’t forget that. But within each country, or almost every country, inequality is up.

You think that intelligent software, especially, will make the labor market more unequal. Why is that the case?

Because of the cognitive requirements of working with smart software. And it’s also about training. There’s a big digital divide in this country.

One of the most interesting sections of the book is about “freestyle” chess competitions, in which humans and computers play on teams together—often the computers make the moves, but sometimes the humans intervene. How has chess software changed the “labor market” in chess players?

When humans team up with computers to play chess, the humans who do best are not necessarily the strongest players. They’re the ones who are modest, and who know when to listen to the computer. Often, what the human adds is knowledge of when the computer needs to look more deeply. If you’re a really good freestyle player, you consult a bunch of different programs, which have different properties, and you analyze the game position on all of them. You try to spot, very quickly, where the programs disagree, and you tell them to look more deeply there. They may disagree along a number of lines, and then you have to make some judgments. That’s hard—but the good humans do that better than computers do. Even very strong computers don’t have that meta-rational sense of when things are ambiguous. Today, the human-plus-machine teams are better than machines by themselves. It shows how there may always be room for a human element.
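[ed. The freestyle workflow Cowen describes boils down to a simple heuristic: get cheap, shallow evaluations from several engines and spend the expensive deep analysis wherever they disagree. The Python sketch below is only an illustration of that idea—the "engines" are stand-in functions with invented scores, not real chess programs.]

```python
# Illustrative sketch: gather shallow evaluations from several engines, flag the
# positions where they disagree most, and direct deep analysis there. The
# "engines" are stand-in callables returning a score in pawns; a real setup
# would query actual chess programs.

from statistics import pstdev

def disagreement(engines, position, depth):
    """Spread of the engines' scores for one position at a shallow search depth."""
    scores = [engine(position, depth) for engine in engines]
    return pstdev(scores), scores

def positions_needing_deeper_search(engines, positions, shallow_depth=8, threshold=0.5):
    """Rank candidate positions by how much the engines disagree about them."""
    flagged = []
    for pos in positions:
        spread, scores = disagreement(engines, pos, shallow_depth)
        if spread > threshold:                 # engines disagree: worth a deeper look
            flagged.append((spread, pos, scores))
    return sorted(flagged, reverse=True)

# Toy stand-ins: three "engines" reading from a table of pretend shallow evaluations.
cached_evals = {
    "quiet middlegame": [0.20, 0.30, 0.25],    # rough agreement
    "sharp sacrifice":  [1.50, -0.80, 0.10],   # sharp disagreement
}
engines = [lambda pos, depth, i=i: cached_evals[pos][i] for i in range(3)]

for spread, pos, scores in positions_needing_deeper_search(engines, cached_evals):
    print(f"{pos}: scores {scores} (spread {spread:.2f}) -> allocate deep search here")
```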

You believe that, in the future, the most well-compensated workers will be something like freestyle chess players.

Think in terms of this future middle-class job: You read medical scans, and you work alongside a computer. The computer does most of the judging, but there are some special or unusual scans where you say, “Hmm, that’s not quite right—I need a doctor to look at this again and study it more carefully.” You’ll need to know something about medicine, but it won’t be the same as being a doctor. You’ll need to know something about how these programs work, but it won’t be the same as being a programmer. You’ll need to be really good at judging, and being dispassionate, and you’ll have to have a sense of what computers can and cannot do. It’s about working with the machine: knowing when to hold back, when to intervene.

Or take business negotiations. In the early stages of negotiation software, on your smartphone, there may be programs that listen to the pitch of a voice, or that test for stress. You’ll just ask the program, “Was he lying? Was he eager to do business with me?” Maybe the computer will be right sixty per cent of the time. That’s useful information, but it’s still going to be wrong a lot. And in a given negotiation, you’ll be reading off many programs, and you’ll have to decide which of those programs is more relevant.

by Joshua Rothman, New Yorker |  Read more:
Image: Simone Casetta/Anzenberger/Redux

Can't Buy Me Love: How Romance Wrecked Traditional Marriage

Despite the fondness among certain politicians and pundits for “traditional marriage,” a nostalgic-sounding concept that conjures a soft-focus Polaroid of grandma and grandpa, few consider the actual roots of our marital traditions, when matrimony was little more than a business deal among unequals. Even today, legal marriage isn’t measured by the affection between two people, but by the ability of a couple to share Social Security and tax benefits. In reality, it’s the idea of marrying for love that’s untraditional.

For most of recorded human history, marriage was an arrangement designed to maximize financial stability. Elizabeth Abbott, the author of “A History of Marriage,” explains that in ancient times, marriage was intended to unite various parts of a community, cementing beneficial economic relationships. “Because it was a financial arrangement, it was conceived of and operated as such. It was a contract between families. For example, let’s say I’m a printer and you make paper, we might want a marriage between our children because that will improve our businesses.” Even the honeymoon, often called the “bridal tour,” was a communal affair, with parents, siblings, and other close relatives traveling together to reinforce their new familial relationships. (...)

Though the murky concept known as “love” has been recorded for all of human history, it was almost never a justification for marriage. “Love was considered a reason not to get married,” says Abbott. “It was seen as lust, as something that would dissipate. You could have love or lust for your mistress, if you’re a man, but if you’re a woman, you had to suppress it. It was condemned as a factor in marriage.”

In fact, for thousands of years, love was mostly seen as a hindrance to marriage, something that would inevitably cause problems. “Most societies have had romantic love, this combination of sexual passion, infatuation, and the romanticization of the partner,” says historian Stephanie Coontz. “But very often, those things were seen as inappropriate when attached to marriage. The southern French aristocracy believed that true romantic love was only possible in an adulterous relationship, because marriage was a political, economic, and mercenary event. True love could only exist without it.”

By the 19th century, the friction between love and money had come to a head. As the Western world advanced towards a more modern, industrialized society built on wage labor, emotional bonds became more private, focused more on immediate family and friends than communal celebrations. Simultaneously, mass media helped make sentimental inclinations a larger part of popular culture, with the flourishing of holidays like Valentine’s Day and nostalgic hobbies like scrapbooking.

Culturally speaking, love was in the air, and the union of Queen Victoria and Prince Albert in 1840 only served to seal the deal. Though Victoria and Albert’s marriage was sanctioned by their royal families, it was also hailed as a true “love match,” cementing the new ideal of romantic partnership. Their nuptials also coincided with the proliferation of early print media, making the event visible to readers all across Europe and North America.

“With Victoria’s wedding, you had endless reporting and tons of illustrations,” Abbott says. “Between two and four weeks after Victoria was married, magazines reproduced every last aspect of her wedding. Queen Victoria chose orange blossoms for her wreath, and an elaborate, white dress with this ridiculously long train in the back, and every detail was sent across the ocean and read voraciously by women in ladies’ magazines. Her wedding became the model because everyone knew about it.” To this day, many stereotypical elements of American weddings are still drawn from Victoria’s, particularly the tradition of wearing a white dress.

by Hunter Oatman-Stanford, Collectors Weekly |  Read more:
Image: uncredited

Thursday, November 7, 2013


Jonas Wood,  Shio Shrine, 2010
via:

Guy Harkness, Whangapoua Bach
via: flickr

The Strange Mystery of the Cancer Anomaly


Here’s an interesting puzzle. The common conception is that your risk of cancer starts off small in early life and increases as you get older. So children and young people are less likely to develop the disease than somebody who is middle-aged, who in turn is less likely to develop cancer than a centenarian.

Not so. Epidemiologists have long known that the incidence of most cancers increases until it reaches a maximum at a certain age and then drops dramatically as people get older. The anomaly is well studied, found all over the world and true for many different types of cancer (see diagrams above).

That raises an important question: how come? After all, cancer is thought to start when a series of genetic alterations accumulate in a cell. These changes disrupt ordinary cell functions such as DNA repair, and the malfunctions eventually trigger cancerous growth. Clearly, if these changes accumulate over time, the risk of developing the disease must also increase over time.

And therein lies the puzzle, one that has stumped oncologists and epidemiologists alike for years. What could possibly explain the discrepancy between the data and this entirely reasonable model of cancer development?

Today, we get an answer thanks to the work of James Brody, a biomedical engineer at the University of California, Irvine. He shows that there is no discrepancy between the data and the model of cancer development, provided one additional assumption holds: that the population can be divided into two groups—one that is susceptible to cancer and a much larger one that is not.
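[ed. A rough way to see why that assumption produces a rise-and-fall curve: if only a small susceptible group can ever develop cancer, then by old age most susceptible people have already been diagnosed, and the cancer-free survivors are increasingly drawn from the non-susceptible group—so the measured rate drops. The simulation below illustrates the idea with invented numbers; it is not Brody's model or data.]

```python
# A toy simulation (invented numbers, not Brody's model or data): only a small
# susceptible subpopulation can develop cancer, and within it the yearly hazard
# rises steeply with age. Incidence measured over everyone still cancer-free
# then rises and later falls, because old-age survivors are mostly non-susceptible.

import random

random.seed(0)

N = 200_000                   # simulated people
SUSCEPTIBLE_FRACTION = 0.1    # assumed size of the susceptible group
MAX_AGE = 100

def yearly_hazard(age):
    """Assumed per-year cancer probability for a susceptible person (toy numbers)."""
    return min(1.0, 1e-8 * age ** 4)

ages_at_diagnosis = []        # one entry per person: age at diagnosis, or None
for _ in range(N):
    if random.random() < SUSCEPTIBLE_FRACTION:
        for age in range(MAX_AGE):
            if random.random() < yearly_hazard(age):
                ages_at_diagnosis.append(age)
                break
        else:
            ages_at_diagnosis.append(None)    # susceptible but never diagnosed
    else:
        ages_at_diagnosis.append(None)        # not susceptible

# Incidence per decade = new diagnoses / people still cancer-free at the decade's start.
for start in range(0, MAX_AGE, 10):
    at_risk = sum(1 for a in ages_at_diagnosis if a is None or a >= start)
    cases = sum(1 for a in ages_at_diagnosis if a is not None and start <= a < start + 10)
    print(f"ages {start:3d}-{start + 9}: {1000 * cases / at_risk:6.1f} cases per 1,000 at risk")
```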

by Physics arXiv Blog |  Read more:
Image: uncredited

The Hidden Technology That Makes Twitter Huge

Consider the tweet. It’s short—140 characters and done—but hardly simple. If you open one up and look inside, you’ll see a remarkable clockwork, with 31 publicly documented data fields. Why do these tweets, typically born of a stray impulse, need to carry all this data with them?

While a tweet thrives in its timeline, among the other tweets, it’s also designed to stand on its own, forever. Any tweet might show up embedded inside a million different websites. It may be called up and re-displayed years after posting. For all their supposed ephemerality, tweets have real staying power.

Once born, they’re alone and must find their own way to the world, like a just-hatched sea turtle crawling to the surf. Luckily they have all of the information they need in order to make it: A tweet knows the identity of its creator, whether bot or human, as well as the location from which it originated, the date and time it went out, and dozens of other little things—so that wherever it finds itself, the tweet can be reconstituted. Millennia from now an intelligence coming across a single tweet could, like an archaeologist pondering a chunk of ancient skull, deduce an entire culture.
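[ed. For the curious, here is roughly what a single tweet's public data looks like. The field names below follow Twitter's publicly documented tweet object of the era; the values are made up for illustration.]

```python
# A rough sketch of one tweet's public data (field names follow Twitter's
# documented v1.1 tweet object; the values here are invented). Everything
# needed to re-display the tweet travels with it.
example_tweet = {
    "id_str": "123456789012345678",                  # permanent identifier
    "created_at": "Thu Nov 07 14:30:00 +0000 2013",  # date and time it went out
    "text": "Consider the tweet.",
    "source": "web",                                 # the client that posted it
    "lang": "en",
    "user": {                                        # the creator, embedded in full
        "id_str": "12345",
        "screen_name": "example",
        "name": "Example Person",
        "location": "San Francisco, CA",
    },
    "coordinates": None,                             # opt-in geotag, if provided
    "retweet_count": 0,
    "favorite_count": 0,
    "entities": {"hashtags": [], "urls": [], "user_mentions": []},
}

# Because each tweet is self-describing, an embedded or archived copy can be
# rebuilt anywhere without consulting the original timeline.
print(f"@{example_tweet['user']['screen_name']} at {example_tweet['created_at']}:")
print(example_tweet["text"])
```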

Twitter’s Nov. 7 initial public offering marks the San Francisco-based company’s coming-out party, the moment when it graduates from its South of Market beginnings and takes its place as one of the Internet’s most valuable properties, without ever turning a profit. What’s perhaps most remarkable about Twitter’s rise is how little the service has evolved from the original core concept of the 140-character tweet—which is to say, not at all. It’s tempting to view tweeting as silly and trivial, and Twitter itself as overhyped and overvalued. But there’s some sophisticated, supple, and even revolutionary technology at work. Appreciating Twitter’s machinery is key to understanding how an idea so simple changed the way millions of people advertise their existences to the world.

by Paul Ford, Bloomberg Businessweek |  Read more:
Image: David Parkins