Saturday, July 21, 2012

The Genetics of Stupidity

What if we’ve been thinking about the genetics of intelligence from completely the wrong angle? Intelligence (as indexed by IQ or the general intelligence factor “g”) is clearly highly heritable in humans – people who are more genetically similar are also more similar in this factor. (Genetic variance has been estimated as explaining ~75% of variance in g, depending on age and other factors). There must therefore be genetic variants in the population that affect intelligence – so far, so good. But the search for such variants has, at its heart, an implicit assumption: that these variants affect intelligence in a fairly specific way – that they will occur in genes “for intelligence”.

An implication of that phrase is that mutations in those genes were positively selected for at some stage in humanity’s descent from our common ancestor with apes, on the basis of conferring increased intelligence. This seems a fairly reasonable leap to make – such genes must exist and, if variation in these genes in humanity’s evolution could affect intelligence, then maybe variation in those same genes can explain variation within the human species.

The problem with that logic is that we are talking about two very different types of variation. On the one hand, mutations that arose during human evolution that conferred increased intelligence (through whatever mechanism) will have been positively selected for and fixed in the population. How this happened is unknown of course, but one can imagine an iterative process, where initial small changes in, say, the timing of processes of brain development led to small increases in intelligence. Increased cognitive capabilities could have led in turn to the emergence of crude communication and culture, opening up what has been called the “cognitive niche” – creating an environment where further increases in intelligence became selectively more and more advantageous – a runaway process, where genetic changes bootstrap on cultural development in a way that reinforces their own adaptiveness.

That’s all nice, though admittedly speculative, but those mutations are the ones that we would expect to not vary in human populations – they would now be fixed. In particular, there is little reason to expect that there would exist new mutations in such genes, present in some but not all humans, which act to further increase intelligence. This is simply a matter of probabilities: the likelihood of a new mutation in some such gene changing its activity in a way that is advantageous is extremely low, compared to the likelihood of it either having no effect or being deleterious. There are simply many more ways of screwing something up than of improving it.

That is true for individual proteins and it is true at a higher level, for organismal traits that affect fitness (the genetic components of which have presumably been optimised by millions of years of evolution). Mutations are much more likely to cause a decrement in such traits than to improve them. So maybe we’re thinking about the genetics of g from the wrong perspective – maybe we should be looking for mutations that decrease intelligence from some Platonic ideal of a “wild-type” human. Thinking in this way – about “mutations that affect” a trait, rather than “genes for” the trait – changes our expectations about the type of variation that could be contributing to the heritability of the trait.

Mutations that lower intelligence could be quite non-specific, diverse and far more idiosyncratic. The idea of a finite, stable and discrete set of variants that specifically contribute to intelligence levels and that simply get shuffled around in human populations may be a fallacy. That view is supported by the fact that genome-wide association studies for common variants affecting intelligence have so far come up empty.

Various researchers have suggested that g may be simply an index of a general fitness factor – an indirect measure of the mutational load of an organism. The idea is that, while we all carry hundreds of deleterious mutations, some of us carry more than others, or ones with more severe effects. These effects in combination can degrade the biological systems of development and physiology in a general way, rendering them less robust and less able to generate our Platonic, ideal phenotype. In this model, it is not the idea that specific mutations have specific effects on specific traits that matters so much – it is that the overall load cumulatively reduces fitness through effects at the systems level. This means that the mutations affecting intelligence in one person may be totally different from those affecting it in another – there will be no genes “for intelligence”.

Direct evidence for this kind of effect of mutational load was found recently in a study by Ronald Yeo and colleagues, showing that the overall burden of rare copy number variants (deletions or duplications of segments of chromosomes) negatively predicts intelligence (r = -0.3).
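To make the mutational-load idea concrete, here is a minimal simulation sketch in Python (not from Mitchell's post; the variant count, carrier frequency, effect sizes and noise level are all invented purely for illustration). Each simulated person carries an idiosyncratic handful of rare deleterious variants drawn from a large pool, and the trait is a "wild-type" baseline minus the summed burden. No single variant acts as a gene "for" the trait, yet the total number of mutations a person carries predicts the trait negatively, in the spirit of the burden effect described above.

    import numpy as np

    rng = np.random.default_rng(0)

    n_people = 5000      # simulated individuals
    n_variants = 2000    # pool of possible rare deleterious variants (illustrative)
    carrier_freq = 0.01  # chance that any given person carries any given variant (illustrative)

    # Each variant knocks a small, variant-specific amount off the trait.
    effect_sizes = rng.exponential(scale=0.5, size=n_variants)

    # Who carries what: a sparse, idiosyncratic pattern, so two people with the
    # same total burden typically share few specific variants.
    carriers = (rng.random((n_people, n_variants)) < carrier_freq).astype(float)

    # Trait = "wild-type" baseline minus the summed burden, plus unrelated noise.
    load = carriers @ effect_sizes
    trait = 100.0 - load + rng.normal(0.0, 8.0, size=n_people)

    # Total mutation count per person, ignoring which particular variants they are.
    burden = carriers.sum(axis=1)

    r = np.corrcoef(burden, trait)[0, 1]
    print(f"correlation between mutation count and trait: {r:.2f}")  # comes out negative

The printed correlation is negative by construction; its exact size depends entirely on the made-up parameters, so the sketch illustrates the logic of a general burden effect rather than reproducing the r = -0.3 reported by Yeo and colleagues.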

If g really is an index of a general fitness factor, then it should be correlated with other indices of fitness. This indeed appears to be the case. G is weakly positively correlated with height, for example, and also strongly correlated with various measures of health and longevity.

by Kevin Mitchell, Wiring the Brain | Read more:

A Real-Life Fairy Tale, Long in the Making: Searching for Sugar Man


[ed. See also: Cold Facts]

It’s a real-life tale of talent disregarded, bad luck and missed opportunities, with an improbable stop in the Hamptons and a Hollywood conclusion: A singer-songwriter is signed to a contract in the late 1960s after producers with ties to Motown Records see him playing in a smoky Detroit nightclub called the Sewer. He makes a pair of albums that sell almost nothing and then drops out of sight. So why, 40 years later, would anyone feel compelled to make a movie about this obscure artist, known professionally as Rodriguez?

Because, as it turns out, on the other side of the globe, in South Africa, Rodriguez had become as popular as the Rolling Stones or Elvis Presley. But he never knew of that success. He never saw a penny in royalties from it, and he spent decades doing manual labor to make ends meet and raise his three daughters. It wasn’t until fans in South Africa, trying to verify rumors he was dead, tracked him down through the Internet and brought him there to perform to adoring multitudes, that his career was resuscitated.

“This was the greatest, the most amazing, true story I’d ever heard, an almost archetypal fairy tale,” said Malik Bendjelloul, the Swedish director of “Searching for Sugar Man,” a documentary that opens on Friday in New York and Los Angeles. “It’s a perfect story. It has the human element, the music aspect, a resurrection and a detective story.”

Because of an odd confluence of circumstances it is also a story unlikely ever to occur again. In the era before the World Wide Web, South Africans, living under apartheid and isolated from the main currents of pop culture by domestic censorship and international sanctions, had no idea that Rodriguez languished in anonymity elsewhere. The singer himself compounded the situation by seeking to live as inconspicuously as possible.

On another, somewhat more oblique level, Mr. Bendjelloul acknowledged, “Searching for Sugar Man” can also be interpreted as a meditation on the fickle and arbitrary nature of celebrity and fame. We live in a culture, the film suggests, in which talent and quality sometimes go ignored, and when they get belated recognition, even that is often through happenstance.

“I’ve produced a lot of big-name artists with big hits, like Peter Frampton and Jerry Lee Lewis, but I’ve never worked with anyone as talented as Rodriguez,” Steve Rowland, who produced the singer’s second album, “Coming From Reality,” said in a telephone interview from his home in Palm Springs, Calif. “I never understood why he didn’t become a big star, so to see him rise like a phoenix from the ashes, it’s just as inexplicable, but it makes me really, really happy this is going on for him, because he’s a wonderful, humble person, and he really deserves it.” 

by Larry Rohter, NY Times |  Read more:
Photo: Nicole Bengiveno


Friday, July 20, 2012


Fairfield Porter: Forsythia and Pear in Bloom (1968)
via:

The xx

When Fashion Meets Fishing, the Feathers Fly


The most enthusiastic customers at the Eldredge Brothers Fly Shop of late are not looking to buy fly fishing reels or snag stripers. They are here to make a fashion statement.

In an improbable collision of cutting-edge chic and a hobby that requires drab waders, fly fishing shops around the country are suddenly inundated with stylish women looking to get in on the latest trend: long, colorful feathers that are bonded or clipped into hair.

Demand for the feathers, before now exclusively the domain of fly fishermen, who use them to tie flies, has created a shortage, forcing up the price and causing fly shops and hairdressers to compete for the elusive plumes.

“I’ve been out for probably a month,” said Bill Thompson, the owner of North Country Angler in North Conway, N.H. “There is that worry that next year, fishermen won’t have materials they’ll need.”

The circumstances are especially strange because a proudly stodgy and tradition-bound industry content to hide from the world beyond the river is competing in this niche marketplace with a fad that may not last as long as a trout’s spawning season.

“For someone to use them as a fashion statement is just sacrilegious,” said Bob Brown, 65, a fly fisherman who lives in a recreational vehicle parked in Kennebunk, Me. He said he had been tying flies for 50 years and that this was the first time he had ever heard of a feather shortage.

“They’ve been genetically bred for fly tying, and that’s what they should be used for,” Mr. Brown said.

Fly fishing feathers — which individually are called hackles and as a group called saddles — are harvested from roosters painstakingly bred to grow supple feathers. It takes more than a year for a rooster to grow feathers long and pliable enough for use by fly fishermen. Because no one could have predicted the fashion trend, there are not enough to go around.

by Katie Zezima, NY Times |  Read more:
Photo: Craig Dilger

You Walk Wrong

Walking is easy. It’s so easy that no one ever has to teach you how to do it. It’s so easy, in fact, that we often pair it with other easy activities—talking, chewing gum—and suggest that if you can’t do both simultaneously, you’re some sort of insensate clod. So you probably think you’ve got this walking thing pretty much nailed. As you stroll around the city, worrying about the economy, or the environment, or your next month’s rent, you might assume that the one thing you don’t need to worry about is the way in which you’re strolling around the city.

Well, I’m afraid I have some bad news for you: You walk wrong.

Look, it’s not your fault. It’s your shoes. Shoes are bad. I don’t just mean stiletto heels, or cowboy boots, or tottering espadrilles, or any of the other fairly obvious foot-torture devices into which we wincingly jam our feet. I mean all shoes. Shoes hurt your feet. They change how you walk. In fact, your feet—your poor, tender, abused, ignored, maligned, misunderstood feet—are getting trounced in a war that’s been raging for roughly a thousand years: the battle of shoes versus feet.

Last year, researchers at the University of the Witwatersrand in Johannesburg, South Africa, published a study titled “Shod Versus Unshod: The Emergence of Forefoot Pathology in Modern Humans?” in the podiatry journal The Foot. The study examined 180 modern humans from three different population groups (Sotho, Zulu, and European), comparing their feet to one another’s, as well as to the feet of 2,000-year-old skeletons. The researchers concluded that, prior to the invention of shoes, people had healthier feet. Among the modern subjects, the Zulu population, which often goes barefoot, had the healthiest feet while the Europeans—i.e., the habitual shoe-wearers—had the unhealthiest. One of the lead researchers, Dr. Bernhard Zipfel, when commenting on his findings, lamented that the American Podiatric Medical Association does not “actively encourage outdoor barefoot walking for healthy individuals. This flies in the face of the increasing scientific evidence, including our study, that most of the commercially available footwear is not good for the feet.”

Okay, so shoes can be less than comfortable. If you’ve ever suffered through a wedding in four-inch heels or patent-leather dress shoes, you’ve probably figured this out. But does that really mean we don’t walk correctly? (Yes.) I mean, don’t we instinctively know how to walk? (Yes, sort of.) Isn’t walking totally natural? Yes—but shoes aren’t.

“Natural gait is biomechanically impossible for any shoe-wearing person,” wrote Dr. William A. Rossi in a 1999 article in Podiatry Management. “It took 4 million years to develop our unique human foot and our consequent distinctive form of gait, a remarkable feat of bioengineering. Yet, in only a few thousand years, and with one carelessly designed instrument, our shoes, we have warped the pure anatomical form of human gait, obstructing its engineering efficiency, afflicting it with strains and stresses and denying it its natural grace of form and ease of movement head to foot.” In other words: Feet good. Shoes bad.

Perhaps this sounds to you like scientific gobbledygook or the ravings of some radical back-to-nature nuts. In that case, you should listen to Galahad Clark. Clark is 32 years old, lives in London, and is about as unlikely an advocate for getting rid of your shoes as you could find. For one, he’s a scion of the Clark family, as in the English shoe company C&J Clark, a.k.a. Clarks, founded in 1825. Two, he currently runs his own shoe company. So it’s a bit surprising when he says, “Shoes are the problem. No matter what type of shoe. Shoes are bad for you.”

This is especially grim news for New Yorkers, who (a) tend to walk a lot, and (b) tend to wear shoes while doing so.

I know what you’re thinking: If shoes are so bad for me, what’s my alternative?

Simple. Walk barefoot.

Okay, now I know what you’re thinking: What’s my other alternative?

by Adam Sternbergh, New York Magazine | Read more:
Photo: Tom Schierlitz

She Did Not Turn, by David Inshaw

War Is Betrayal

We condition the poor and the working class to go to war. We promise them honor, status, glory, and adventure. We promise boys they will become men. We hold these promises up against the dead-end jobs of small-town life, the financial dislocations, credit card debt, bad marriages, lack of health insurance, and dread of unemployment. The military is the call of the Sirens, the enticement that has for generations seduced young Americans working in fast food restaurants or behind the counters of Walmarts to fight and die for war profiteers and elites.

The poor embrace the military because every other cul-de-sac in their lives breaks their spirit and their dignity. Pick up Erich Maria Remarque’s All Quiet on the Western Front or James Jones’s From Here to Eternity. Read Henry IV. Turn to the Iliad. The allure of combat is a trap, a ploy, an old, dirty game of deception in which the powerful, who do not go to war, promise a mirage to those who do.

I saw this in my own family. At the age of ten I was given a scholarship to a top New England boarding school. I spent my adolescence in the schizophrenic embrace of the wealthy, on the playing fields and in the dorms and classrooms that condition boys and girls for privilege, and came back to my working-class relations in the depressed former mill towns in Maine. I traveled between two universes: one where everyone got chance after chance after chance, where connections and money and influence almost guaranteed that you would not fail; the other where no one ever got a second try. I learned at an early age that when the poor fall no one picks them up, while the rich stumble and trip their way to the top.

Those I knew in prep school did not seek out the military and were not sought by it. But in the impoverished enclaves of central Maine, where I had relatives living in trailers, nearly everyone was a veteran. My grandfather. My uncles. My cousins. My second cousins. They were all in the military. Some of them—including my Uncle Morris, who fought in the infantry in the South Pacific during World War II—were destroyed by the war. Uncle Morris drank himself to death in his trailer. He sold the hunting rifle my grandfather had given to me to buy booze.

He was not alone. After World War II, thousands of families struggled with broken men who, because they could never read the approved lines from the patriotic script, had been discarded. They were not trotted out for red-white-and-blue love fests on the Fourth of July or Veterans Day.

The myth of war held fast, despite the deep bitterness of my grandmother—who acidly denounced what war had done to her only son—and of others like her. The myth held because it was all the soldiers and their families had. Even those who knew it to be a lie—and I think most did—were loath to give up the fleeting moments of recognition, the only times in their lives they were told they were worth something.

“For it’s Tommy this, an’ Tommy that, an’ ‘Chuck him out, the brute!’” Rudyard Kipling wrote. “But it’s ‘Saviour of ’is country’ when the guns begin to shoot.”

Any story of war is a story of elites preying on the weak, the gullible, the marginal, the poor. I do not know of a single member of my graduating prep school class who went into the military. You could not say this about the high school class that graduated the same year in Mechanic Falls, Maine.

by Chris Hedges, Boston Review |  Read more:
Photograph by Teddy Wade, U.S. Army

Global Warming's Terrifying New Math

If the pictures of those towering wildfires in Colorado haven't convinced you, or the size of your AC bill this summer, here are some hard numbers about climate change: June broke or tied 3,215 high-temperature records across the United States. That followed the warmest May on record for the Northern Hemisphere – the 327th consecutive month in which the temperature of the entire globe exceeded the 20th-century average. The odds of that happening by simple chance are about 3.7 x 10^-99 – roughly one chance in 10^98, a figure considerably larger than the number of stars in the universe.
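A quick back-of-the-envelope check on that figure (an assumed derivation for illustration; the article itself does not spell out how the odds were computed): if each month were an independent coin flip with a 50/50 chance of coming in above the 20th-century average, the probability of 327 above-average months in a row would be 0.5^327, which is where a number on the order of 10^-99 comes from, and whose reciprocal dwarfs the roughly 10^22 to 10^24 stars usually estimated for the observable universe.

    from math import log10

    p = 0.5 ** 327                          # 327 straight above-average months, treated as independent coin flips
    print(f"p = {p:.1e}")                   # prints p = 3.7e-99
    print(f"1/p = 10^{log10(1 / p):.1f}")   # prints 1/p = 10^98.4

Real monthly temperatures are, of course, neither independent nor 50/50 propositions, so this is only a gloss on where the headline number comes from, not a climate model.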

Meteorologists reported that this spring was the warmest ever recorded for our nation – in fact, it crushed the old record by so much that it represented the "largest temperature departure from average of any season on record." The same week, Saudi authorities reported that it had rained in Mecca despite a temperature of 109 degrees, the hottest downpour in the planet's history.

Not that our leaders seemed to notice. Last month the world's nations, meeting in Rio for the 20th-anniversary reprise of a massive 1992 environmental summit, accomplished nothing. Unlike George H.W. Bush, who flew in for the first conclave, Barack Obama didn't even attend. It was "a ghost of the glad, confident meeting 20 years ago," the British journalist George Monbiot wrote; no one paid it much attention, footsteps echoing through the halls "once thronged by multitudes." Since I wrote one of the first books for a general audience about global warming way back in 1989, and since I've spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we're losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn't yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.

The First Number: 2° Celsius

If the movie had ended in Hollywood fashion, the Copenhagen climate conference in 2009 would have marked the culmination of the global fight to slow a changing climate. The world's nations had gathered in the December gloom of the Danish capital for what a leading climate economist, Sir Nicholas Stern of Britain, called the "most important gathering since the Second World War, given what is at stake." As Danish energy minister Connie Hedegaard, who presided over the conference, declared at the time: "This is our chance. If we miss it, it could take years before we get a new and better one. If ever."

In the event, of course, we missed it. Copenhagen failed spectacularly. Neither China nor the United States, which between them are responsible for 40 percent of global carbon emissions, was prepared to offer dramatic concessions, and so the conference drifted aimlessly for two weeks until world leaders jetted in for the final day. Amid considerable chaos, President Obama took the lead in drafting a face-saving "Copenhagen Accord" that fooled very few. Its purely voluntary agreements committed no one to anything, and even if countries signaled their intentions to cut carbon emissions, there was no enforcement mechanism. "Copenhagen is a crime scene tonight," an angry Greenpeace official declared, "with the guilty men and women fleeing to the airport." Headline writers were equally brutal: COPENHAGEN: THE MUNICH OF OUR TIMES? asked one.

The accord did contain one important number, however. In Paragraph 1, it formally recognized "the scientific view that the increase in global temperature should be below two degrees Celsius." And in the very next paragraph, it declared that "we agree that deep cuts in global emissions are required... so as to hold the increase in global temperature below two degrees Celsius." By insisting on two degrees – about 3.6 degrees Fahrenheit – the accord ratified positions taken earlier in 2009 by the G8, and the so-called Major Economies Forum. It was as conventional as conventional wisdom gets. The number first gained prominence, in fact, at a 1995 climate conference chaired by Angela Merkel, then the German minister of the environment and now the center-right chancellor of the nation.

Some context: So far, we've raised the average temperature of the planet just under 0.8 degrees Celsius, and that has caused far more damage than most scientists expected. (A third of summer sea ice in the Arctic is gone, the oceans are 30 percent more acidic, and since warm air holds more water vapor than cold, the atmosphere over the oceans is a shocking five percent wetter, loading the dice for devastating floods.) Given those impacts, in fact, many scientists have come to think that two degrees is far too lenient a target. "Any number much above one degree involves a gamble," writes Kerry Emanuel of MIT, a leading authority on hurricanes, "and the odds become less and less favorable as the temperature goes up." Thomas Lovejoy, once the World Bank's chief biodiversity adviser, puts it like this: "If we're seeing what we're seeing today at 0.8 degrees Celsius, two degrees is simply too much." NASA scientist James Hansen, the planet's most prominent climatologist, is even blunter: "The target that has been talked about in international negotiations for two degrees of warming is actually a prescription for long-term disaster." At the Copenhagen summit, a spokesman for small island nations warned that many would not survive a two-degree rise: "Some countries will flat-out disappear." When delegates from developing nations were warned that two degrees would represent a "suicide pact" for drought-stricken Africa, many of them started chanting, "One degree, one Africa."

Despite such well-founded misgivings, political realism bested scientific data, and the world settled on the two-degree target – indeed, it's fair to say that it's the only thing about climate change the world has settled on. All told, 167 countries responsible for more than 87 percent of the world's carbon emissions have signed on to the Copenhagen Accord, endorsing the two-degree target. Only a few dozen countries have rejected it, including Kuwait, Nicaragua and Venezuela. Even the United Arab Emirates, which makes most of its money exporting oil and gas, signed on. The official position of planet Earth at the moment is that we can't raise the temperature more than two degrees Celsius – it's become the bottomest of bottom lines. Two degrees.

by Bill McKibben, Rolling Stone |  Read more:
Illustration by Edel Rodriguez

Thursday, July 19, 2012

Michelle Jenneke, Junior World Championships, Barcelona 2012


The Future of Manufacturing Is in America, Not China


A furor broke out last week after it was reported that the uniforms of U.S. Olympians would be manufactured in China. "They should take all the uniforms, put them in a big pile, and burn them," said an apoplectic Sen. Harry Reid. The story tapped into the anger -- and fear -- that Americans feel about the loss of manufacturing to China. Seduced by government subsidies, cheap labor, lax regulations, and a rigged currency, U.S. industry has rushed to China in recent decades, with millions of American jobs lost. It is these fears, rather than the Olympic uniforms themselves, that triggered last week's congressional uproar.

But Ralph Lauren berets aside, the larger trends show that the tide has turned, and it is China's turn to worry. Many CEOs, including Dow Chemical's Andrew Liveris, have declared their intentions to bring manufacturing back to the United States. What is going to accelerate the trend isn't, as people believe, the rising cost of Chinese labor or a rising yuan. The real threat to China comes from technology. Technical advances will soon lead to the same hollowing out of China's manufacturing industry that they have caused in U.S. industry over the past two decades.

Several advancing and converging technologies will cause this.

First, robotics. The robots of today aren't the androids or Cylons that we are used to seeing in science fiction movies, but specialized electromechanical devices run by software and remote control. As computers become more powerful, so do the abilities of these devices. Robots are now capable of performing surgery, milking cows, doing military reconnaissance and combat, and flying fighter jets. Several companies, such as Willow Garage, iRobot, and 9th Sense, sell robot-development kits for which university students and open-source communities are developing ever more sophisticated applications.

The factory assembly that China is currently performing is child's play compared to the next generation of robots -- which will soon become cheaper than human labor. One of China's largest manufacturers, Taiwan-based Foxconn Technology Group, announced last August that it plans to install one million robots within three years to do the work that its workers in China presently do. It has found even low-cost Chinese labor to be too expensive and demanding.

Then there is artificial intelligence (AI) -- software that makes computers, if not intelligent in the human sense, at least good enough to fake it. This is the basic technology that IBM's Deep Blue computer used to beat chess grandmaster Garry Kasparov in 1997 and that enabled IBM's Watson to beat TV-show Jeopardy champions in 2011. AI is making it possible to develop self-driving cars, voice-recognition systems such as the iPhone's Siri, and Face.com, the face-recognition software Facebook recently acquired.

by Vivek Wadhwa, Foreign Policy |  Read more:
Photo: Stephen Brashear/Getty Images

The Triumph of the Family Farm


We buried my grandfather last spring. He had died in his sleep in his own bed at 95, so, as funerals go, it wasn’t a grim occasion. But it was a historic one for our small rural community. My great-grandparents were early settlers, arriving in 1913 and farming the land throughout their lives. My grandfather continued that tradition, and now rests next to them on a hillside overlooking the family homestead.

If you’re a part of the roughly 99 percent of the North American population that doesn’t work on a farm, you might guess at what comes next—many a lament has been written about the passing of the good old days in rural areas, the family farm’s decline, and the inevitable loss of the homestead. But in many respects, that narrative itself is obsolete. That’s certainly true in my family’s case: The Freeland farm is still being cultivated by my father. And it is bigger and more prosperous than ever.

My dad farms 3,200 acres of his own, and rents another 2,400—all told, a territory seven times the size of Central Park. Last year, he produced 3,900 tonnes (or metric tons) of wheat, 2,500 tonnes of canola, and 1,400 tonnes of barley. (That’s enough to produce 13 million loaves of bread, 1.2 million liters of vegetable oil, and 40,000 barrels of beer.) His revenue last year was more than $2 million, and he admits to having made “a good profit,” but won’t reveal more than that. The farm has just three workers, my dad and his two hired men, who farm with him nine months of the year. For the two or three weeks of seeding and harvest, my dad usually hires a few friends to help out, too.

My father farms in northern Alberta, but his story is typical of large-scale family farmers across North America. Urbanites may picture farmers as hip heritage-pig breeders returning to the land, or a struggling rural underclass waging a doomed battle to hang on to their patrimony as agribusiness moves in. But these stereotypes are misleading. In 2010, of all the farms in the United States with at least $1 million in revenues, 88 percent were family farms, and they accounted for 79 percent of production. Large-scale farmers today are sophisticated businesspeople who use GPS equipment to guide their combines, biotechnology to boost their yields, and futures contracts to hedge their risk. They are also pretty rich. (...)

Big Money has noticed these trends, and is beginning to pile in. “We are seeing a tremendous uptick in allocations and interest in farmland,” says Chris Erickson of HighQuest Partners, an agricultural consultancy and investor. Erickson told me that big institutional investors—pension funds, insurance companies—have recently been making investments in farmland ranging from “the several hundred millions to the billions.” Erickson said this broad interest is new, and is driven by the fact that “the fundamentals are changing dramatically.”

by Chrystia Freeland, The Atlantic |  Read more:
Photo: David Johnston

Robert Glasper Experiment - Afro Blue (Feat. Erykah Badu)


Wednesday, July 18, 2012

Into the Wild

Marko Cheseto is almost late to class. He enters the lobby of the social sciences building at 9:58 a.m., two minutes before his public speaking lecture begins. He is in no rush, plodding slowly amid the blur of backpacks and students. He stands out: 28 years old, long and spindly, a black man on the mostly white campus of the University of Alaska Anchorage, a Kenyan among mostly in-state students. His skin is as dark as an Alaskan winter morning; patches of frostbite char his cheeks like eyeblack. His lips are dry and crevassed. He is the most famous person on campus, a star runner. And he's pushing a two-wheeled walker.

A blond girl stops him. "Marko!" she says.

"Hellll-oooo!" he replies, voice arching.

"Can I give you a hug?"

"Okay, just don't push me!" he says in fast, accented English. She moves in gently. Marko embraces her with his left arm, his right hand steadying himself. For two months, Marko has envisioned this January morning: First day of spring semester senior year, a chance to prove that he's still the same old sweet, sarcastic, eager-to-entertain Marko. A few nights ago at a UAA basketball game, girls had hugged him in droves. Three former teammates surrounded him for a picture and posted it on Facebook. Marko had ambled around without his walker, showing off, perhaps too much.

Now Marko says goodbye to the blonde and rolls into an elevator. Before the doors close, an older woman whom Marko doesn't know juts toward the narrowing window and whispers, "We love you." The elevator rings open on the second floor, and Marko pushes to Room 251. He rolls toward the desks, then stops like a car that's halfway through a wrong turn.

Those desks -- the normal desks -- aren't for him anymore. He turns toward the lone handicap table, twists and falls into his seat straight-legged, then glances down at the shiny black shoes covering his new plastic stubs.

Those used to be his feet.

During an August night in 2008, Marko Cheseto walked onto a plane in Nairobi bound for Alaska. His feet were his own. He had only $100 in his pockets. His luggage totaled one bag containing two outfits. He was raised in Ptop, a village of 1,000 in the western Kenyan mountains, elevation 8,000 feet -- a foggy, damp region without running water or electricity or roads, where the Pokot dialect of Swahili was spoken. His father, Dickson, farmed, built houses and herded animals, many of which he sold to help purchase a one-way ticket to Anchorage, where the third oldest of his 11 children would attend college on a cross-country and track scholarship.

Nobody from Marko's village had ever left to go to school in America, never mind Alaska. Running was not the route out of Ptop as it was in so many other poor villages in Kenya's highlands. But running was something he always did well. After he graduated from a Nairobi two-year college in 2006 and was earning a modest living as a teacher, he noticed that runners -- inferior runners, he felt -- were leaving on scholarship for U.S. colleges. America meant money, and those who left were expected to share it to help back home.

So Marko chased a new life in hopes of improving his family's old one. He wanted, in the words of his cousin Nicholas Atudonyang, "to be a role model for the guys in his village." He enrolled in one of the running academies in Eldoret, training twice daily at the 6,000-foot elevation, and had moderate success in local races. That got his name on American recruiters' prospect lists. Michael Friess, the track and cross-country coach at Alaska Anchorage, already had one star Kenyan on his roster, David Kiplagat, and wanted to add more. Friess, a loving hard-ass who's been UAA's head coach for 22 of his 50 years, offered Marko a full scholarship, without even meeting him.

At first, his parents didn't want Marko to leave, fearing that they'd have to support him again. But he argued that although his teaching job was fine for him, his father could desperately use extra income to supplement his typical earnings of $200 a year. In Alaska, Marko said, he'd work part time and send home a few hundred dollars a year. His parents acquiesced, selling farm animals and asking members of their extended family to help cover Marko's expenses. So Marko, seated in the rear, a few rows behind another runner bound for UAA, Alfred Kangogo, flew from Nairobi to Amsterdam to Minneapolis to Anchorage. All he'd heard about Alaska was that it was dark 24 hours a day. But when they arrived in the evening, the sun shining, Alfred turned to Marko and said, "Just like home."  (...)

But the ease with which Marko and his fellow Kenyans got along with other students belied the fact that getting beyond the surface was difficult. The Kenyans were too busy being unspoken breadwinners to date much. Friess, worried that they were stretched too thin, told them they couldn't begin work at 6 a.m. anymore. They adjusted by working later. They simply carried on, each handling the pressure in his own way. David was driven, eventually graduating with a degree in finance and economics. Alfred was relentless, earning the nickname Bulldog. And Marko tried to be perfect, putting on a positive front even during the occasional month when he didn't earn enough to send any money home. After he paid rent and his school expenses, much of his $450 take-home was spoken for. Usually he was able to save up and wire $100 every few months.

by Seth Wickersham, ESPN |  Read more:
Photo: Jose Mandojana for ESPN The Magazine

Dying in Court

Gloria Taylor, a Canadian, has amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease. Over a period of a few years, her muscles will weaken until she can no longer walk, use her hands, chew, swallow, speak, and ultimately, breathe. Then she will die. Taylor does not want to go through all of that. She wants to die at a time of her own choosing.

Suicide is not a crime in Canada, so, as Taylor put it: “I simply cannot understand why the law holds that the able-bodied who are terminally ill are allowed to shoot themselves when they have had enough because they are able to hold a gun steady, but because my illness affects my ability to move and control my body, I cannot be allowed compassionate help to allow me to commit an equivalent act using lethal medication.”

Taylor sees the law as offering her a cruel choice: either end her life when she still finds it enjoyable, but is capable of killing herself, or give up the right that others have to end their lives when they choose. She went to court, arguing that the provisions of the Criminal Code that prevent her from receiving assistance in dying are inconsistent with the Canadian Charter of Rights and Freedoms, which gives Canadians rights to life, liberty, personal security, and equality.

The court hearing was remarkable for the thoroughness with which Justice Lynn Smith examined the ethical questions before her. She received expert opinions from leading figures on both sides of the issue, not only Canadians, but also authorities in Australia, Belgium, the Netherlands, New Zealand, Switzerland, the United Kingdom, and the United States. The range of expertise included general medicine, palliative care, neurology, disability studies, gerontology, psychiatry, psychology, law, philosophy, and bioethics. Many of these experts were cross-examined in court. Along with Taylor’s right to die, decades of debate about assistance in dying came under scrutiny.

Last month, Smith issued her judgment. The case, Carter v. Canada, could serve as a textbook on the facts, law, and ethics of assistance in dying.

by Peter Singer, Project Syndicate |  Read more:
Illustration: Margaret Scott

Yellow Bather, Keith Vaughan. English (1912 - 1977)
via:

32 Innovations That Will Change Your Tomorrow


The electric light was a failure.

Invented by the British chemist Humphry Davy in the early 1800s, it spent nearly 80 years being passed from one initially hopeful researcher to another, like some not-quite-housebroken puppy. In 1879, Thomas Edison finally figured out how to make an incandescent light bulb that people would buy. But that didn’t mean the technology immediately became successful. It took another 40 years, into the 1920s, for electric utilities to become stable, profitable businesses. And even then, success happened only because the utilities created other reasons to consume electricity. They invented the electric toaster and the electric curling iron and found lots of uses for electric motors. They built Coney Island. They installed electric streetcar lines in any place large enough to call itself a town. All of this, these frivolous gadgets and pleasurable diversions, gave us the light bulb.

We tend to rewrite the histories of technological innovation, making myths about a guy who had a great idea that changed the world. In reality, though, innovation isn’t the goal; it’s everything that gets you there. It’s bad financial decisions and blueprints for machines that weren’t built until decades later. It’s the important leaps forward that synthesize lots of ideas, and it’s the belly-up failures that teach us what not to do.

When we ignore how innovation actually works, we make it hard to see what’s happening right in front of us today. If you don’t know that the incandescent light was a failure before it was a success, it’s easy to write off some modern energy innovations — like solar panels — because they haven’t hit the big time fast enough.

Worse, the fairy-tale view of history implies that innovation has an end. It doesn’t. What we want and what we need keeps changing. The incandescent light was a 19th-century failure and a 20th-century success. Now it’s a failure again, edged out by new technologies, like LEDs, that were, themselves, failures for many years.

That’s what this issue is about: all the little failures, trivialities and not-quite-solved mysteries that make the successes possible. This is what innovation looks like. It’s messy, and it’s awesome.

by Maggie Koerth-Baker, NY Times |  Read more:
Illustration: Chris Nosenzo

Don’t Indulge. Be Happy.


How much money do you need to be happy? Think about it. What’s your number?

Many of us aren’t satisfied with how much we have now. That’s why we’re constantly angling for a raise at work, befriending aged relatives and springing, despite long odds, for lottery scratch tickets.

Is it crazy to question how much money you need to be happy? The notion that money can’t buy happiness has been around a long time — even before yoga came into vogue. But it turns out there is a measurable connection between income and happiness; not surprisingly, people with a comfortable living standard are happier than people living in poverty.

The catch is that additional income doesn’t buy us any additional happiness on a typical day once we reach that comfortable standard. The magic number that defines this “comfortable standard” varies across individuals and countries, but in the United States, it seems to fall somewhere around $75,000. Using Gallup data collected from almost half a million Americans, researchers at Princeton found that higher household incomes were associated with better moods on a daily basis — but the beneficial effects of money tapered off entirely after the $75,000 mark.

Why, then, do so many of us bother to work so hard long after we have reached an income level sufficient to make most of us happy? One reason is that our ideas about the relationship between money and happiness are misguided. In research we conducted with a national sample of Americans, people thought that their life satisfaction would double if they made $55,000 instead of $25,000: more than twice as much money, twice as much happiness. But our data showed that people who earned $55,000 were just 9 percent more satisfied than those making $25,000. Nine percent beats zero percent, but it’s still kind of a letdown when you were expecting a 100 percent return.

Interestingly, and usefully, it turns out that what we do with our money plays a far more important role than how much money we make. Imagine three people each win $1 million in the lottery. Suppose one person attempts to buy every single thing he has ever wanted; one puts it all in the bank and uses the money only sparingly, for special occasions; and one gives it all to charity. At the end of the year, they all would report an additional $1 million of income. Many of us would follow the first person’s strategy, but the latter two winners are likely to get the bigger happiness bang for their buck.

by Elizabeth Dunn and Michael Norton, NY Times |  Read more:
Illustration: Brock Davis