Thursday, April 18, 2013
Cyborg Dreams
Today, depending on your favoured futurist prophet, a kind of digital Elysium awaits us all. Over millennia, we have managed to unshackle ourselves from the burdens of time and space — from heat, cold, hunger, thirst, physical distance, mechanical effort — along a trajectory seemingly aimed at abstraction. Humanity’s collective consciousness is to be uploaded into the super-Matrix of the near future — or augmented into cyborg immortality, or out-evolved by self-aware machine minds. Whatever happens, the very meat of our physical being is to be left behind.
Except, of course, so far we remain thoroughly embodied. Flesh and blood. There is just us, slumped in our chairs, at our desks, inside our cars, stroking our smartphones and tablets. Peel back the layers of illusion, and what remains is not a brain in a jar — however much we might fear or hunger for this — but a brain within a body, as remorselessly obedient to that body’s urges and limitations as any paleolithic hunter-gatherer.
It’s a point that has been emphasised by much recent research into thought and behaviour. To quote from Thinking, Fast and Slow (2011) by Nobel laureate Daniel Kahneman, ‘cognition is embodied; you think with your body, not only with your brain’. Yet when it comes to culture's cutting edge, there remains an overwhelming tendency to treat embodiment not as a central condition of being human that our tools ought to serve, but rather as an inconvenience to be eliminated.
One of my favourite accounts of our genius for unreality is a passage from the David Foster Wallace essay ‘E Unibus Pluram: Television and US Fiction’ (1990), in which he describes, with escalating incredulity, the layers of illusion involved in watching television.
First comes the artifice of performance. ‘Illusion is that we’re voyeurs here at all,’ he writes, ‘the “voyees” behind the screen’s glass are only pretending ignorance. They know perfectly well we’re out there.’ Then there’s the capturing of these performances, ‘the second layer of glass, the lenses and monitors via which technicians and arrangers apply ingenuity to hurl the visible images at us’. And then there are the nestled layers of artificiality involved in scripting, devising and selling the scenarios to be filmed, which aren’t ‘people in real situations that do or even could go on without consciousness of Audience’.
After this comes the actual screen that we’re looking at: not what it appears to show, but its physical reality in ‘analog waves and ionised streams and rear-screen chemical reactions throwing off phosphenes in grids of dots not much more lifelike than Seurat’s own impressionist “statements” on perceptual illusion’.
But even this is only the warm-up. Because — ‘Good lord,’ he exclaims in climax — ‘the dots are coming out of our furniture, all we’re really spying on is our furniture; and our very own chairs and lamps and bookspines sit visible but unseen at our gaze’s frame...’
There’s a certain awe at our capacity for self-deception, here — if ‘deception’ is the right word for the chosen, crafted unrealities in play. But Foster Wallace’s ‘good lord’ is also a cry of awakening into uncomfortable truth. (...)
At the start of the 1990s, screens — whether televisions or computers, deployed for work or leisure — were bulky, static objects. For those on the move and lucky enough to employ a weight-lifting personal assistant, a Macintosh ‘portable’ cost $6,500 and weighed 7.2 kilos (close to 16 lbs). For everyone else, computing was a crude, solitary domain, inaccessible to anyone other than aficionados.
Today, just two decades on from Foster Wallace’s ‘E Unibus Pluram’, we inhabit an age of extraordinary intimacy with screen-based technologies. As well as our home and office computers, and the 40-inch-plus glories of our living room screens, few of us are now without the tactile, constant presence of at least one smart device in our pocket or bag.
These are tools that can feel more like extensions of ourselves than separate devices: the first thing we touch when we wake up in the morning, the last thing we touch before going to bed at night. Yet what they offer is a curious kind of intimacy — and the ‘us’ to which all this is addressed doesn’t often look or feel much like a living, breathing human being.
by Tom Chatfield, Aeon | Read more:
Photo by Allen Donikowski/Flickr/Getty
Japanese Roots
Unearthing the origins of the Japanese is a much harder task than you might guess. Among world powers today, the Japanese are the most distinctive in their culture and environment. The origins of their language are one of the most disputed questions of linguistics. These questions are central to the self-image of the Japanese and to how they are viewed by other peoples. Japan’s rising dominance and touchy relations with its neighbors make it more important than ever to strip away myths and find answers.
The search for answers is difficult because the evidence is so conflicting. On the one hand, the Japanese people are biologically undistinctive, being very similar in appearance and genes to other East Asians, especially to Koreans. As the Japanese like to stress, they are culturally and biologically rather homogeneous, with the exception of a distinctive people called the Ainu on Japan’s northernmost island of Hokkaido. Taken together, these facts seem to suggest that the Japanese reached Japan only recently from the Asian mainland, too recently to have evolved differences from their mainland cousins, and displaced the Ainu, who represent the original inhabitants. But if that were true, you might expect the Japanese language to show close affinities to some mainland language, just as English is obviously closely related to other Germanic languages (because Anglo-Saxons from the continent conquered England as recently as the sixth century a.d.). How can we resolve this contradiction between Japan’s presumably ancient language and the evidence for recent origins?
Archeologists have proposed four conflicting theories. Most popular in Japan is the view that the Japanese gradually evolved from ancient Ice Age people who occupied Japan long before 20,000 b.c. Also widespread in Japan is a theory that the Japanese descended from horse-riding Asian nomads who passed through Korea to conquer Japan in the fourth century, but who were themselves—emphatically—not Koreans. A theory favored by many Western archeologists and Koreans, and unpopular in some circles in Japan, is that the Japanese are descendants of immigrants from Korea who arrived with rice-paddy agriculture around 400 b.c. Finally, the fourth theory holds that the peoples named in the other three theories could have mixed to form the modern Japanese.
When similar questions of origins arise about other peoples, they can be discussed dispassionately. That is not so for the Japanese. Until 1946, Japanese schools taught a myth of history based on the earliest recorded Japanese chronicles, which were written in the eighth century. They describe how the sun goddess Amaterasu, born from the left eye of the creator god Izanagi, sent her grandson Ninigi to Earth on the Japanese island of Kyushu to wed an earthly deity. Ninigi’s great-grandson Jimmu, aided by a dazzling sacred bird that rendered his enemies helpless, became the first emperor of Japan in 660 b.c. To fill the gap between 660 b.c. and the earliest historically documented Japanese monarchs, the chronicles invented 13 other equally fictitious emperors. Before the end of World War II, when Emperor Hirohito finally announced that he was not of divine descent, Japanese archeologists and historians had to make their interpretations conform to this chronicle account. Unlike American archeologists, who acknowledge that ancient sites in the United States were left by peoples (Native Americans) unrelated to most modern Americans, Japanese archeologists believe all archeological deposits in Japan, no matter how old, were left by ancestors of the modern Japanese. Hence archeology in Japan is supported by astronomical budgets, employs up to 50,000 field-workers each year, and draws public attention to a degree inconceivable anywhere else in the world.
Why do they care so much? Unlike most other non-European countries, Japan preserved its independence and culture while emerging from isolation to create an industrialized society in the late nineteenth century. It was a remarkable achievement. Now the Japanese people are understandably concerned about maintaining their traditions in the face of massive Western cultural influences. They want to believe that their distinctive language and culture required uniquely complex developmental processes. To acknowledge a relationship of the Japanese language to any other language seems to constitute a surrender of cultural identity.
What makes it especially difficult to discuss Japanese archeology dispassionately is that Japanese interpretations of the past affect present behavior. Who among East Asian peoples brought culture to whom? Who has historical claims to whose land? These are not just academic questions. For instance, there is much archeological evidence that people and material objects passed between Japan and Korea in the period a.d. 300 to 700. Japanese interpret this to mean that Japan conquered Korea and brought Korean slaves and artisans to Japan; Koreans believe instead that Korea conquered Japan and that the founders of the Japanese imperial family were Korean.
Thus, when Japan sent troops to Korea and annexed it in 1910, Japanese military leaders celebrated the annexation as the restoration of the legitimate arrangement of antiquity. For the next 35 years, Japanese occupation forces tried to eradicate Korean culture and to replace the Korean language with Japanese in schools. The effort was a consequence of a centuries-old attitude of disdain. Nose tombs in Japan still contain 20,000 noses severed from Koreans and brought home as trophies of a sixteenth-century Japanese invasion. Not surprisingly, many Koreans loathe the Japanese, and their loathing is returned with contempt.
What really was the legitimate arrangement of antiquity? Today, Japan and Korea are both economic powerhouses, facing each other across the Korea Strait and viewing each other through colored lenses of false myths and past atrocities. It bodes ill for the future of East Asia if these two great peoples cannot find common ground. To do so, they will need a correct understanding of who the Japanese people really are.
by Jared Diamond, Discover (1998) | Read more:
Image: The Fuji seen from the Mishima pass, Katsushika Hokusai
Wednesday, April 17, 2013
Shame On You
[ed. Spineless cowards. Who cares if Republicans threaten a filibuster - and Democrats are just hiding behind that threat. If someone wants to filibuster, then let them make their case in the full glare of media attention and see what happens. The blood of our children all around.]
A wrenching national search for solutions to the violence that left 20 children dead in Newtown, Conn., all but ended Wednesday after the Senate defeated several measures to expand gun control.
In rapid succession, a bipartisan compromise to expand background checks for gun buyers, a ban on assault weapons and a ban on high-capacity gun magazines all failed to get the 60 votes needed under an agreement between both parties. Senators also turned back Republican proposals to expand permission to carry concealed weapons and to focus law enforcement efforts on prosecuting gun crimes.
Sitting in the Senate gallery with other survivors of recent mass shootings and their family members, Lori Haas, whose daughter was shot at Virginia Tech, and Patricia Maisch, a survivor of the mass shooting in Arizona, shouted together, “Shame on you.”
President Obama, speaking at the White House after the votes, echoed the cry, calling Wednesday “a pretty shameful day for Washington.”
Opponents of gun control from both parties said that they made their decisions based on logic, and that passions had no place in the making of momentous policy.
“Criminals do not submit to background checks now,” said Senator Charles E. Grassley, Republican of Iowa. “They will not submit to expanded background checks.”
It was a striking defeat for one of Mr. Obama’s highest priorities, on an issue that has consumed much of the country since Adam Lanza opened fire with an assault weapon in the halls of Sandy Hook Elementary School in December.
Faced with a decision either to remove substantial new gun restrictions from the bill or to allow it to fall to a filibuster next week, Senate leaders plan to put it on hold after a scattering of votes Thursday. Senator Harry Reid of Nevada, the majority leader and a longtime gun rights advocate who had thrown himself behind the gun control measures, is expected to pull the bill from the Senate floor and move on to an Internet sales tax measure, then an overhaul of immigration policy, which has better prospects.
More than 50 senators — including a few Republicans, but lacking a handful of Democrats from more conservative states — had signaled their support for the gun bill, not enough to reach the 60-vote threshold to overcome a filibuster.
by Jonathan Weisman, NY Times | Read more:
Jethro Tull
[ed. Ian Anderson, sounding better than ever.]
A Different World
In the beginning, Danyelle Carter had only one thought: I'm going to die. The 7 a.m. boot camp workout class at Spelman College is a muscle-testing, lung-battering trial for even the fittest. And Carter feels no shame in admitting that for most of her young life, she has been far from fit.
Last year, Carter says, she weighed 340 pounds. She grew up in the Bahamas and South Florida, and she learned to eat and eat and eat. Her close-knit Caribbean upbringing centered on family, and family meant food: plenty of it, all the time. If a handful of rice was good, two handfuls were better. "And everything was fried, dripping with grease," she says. Carter studied hard in school, earning an associate's degree from Miami Dade College and, obeying her mother's orders, took classes to swap her native lilt for her current TV-anchor diction. But being a girl, nobody expected her to run or lift weights or play a sport. Inactivity was the rule, healthy role models the exception. High school gym class? "It was kind of like, 'Here's a basketball, and if you use it for an hour a day it will help your heart.'" When Carter arrived in Georgia in September, a walk across the impeccably kept, historically black, all-women's campus inspired dread. It gets hot in Atlanta. It's hotter when you're more than 300 pounds.
But Carter is a sister of Spelman, and the sisterhood strives to do extraordinary things. Last November, the college announced it was dropping all intercollegiate sports at the end of this academic year, the first school in a decade to leave the NCAA. True to its motto of "A Choice to Change the World," Spelman is choosing to move $1 million a year previously budgeted for varsity sports into what leaders call a "Wellness Revolution" for all students, pouring resources into exercise classes and nutrition counseling and intramurals.
Carter jumped two-footed into Spelman's young wellness program last fall before she was even enrolled. Awaiting a transfer, she wasn't officially admitted until January. That didn't stop her or Spelman from changing her life. Carter studied all the fitness and nutrition information Spelman had to offer, and even attended campus fitness classes. She tried tai chi and Zumba, and befriended the treadmill. Within a few months, she figured she was ready for some stronger medicine, so she signed up for the boot camp in January. It hurt. More than once, she remembers thinking, One more burpee, one more lunge, and my heart's giving out. It never did.
Danyelle Carter says she now weighs 220, losing more than 25 pounds since the start of the year. She runs nearly every day. Sometimes, when she's stressed out from studying at 2 a.m., she'll jog six laps, or two miles, around the Spelman Oval, past Rockefeller Hall's stained glass and the flowery alumna arch that only graduates may walk through. She swore off cake, quit eating cereal at night and leaves the candy alone. "I love tofu now," she says. And she made peace with boot camp. Even when mired in the worst part of the workout, called 21 Down (21 pushups, then 21 crunches, then 20 of each, then 19, 18, and so on, with no rest between sets), Carter reminds herself of her mantra: Pain is weakness leaving the body.
This is how revolutions are born. And Spelman's may be a link in a chain that one day leaves the NCAA, as well as the rest of our hypercompetitive, over-selective, winner-take-all interscholastic sports system, as dead as the tsar of all the Russias.
-----
Our great nation was just inundated with the Caligula-worthy circus that is the NCAA Division I men's basketball tournament. College kids who won't see a classroom for weeks perform hard, physical labor (for free, at least as far as the IRS knows) on behalf of an American audience that doesn't give a rat's ass whether players can read so long as they convert some timely threes, cover the spread and bust someone else's bracket. The tournament epitomizes what our century-old interscholastic athletics system is all about. March Madness, a tiny, televised group of elites moving at high speeds to entertain great, couch-clinging masses that don't move at all, is the way sports lives now.
Meanwhile, Danyelle Carter might just be the student athlete of the future. A future marked not by madness, but by common sense. One where the goal is not a championship today, but lifelong play, and where the measure of success is not maximum revenue, but a minimum level of health. Spelman College is doing something remarkable. Instead of spending seven figures a year on a few dozen varsity athletes, Spelman will expand its wellness program, funding fitness for everyone on campus.
by Luke Cyphers, SB Nation | Read more:
Photo: Getty Images
Dan Loeb Simultaneously Solicits, Betrays Pension Funds
There's confidence. There's chutzpah. And then there's Dan Loeb, hedge fund king extraordinaire and head of Third Point Capital, who's getting set to claim the World Heavyweight Championship of Balls.
On April 18, Loeb will speak before the Council of Institutional Investors, a nonprofit association of pension funds, endowments, employee benefit funds, and foundations with collective assets of over $3 trillion. The CII is an umbrella group that represents the institutions who manage the retirement and benefit funds of public and corporate employees all over America – from bricklayers to Teamsters to teachers to employees of Colgate, the Gap and Johnson and Johnson.
Loeb is going to be, in essence, pitching his services to these institutional investors. He already manages the money for several public funds, including the Ohio Public Employees' Retirement System, the New Jersey State Investment Council, the Sacramento County Employees' Retirement System, and the City of Danbury Retirement System. To give you an idea of the scale, New Jersey alone has $100 million invested with one of Loeb's funds.
When he comes to speak at CII, Loeb will almost certainly be seeking new clients. There will be some serious whales in these waters: For instance, CalSTRS, the California State Teachers' Retirement System, will definitely be represented (Anne Sheehan, the director of corporate governance for CalSTRS, will be moderating Loeb's panel).
But here's the catch. Dan Loeb, who isn't known as the biggest hedge-fund asshole still working on Wall Street (only because Stevie Cohen hasn't been arrested yet), is on the board and co-founder of a group called Students First New York. And Students First has been one of the leading advocates pushing for states to abandon defined benefit plans – packages which guarantee certain retirement benefits for public workers like teachers – in favor of defined contribution plans, where the benefits are not guaranteed.
In other words, Loeb has been soliciting the retirement money of public workers, then turning right around and lobbying for those same workers to lose their benefits. He's essentially asking workers to pay for their own disenfranchisement (with Loeb getting his two-and-twenty cut, or whatever obscene percentage of their retirement monies he will charge as a fee). If that isn't the very definition of balls, I don't know what is.
It's one thing for a group like Students First to have an opinion about defined benefit plans in general, to say, as they have, that "today's district pensions and other benefits are not sustainable and contribute to a looming fiscal crisis." But it's another thing for a Vice President of Students First like Rebecca Sibilia to tweet the following just a few weeks before one of its board members asks for money from a fund like CalSTRS:
Outdated & underfunded #pension systems like CALSTERS break promises to #teachers#edreform #thinkED http://huff.to/15vdALJ via @HuffPostEduThat's a hell of a sales pitch for Loeb to be making: "I belong to an organization that thinks you're all dinosaurs. Now give me a hundred million dollars."
Not long ago, the American Federation of Teachers got wind of Loeb's association with Students First and their lobbying efforts, and confronted him about it, leading to a somewhat incredible correspondence, the details of which I'll get to in a moment. But first, a little background on Loeb.
by Matt Taibbi, Rolling Stone | Read more:
Photo: Simon Dawson/Bloomberg via Getty Images
Mental Disorder or Neurodiversity?
One of the most famous stories of H. G. Wells, “The Country of the Blind” (1904), depicts a society, enclosed in an isolated valley amid forbidding mountains, in which a strange and persistent epidemic has rendered its members blind from birth. Their whole culture is reshaped around this difference: their notion of beauty depends on the feel rather than the look of a face; no windows adorn their houses; they work at night, when it is cool, and sleep during the day, when it is hot. A mountain climber named Nunez stumbles upon this community and hopes that he will rule over it: “In the Country of the Blind the One-Eyed Man is King,” he repeats to himself. Yet he comes to find that his ability to see is not an asset but a burden. The houses are pitch-black inside, and he loses fights to local warriors who possess extraordinary senses of touch and hearing. The blind live with no knowledge of the sense of sight, and no need for it. They consider Nunez’s eyes to be diseased, and mock his love for a beautiful woman whose face feels unattractive to them. When he finally fails to defeat them, exhausted and beaten, he gives himself up. They ask him if he still thinks he can see: “No,” he replies, “That was folly. The word means nothing — less than nothing!” They enslave him because of his apparently subhuman disability. But when they propose to remove his eyes to make him “normal,” he realizes the beauty of the mountains, the snow, the trees, the lines in the rocks, and the crispness of the sky — and he climbs a mountain, attempting to escape.
Wells’s eerie and unsettling story addresses how we understand differences that run deep into the mind and the brain. What one man thinks of as his heightened ability, another thinks of as a disability. This insight about the differences between ways of viewing the world runs back to the ancients: in Plato’s Phaedrus, Socrates discusses how insane people experience life, telling Phaedrus that madness is not “simply an evil.” Instead, “there is also a madness which is a divine gift, and the source of the chiefest blessings granted to men.” The insane, Socrates suggests, are granted a unique experience of the world, or perhaps even special access to its truths — seeing it in a prophetic or artistic way.
Today, some psychologists, journalists, and advocates explore and celebrate mental differences under the rubric of neurodiversity. The term encompasses those with Attention Deficit/Hyperactivity Disorder (ADHD), autism, schizophrenia, depression, dyslexia, and other disorders affecting the mind and brain. People living with these conditions have written books, founded websites, and started groups to explain and praise the personal worlds of those with different neurological “wiring.” The proponents of neurodiversity argue that there are positive aspects to having brains that function differently; many, therefore, prefer that we see these differences simply as differences rather than disorders. Why, they ask, should what makes them them need to be classified as a disability?
But other public figures, including many parents of affected children, focus on the difficulties and suffering brought on by these conditions. They warn of the dangers of normalizing mental disorders, potentially creating reluctance among parents to provide treatments to children — treatments that researchers are always seeking to improve. The National Institute of Mental Health, for example, has been doing extensive research on the physical and genetic causes of various mental conditions, with the aim of controlling or eliminating them.
Disagreements, then, abound. What does it mean to see and experience the world in a different way? What does it mean to be a “normal” human being? What does it mean to be abnormal, disordered, or sick? And what exactly would a cure for these disorders look like? The answers to these questions may be as difficult to know as the minds of others. Learning how properly to treat or accommodate neurological differences means seeking answers to questions such as these — challenging our ideas about “normal” human biology, the purpose of medical innovation, and the uniqueness of each human being.
by Aaron Rothstein, The New Atlantis | Read more:
In the Country of the Blind ~ Alan Pollack
The Boston Bombing Produces Familiar and Revealing Reactions
[ed. As Glenn mentions in his update, be sure to read Amy Davidson's chilling: The Saudi Marathon Man]
(1) The widespread compassion for yesterday's victims and the intense anger over the attacks was obviously authentic and thus good to witness. But it was really hard not to find oneself wishing that just a fraction of that compassion and anger be devoted to attacks that the US perpetrates rather than suffers. These are exactly the kinds of horrific, civilian-slaughtering attacks that the US has been bringing to countries in the Muslim world over and over and over again for the last decade, with very little attention paid. My Guardian colleague Gary Younge put this best on Twitter this morning:
One particularly illustrative example I happened to see yesterday was a re-tweet from Washington Examiner columnist David Freddoso, proclaiming:
"Idea of secondary bombs designed to kill the first responders is just sick. How does anyone become that evil?"
I don't disagree with that sentiment. But I'd bet a good amount of money that the person saying it - and the vast majority of other Americans - have no clue that targeting rescuers with "double-tap" attacks is precisely what the US now does with its drone program and other forms of militarism. If most Americans knew their government and military were doing this, would they react the same way as they did to yesterday's Boston attack: "Idea of secondary bombs designed to kill the first responders is just sick. How does anyone become that evil?" That's highly doubtful, and that's the point.
There's nothing wrong per se with paying more attention to tragedy and violence that happens relatively nearby and in familiar places. Whether wrong or not, it's probably human nature, or at least human instinct, to do that, and that happens all over the world. I'm not criticizing that. But one wishes that the empathy for victims and outrage over the ending of innocent human life that instantly arises when the US is targeted by this sort of violence would at least translate into similar concern when the US is perpetrating it, as it so often does (far, far more often than it is targeted by such violence).
Regardless of your views of justification and intent: whatever rage you're feeling toward the perpetrator of this Boston attack, that's the rage in sustained form that people across the world feel toward the US for killing innocent people in their countries. Whatever sadness you feel for yesterday's victims, the same level of sadness is warranted for the innocent people whose lives are ended by American bombs. However profound a loss you recognize the parents and family members of these victims to have suffered, that's the same loss experienced by victims of US violence. It's natural that it won't be felt as intensely when the victims are far away and mostly invisible, but applying these reactions to those acts of US aggression would go a long way toward better understanding what they are and the outcomes they generate.
(2) The rush, one might say the eagerness, to conclude that the attackers were Muslim was palpable and unseemly, even without any real evidence. The New York Post quickly claimed that the prime suspect was a Saudi national (while also inaccurately reporting that 12 people had been confirmed dead). The Post's insinuation of responsibility was also suggested on CNN by Former Bush Homeland Security Adviser Fran Townsend ("We know that there is one Saudi national who was wounded in the leg who is being spoken to"). Former Democratic Rep. Jane Harman went on CNN to grossly speculate that Muslim groups were behind the attack. Anti-Muslim bigots like Pam Geller predictably announced that this was "Jihad in America". Expressions of hatred for Muslims, and a desire to do violence, were then spewing forth all over Twitter (some particularly unscrupulous partisan Democrat types were identically suggesting with zero evidence that the attackers were right-wing extremists).
Obviously, it's possible that the perpetrator(s) will turn out to be Muslim, just like it's possible they will turn out to be extremist right-wing activists, or left-wing agitators, or Muslim-fearing Anders-Breivik types, or lone individuals driven by apolitical mental illness. But the rush to proclaim the guilty party to be Muslim is seen in particular over and over with such events. Recall that on the day of the 2011 Oslo massacre by a right-wing, Muslim-hating extremist, the New York Times spent virtually the entire day strongly suggesting in its headlines that an Islamic extremist group was responsible, a claim other major news outlets (including the BBC and Washington Post) then repeated as fact. The same thing happened with the 1995 Oklahoma City bombing, when most major US media outlets strongly suggested that the perpetrators were Muslims. As FAIR documented back then:
"In the wake of the explosion that destroyed the Murrah Federal Office Building, the media rushed — almost en masse — to the assumption that the bombing was the work of Muslim extremists. 'The betting here is on Middle East terrorists,' declared CBS News' Jim Stewart just hours after the blast (4/19/95). 'The fact that it was such a powerful bomb in Oklahoma City immediately drew investigators to consider deadly parallels that all have roots in the Middle East,' ABC's John McWethy proclaimed the same day.
"'It has every single earmark of the Islamic car-bombers of the Middle East,' wrote syndicated columnist Georgie Anne Geyer (Chicago Tribune, 4/21/95). 'Whatever we are doing to destroy Mideast terrorism, the chief terrorist threat against Americans, has not been working,' declared the New York Times' A.M. Rosenthal (4/21/95). The Geyer and Rosenthal columns were filed after the FBI released sketches of two suspects who looked more like Midwestern frat boys than mujahideen."This lesson is never learned because, it seems, many people don't want to learn it. Even when it turns out not to have been Muslims who perpetrated the attack but rather right-wing, white Christians, the damage from this relentless and reflexive blame-pinning endures.
by Glenn Greenwald, The Guardian | Read more:
Photograph: Stringer/Reuters
Mass. General Team Develops Implantable, Bioengineered Rat Kidney
The research team describes building functional replacement kidneys on the structure of donor organs from which living cells had been stripped, an approach previously used to create bioartificial hearts, lungs, and livers.
“What is unique about this approach is that the native organ’s architecture is preserved, so that the resulting graft can be transplanted just like a donor kidney and connected to the recipient’s vascular and urinary systems,” says Harald Ott, MD, PhD, of the MGH Center for Regenerative Medicine, senior author of the Nature Medicine article.
“If this technology can be scaled to human-sized grafts, patients suffering from renal failure who are currently waiting for donor kidneys or who are not transplant candidates could theoretically receive new organs derived from their own cells.”
Around 18,000 kidney transplants are performed in the U.S. each year, but 100,000 Americans with end-stage kidney disease are still waiting for a donor organ. Even those fortunate enough to receive a transplant face a lifetime of immunosuppressive drugs, which pose many health risks and cannot totally eliminate the incidence of eventual organ rejection.
Retrofitted donor organs
The approach used in this study to engineer donor organs, based on a technology that Ott discovered as a research fellow at the University of Minnesota, involves stripping the living cells from a donor organ with a detergent solution and then repopulating the collagen scaffold that remains with the appropriate cell type — in this instance human endothelial cells to replace the lining of the vascular system and kidney cells from newborn rats.
The research team first decellularized rat kidneys to confirm that the organ’s complex structures would be preserved. They also showed the technique worked on a larger scale by stripping cells from pig and human kidneys.
Making sure the appropriate cells were seeded into the correct portions of the collagen scaffold required delivering vascular cells through the renal artery and kidney cells through the ureter. Precisely adjusting the pressures of the solutions enabled the cells to be dispersed throughout the whole organs, which were then cultured in a bioreactor for up to 12 days.
The researchers first tested the repopulated organs in a device that passed blood through its vascular system and drained off any urine, which revealed evidence of limited filtering of blood, molecular activity and urine production.
Bioengineered kidneys transplanted into living rats from which one kidney had been removed began producing urine as soon as the blood supply was restored, with no evidence of bleeding or clot formation. The overall function of the regenerated organs was significantly reduced compared with that of normal, healthy kidneys, something the researchers believe may be attributed to the immaturity of the neonatal cells used to repopulate the scaffolding.
by Amara D. Angelica, Editor, Kurzweil Accelerating Intelligence | Read more:
Massachusetts General Hospital Center for Regenerative Medicine