Tuesday, January 17, 2012

Do Sports Build Character or Damage It?

Do sports build character? For those of us who claim to be educators, it's important to know. Physical-education teachers, coaches, boosters, most trustees, and the balance of alumni seem sure that they do. And so they push sports, sports, and more sports. As for professors, they often see sports as a diversion from the real business of education—empty, time-wasting, and claiming far too much of students' attention. It often seems that neither the boosters nor the bashers want to go too far in examining their assumptions about sports.

But in fact, sports are a complex issue, and it's clear that we as a culture don't really know how to think about them. Public confusion about performance-enhancing drugs, the dangers of concussions in football and of fighting in hockey, and the recent molestation scandal at Penn State suggest that it might be good to pull back and consider the question of athletics and education—of sports and character-building—a bit more closely than we generally do.

The first year I played high-school football, the coaches were united in their belief that drinking water on the practice field was dangerous. It made you cramp up, they told us. It made you sick to your stomach, they said. So at practice, which went on for two and a half hours, twice a day, during a roaring New England summer, we got no water. Players cramped up anyway; players got sick to their stomachs regardless. Players fell on their knees and began making soft, plaintive noises; they were helped to their feet, escorted to the locker room, and seen no more.

On the first day of double practice sessions, there were about 120 players—tough Irish and Italian kids and a few blacks—and by the end of the 12-day ordeal, there were 60 left. Some of us began without proper equipment. I started without cleats. But that was not a problem: Soon someone who wore your shoe size would quit, and then you could have theirs.

The coaches didn't cut anyone from the squad that year. Kids cut themselves. Guys with what appeared to be spectacular athletic talent would, after four days of double-session drills, walk hangdog into the coaches' locker room and hand over their pads. When I asked one of them why he quit, he said simply, "I couldn't take it."

Could I? There was no reason going in to think that I would be able to. I was buttery soft around the waist, nearsighted, not especially fast, and not agile at all. It turned out that underneath the soft exterior, I had some muscle, and that my lung capacity was well developed, probably from vicious bouts of asthma I'd had as a boy. But compared with those of my fellow ballplayers, my physical gifts were meager. What I had was a will that was anything but weak. It was a surprise to me, and to everyone who knew me, how ferociously I wanted to stay with the game.

Did I love the game? I surely liked it. I liked how, when I was deep in fatigue, I became a tougher, more daring person, even a reckless one. One night, scrimmaging, I went head-on with the star running back, a guy who outweighed me by 20 pounds and was far faster and stronger. I did what the coaches said: I squared up, got low (in football, the answer to every difficulty is to get low, or get lower), and planted him. I did that? I asked myself. I liked being the guy who could do that—sometimes, though alas not often enough. The intensity of the game was inebriating. It conquered my grinding self-consciousness, brought me out of myself.

I liked the transforming aspect of the game: I came to the field one thing—a diffident guy with a slack body—and worked like a dog and so became something else—a guy with some physical prowess and more faith in himself. Mostly, I liked the whole process because it was so damned hard. I didn't think I could make it, and no one I knew did either. My parents were ready to console me if I came home bruised and dead weary and said that I was quitting. In time, one of the coaches confessed to me that he was sure I'd be gone in a few days. I had not succeeded in anything for a long time: I was a crappy student; socially I was close to a wash; my part-time job was scrubbing pans in a hospital kitchen; the first girl I liked in high school didn't like me; the second and the third followed her lead. But football was something I could do, though I was never going to be anything like a star. It was hard, it took some strength of will, and—clumsily, passionately—I could do it.

Over time, I came to understand that the objective of the game, on the deepest level, wasn't to score spectacular touchdowns or make bone-smashing tackles or block kicks. The game was much more about practice than about the Saturday-afternoon contests. And practice was about trying to do something over and over again, failing and failing, and then finally succeeding part way. Practice was about showing up and doing the same drills day after day and getting stronger and faster by tiny, tiny increments, and then discovering that by the end of the season you were effectively another person.

But mostly football was about those first days of double sessions when everyone who stuck with it did something he imagined was impossible, and so learned to recalibrate his instruments. In the future, what immediately looked impossible to us—what said Back Off, Not for You—had to be looked at again and maybe attempted anyway.

by Mark Edmundson, Chronicle of Higher Education |  Read more:
Photo: Chris and Adrienne Scott

Monday, January 16, 2012

Massive Attack



Snoozing Hawaiian monk seal.
photo: markk

Talking Heads

The Rise of the New Groupthink

Solitude is out of fashion. Our companies, our schools and our culture are in thrall to an idea I call the New Groupthink, which holds that creativity and achievement come from an oddly gregarious place. Most of us now work in teams, in offices without walls, for managers who prize people skills above all. Lone geniuses are out. Collaboration is in.

But there’s a problem with this view. Research strongly suggests that people are more creative when they enjoy privacy and freedom from interruption. And the most spectacularly creative people in many fields are often introverted, according to studies by the psychologists Mihaly Csikszentmihalyi and Gregory Feist. They’re extroverted enough to exchange and advance ideas, but see themselves as independent and individualistic. They’re not joiners by nature.

One explanation for these findings is that introverts are comfortable working alone — and solitude is a catalyst to innovation. As the influential psychologist Hans Eysenck observed, introversion fosters creativity by “concentrating the mind on the tasks in hand, and preventing the dissipation of energy on social and sexual matters unrelated to work.” In other words, a person sitting quietly under a tree in the backyard, while everyone else is clinking glasses on the patio, is more likely to have an apple land on his head. (Newton was one of the world’s great introverts: William Wordsworth described him as “A mind for ever/ Voyaging through strange seas of Thought, alone.”)

Solitude has long been associated with creativity and transcendence. “Without great solitude, no serious work is possible,” Picasso said. A central narrative of many religions is the seeker — Moses, Jesus, Buddha — who goes off by himself and brings profound insights back to the community.

Culturally, we’re often so dazzled by charisma that we overlook the quiet part of the creative process. Consider Apple. In the wake of Steve Jobs’s death, we’ve seen a profusion of myths about the company’s success. Most focus on Mr. Jobs’s supernatural magnetism and tend to ignore the other crucial figure in Apple’s creation: a kindly, introverted engineering wizard, Steve Wozniak, who toiled alone on a beloved invention, the personal computer.

Rewind to March 1975: Mr. Wozniak believes the world would be a better place if everyone had a user-friendly computer. This seems a distant dream — most computers are still the size of minivans, and many times as pricey. But Mr. Wozniak meets a simpatico band of engineers that call themselves the Homebrew Computer Club. The Homebrewers are excited about a primitive new machine called the Altair 8800. Mr. Wozniak is inspired, and immediately begins work on his own magical version of a computer. Three months later, he unveils his amazing creation for his friend, Steve Jobs. Mr. Wozniak wants to give his invention away free, but Mr. Jobs persuades him to co-found Apple Computer.

The story of Apple’s origin speaks to the power of collaboration. Mr. Wozniak wouldn’t have been catalyzed by the Altair but for the kindred spirits of Homebrew. And he’d never have started Apple without Mr. Jobs.

But it’s also a story of solo spirit. If you look at how Mr. Wozniak got the work done — the sheer hard work of creating something from nothing — he did it alone. Late at night, all by himself.

Intentionally so. In his memoir, Mr. Wozniak offers this guidance to aspiring inventors:

“Most inventors and engineers I’ve met are like me … they live in their heads. They’re almost like artists. In fact, the very best of them are artists. And artists work best alone …. I’m going to give you some advice that might be hard to take. That advice is: Work alone… Not on a committee. Not on a team.”

by Susan Cain, NY Times |  Read more:
Illustration: Andy Rementer

Universum, Flammarion woodcut.
via:

Ernest Ranglin


Why Black is Addictive

Towards the end of the last century, a friend of mine took a taxi to London Fashion Week. The driver gawped in puzzlement at the moving sea of people dressed head-to-toe in black, and asked: “What’s that, then? Some religious cult?”

He had a point. There is something bordering on the cultish in fashion’s devotion to the colour black—it’s the equivalent of white for Moonies or orange for Hare Krishnas. Since that taxi journey in the 1990s the wardrobes of the stylish have brightened up a bit, but although trends such as colour blocking or floral prints may float by on the surface current, underneath there is a deeper, darker tide that pulls us back towards black. Despite pronouncements at intervals by the fashion industry that red or pink or blue is the new black, the old black is still very much with us.

Visiting eBay, the auction website, confirms this. A search in “Clothes, Shoes and Accessories” for the word “black” yields more than 3m items—that’s twice as many as “blue”, and five or six times as many as “brown” or “grey”. This ratio remains more or less the same in winter and summer, and when you narrow the search to “women’s clothing”. (Black also predominates in men’s clothing, though there’s slightly more blue.) A pedant might argue that these are the clothes that people are trying to get rid of—certainly if they were all thrown away we’d be left with a very large, black mountain. But the website of the upmarket fashion retailer Net-a-Porter tells the same story, with black significantly more dominant in its wares, be it January or June.

What is it about, this infatuation with black? It's a question I am often asked, since I wear black most of the time, and therefore one upon which I have spent much time reflecting. My friends and colleagues might say I wear little else, though it doesn't feel like that to me—I wear colours sometimes, particularly in summer, but black is what I feel most comfortable in. Putting on black in the morning feels as natural as breathing. If I enter a clothes shop, I am drawn towards the rails of black. I will happily wear black to weddings as well as funerals. I own black sandals and black sundresses. I even wore black when I was nine months pregnant in a July heatwave. This habit of mine is an adult-onset condition, which developed when I spent a dangerously long time working at British Vogue magazine; I didn't work in the fashion department, but I absorbed black osmotically. I know I'm far from alone in my preference for wearing black, so—for all those others who are asked why they wear so much black, as well as for myself—I'll try to answer that question here once and for all.

To do that means asking some other questions about black’s significance in our society generally. How is it that black can betoken both oppression (the Nazis and Fascists) and also the rebellion of youth (punks and goths)? How can it be the distinctive feature of religious garments (nuns, priests, Hassidic Jews), and also of rubber and bondage fetishists? Why is it the uniform of dons and anorexics alike, of waiters and witches, of judges and suicide-bombers? No colour performs so many duties, in so many fields of clothing—smart, casual, uniform, anti-uniform—as black does. It is uniquely versatile and flexible. How, exactly, does my friend and ally pull that off?

by Rebecca Willis, Intelligent Life |  Read more:
Fashion Photography by Sean Gleason

Universal Flu Vaccine Could Be Available by 2013

Annual flu shots might soon become a thing of the past, and threats such as avian and swine flu might disappear with them, as a vaccine touted as the "holy grail" of flu treatment could be ready for human trials next year.

That's earlier than the National Institutes of Health estimated in 2010, when they said a universal vaccine could be five years off. By targeting the parts of the virus that rarely mutate, researchers believe they can develop a vaccine similar to the mumps or measles shot—people will be vaccinated as children and then receive boosters later.

That differs from the current '60s-era technology, according to Joseph Kim, head of Inovio Pharmaceuticals, which is working on the universal vaccine. Each year, the seasonal flu vaccine targets three or four strains that researchers believe will be the most common that year. Previous seasons' vaccines have no effect on future strains of the virus, because it mutates quickly. The seasonal vaccine also offers no protection against outbreaks, such as 2009's H1N1 swine flu. A universal vaccine would offer protection against all forms of the virus.

"It's like putting up a tent over your immune system that protects against rapidly mutating viruses," Kim says. At least two other companies are working on a similar vaccine. In late 2010, Inovio earned a $3.1 million grant from the National Institutes of Health to work on the vaccine.

"It's a completely different paradigm than how [the vaccines] are made seasonably every year," Kim says.

by Jason Koebler, US News |  Read more:

The Hacker is Watching


Melissa wondered why her goof-off sister was IM'ing from the next room instead of just padding over—she wasn't usually that lazy—so she walked over to see what was up. Suzy just shrugged. She had no idea what her sister was talking about. Yeah, the IM had come from her account, but she hadn't sent it. Honest.

That night, Suzy's 20-year-old friend Nila Westwood got the same note, the same attachment. Unlike Melissa, she opened it, expecting, say, a video of some guy stapling his lip to his chin on YouTube. She waited. Nothing. When she called her friend to see what she'd missed, things actually got freaky: Suzy'd never sent a thing. The girls pieced together the clues and agreed: Suzy's AOL account had been hacked. For the next couple of weeks, the girls remained watchful for malware, insidious software capable of wreaking all sorts of havoc. But with no sign of trouble on their machines—no slow performance, no deleted files, no alerts from antivirus programs—they pretty much forgot about it.

A month passed. Suzy, Melissa, and Nila went about their lives online and off. They chatted with friends, posted pictures, and when they were tired, stretched out on their beds to rest. But at some point, each of them looked up and noticed the same strange thing: the tiny light beside their webcam glowing. At first they figured it was some kind of malfunction, but when it happened repeatedly—the light flicking on, then off—the girls felt a chill. One by one, they gazed fearfully into the lenses, wondering if someone was watching and if, perhaps now, they were looking into the eye of something scary after all. Nila, for one, wasn't taking any chances. She peeled off a sticker and stuck it on the lens.

by David Kushner, GQ |  Read more:
Photographs by Jason Madara

Sunday, January 15, 2012

In Defense of Hippies

Progressives and mainstream Democratic pundits disagree with each other about many issues at the heart of the Occupy Wall Street protests, but with few exceptions they are joined in their contempt for drum circles, free hugs, and other behavior in Zuccotti Park that smacks of hippie culture.

In a post for the Daily Beast, Michelle Goldberg lamented, “Drum circles and clusters of earnest incense-burning meditators ensure that stereotypes about the hippie left remain alive.” At Esquire, Charles Pierce worried that few could “see past all the dreadlocks and hear…over the drum circles.” Michael Smerconish asked on the MSNBC show Hardball if middle Americans “in their Barcalounger” could relate to drum circles. The New Republic’s Alex Klein chimed in, “In the course of my Friday afternoon occupation, I saw two drum circles, four dogs, two saxophones, three babies....Wall Street survived.” And the host of MSNBC’s Up, Chris Hayes (editor at large of the Nation), recently reassured his guests Naomi Klein and Van Jones that although he supported the political agenda of the protest, he wasn’t going to “beat the drum” or “give you a free hug,” to knowing laughter.

Yet it is precisely the mystical utopian energy that most professional progressives so smugly dismiss that has aroused a salient, mass political consciousness on economic issues—something that had eluded even the most lucid progressives in the Obama era.

Since the mythology of the 1960s hangs over so much of the analysis of the Wall Street protests, it’s worth reviewing what actually happened then. Media legend lumps sixties radicals and hippies together, but from the very beginning most leaders on the left looked at the hippie culture as, at best, a distraction and, at worst, a saboteur of pragmatic progressive politics. Hippies saw most radicals as delusional and often dangerously angry control freaks. Bad vibes.

Not that there is anything magic about the word “hippie.” Over the years it has been distorted by parody, propaganda, self-hatred, and, from its earliest stirrings, commercialism. In some contemporary contexts it is used merely to refer to people living in the past and/or those who are very stoned.

by Danny Goldberg, Dissent |  Read more:
Image: Woodstock, 1969 (Wikimedia Commons)

Curtis Wilson Cost, Gathering Darkness
via:

Friday, January 13, 2012

Do We Really Want Immortality?

Suppose you had a chance to question an ancient Greek or Roman -- or any of our distant ancestors, for that matter. Let's say you asked them to list the qualities of a deity.

It's a pretty good bet that many of the "god-like" traits he or she described might seem trivial nowadays.

After all, we think little of flying through the air. We fill pitch-dark areas with sudden lavish light, by exerting a mere twitch of a finger. Average folks routinely send messages or observe events taking place far across the globe. Copious and detailed information about the universe is readily available through crystal tubes many of us keep on our desks and command like genies. Some modern citizens can even hurl lightning, if we choose to annoy our neighbors and the electric company.

Few of us deem these powers to be miraculous, because they've been acquired by nearly everyone in prosperous nations. After all, nobody respects a gift if everybody has it. And yet, these are some of the very traits that earlier generations associated with divine beings.

Even so, we remain mortal. Our obsession with that fate is as intense as it was in the time of Gilgamesh. Perhaps more, since we overcame so many other obstacles that thwarted our ancestors.

Will our descendants conquer the last barriers standing between humanity and Olympian glory? Or may we encounter hurdles too daunting even for our brilliant, arrogant, ingenious and ever-persevering species?

There can be no better topic for this contemplation -- the last in a series commissioned for iPlanet -- about our future in the coming millennium. Essay number one cast perspective on our accomplishments during the Twentieth Century, and the second dealt with near-term dilemmas we may face in the twenty-first. Now let's take the long view, exploring the possibility that our great grandchildren will be "great" in every sense of the word... and have problems to match.

by David Brin, Sentient Developments |  Read more:

Who Pinched My Ride?


I used to stay up late watching the film of my bicycle being stolen. It’s amazing what you notice on the 38th replay of a surveillance tape, running the grainy recording backward and forward, pausing and advancing. Sometimes I’d back the tape up to before the 17 minutes that changed my life. All the way back to the part where I still had a bicycle.

Rewinding—past all the New Yorkers striding backward toward lunch; past the Algonquin and Royalton hotels inhaling crowds and the door of the Harvard Club admitting well-fed members; past the New York Yacht Club looming impassively like a beached galleon; past all the finery and civility of West 44th Street—you come to the beginning. You come to him.

The thief. There he is. Caught, if only on tape. (...)

I want my bike back. So do we all. With the rise of the bicycle age has come a rise in bicycle robbery: FBI statistics claim that 204,000 bicycles were stolen nationwide in 2010, but those are only the documented thefts. Transportation Alternatives, a bicycle advocacy group in New York City, estimates the unreported thefts at four or five times that—more than a million bikes a year. New York alone probably sees more than 100,000 bikes stolen annually. Whether in big biking cities like San Francisco and Portland, Oregon, or in sport-loving suburbs and small towns, theft is “one of the biggest reasons people don’t ride bikes,” Noah Budnick, deputy director of Transportation Alternatives, told me. Although bike commuting has increased by 100 percent in New York City during the past seven years, the lack of secure bike parking was ranked alongside bad drivers and traffic as a primary deterrent to riding more. It’s all about the (stolen) bike; even Lance Armstrong had his custom time-trial Trek nicked from the team van in 2009 after a race in California. Not every bike is that precious, but according to figures from the FBI and the National Bike Registry, the value of stolen bikes is as much as $350 million a year.

That’s a lot of bike. Stolen bicycles have become a solvent in America’s underground economy, a currency in the world of drug addicts and petty thieves. Bikes are portable and easily converted to cash, and they usually vanish without a trace—in some places, only 5 percent are even reported stolen. Stealing one is routinely treated as a misdemeanor, even though, in the age of electronic derailleurs and $5,000 coffee-shop rides, many bike thefts easily surpass the fiscal definition of felony, which varies by state but is typically under the thousand-dollar mark. Yet police departments are reluctant to pull officers from robberies or murder investigations to hunt bike thieves. Even when they do, DAs rarely prosecute the thieves the police bring in.

by Patrick Symmes, Outside |  Read more:
Photo: Jake Stangel

Smart Windows


This transparent screen will fit any window up to 46 inches at a resolution of 1366 x 768. The thing is completely see-through, but what you’re viewing on the screen stays private from anyone outside. It’s fully controlled by your touch, and reminds us of a scene right out of Minority Report or Mission: Impossible.

via:

Thursday, January 12, 2012

Alaska wildlife conservation director charged with helping illegally kill bears

The director of the Alaska Division of Wildlife Conservation has been charged with 12 counts of illegal hunting related to guiding activities in the bear-rich forests on the north side of Cook Inlet across from Anchorage, according to Alaska State Troopers.

Troopers on Thursday issued a statement saying Corey L. Rossi, 51, of Palmer took two out-of-state men on a bear hunt in the early summer of 2008 and then covered up their kills. Rossi was at the time a licensed assistant guide on the verge of joining the administration of then-Gov. Sarah Palin.

Rossi was not immediately available for comment.

A former predator control officer for the U.S. Department of Agriculture, Rossi is a longtime friend of Chuck and Sally Heath, Palin's parents. After Palin took office in 2007, Sally lobbied her daughter to have Rossi named commissioner of the Alaska Department of Fish and Game. The commissioner oversees all wildlife and fisheries management in the state. Sally Heath, in an email to Palin, noted that almost everyone would object to Rossi as unqualified, but added those "are the very same people who said the same thing about you."

Rossi did not get the commissioner's job, but a special job -- assistant commissioner for abundance management -- was created for him within state government. He moved into the newly created job in December 2008, just months after his alleged illegal bear hunt. Gov. Sean Parnell promoted Rossi to wildlife director in March 2010. A staunch advocate of killing predators -- wolves and bears -- to boost prey populations of moose and caribou within the state, Rossi has been unpopular with many in the agency he runs.

His qualifications have repeatedly been called into question. He lacks a college degree and his prime professional association with wildlife has involved killing rats and foxes in the Aleutian Islands. His associations with Alaska's big-game guiding industry have also raised suspicions. Rossi has continued to work as a guide while employed in the wildlife division by exploiting a loophole in a state policy that bans wildlife division employees from that business.

by Craig Medred, Alaska Dispatch |  Read more:

Lisa Tognon, “Rives”, chine collé, etching, drypoint
via:

Breast Implants: the First 50 Years

It was in 1962 that Timmie Jean Lindsey was offered a solution to a non-existent problem. A factory worker from Texas, she had married at 15, had six children, divorced in her mid-20s, and taken up with a man who encouraged her to have a vine tattooed on her cleavage. Roses tumbled across her breasts. When the relationship faltered, Lindsey decided she wanted the tattoos removed. "I was ashamed," she says, "and I needed them taken off." Her low-paid work made her eligible for treatment at a charity hospital, where she was told the tattoo could be removed through dermabrasion. And the doctors had another proposal. Had she ever thought about breast implants?

Lindsey had not. She'd never felt self-conscious about her breasts – and even if she had, the options at that time were primitive and problematic, involving substances injected directly into women's chests, or implants made of sponge. "The only person I'd ever talked to about breast implants was my cousin," says Lindsey, "who had had some kind of surgery. She said: 'Sometimes I wake up and my breast has moved to another part of my body,' and I thought: 'My God. I never want that.' It wasn't long after she and I talked that I came into contact with these doctors."

The team was led by Dr Thomas Cronin, who had been developing the world's first silicone breast implants. Thomas Biggs, then 29, and a surgical resident under Cronin, says the idea came about when one of his colleagues, Frank Gerow, went to the blood bank. "They'd stopped putting liquids in glass bottles, and begun putting them into plastic bags," says Biggs, "and he was walking in the hall with this bag of blood, and felt that it had the softness of a breast." Around the same time, Cronin travelled "to New Orleans to a plastic surgery meeting and encountered a former resident of his. This fellow told him there was a company who had a new product which was interesting because it had very little body reaction, and could be made into a variety of thicknesses, a variety of viscosities, all the way from liquid to solid. If you can make a solid, you can make a bag – and if you can make a liquid, you can make something that goes in it."

Cronin had the idea for a breast implant. A prototype was created, and implanted into a dog called Esmeralda. "That worked OK," says Biggs, "and so then they got to Timmie Lindsey." After spending some time with the doctors, she says, "they asked me if I wanted implants, and I said: 'Well, I don't really know.' The only thing I'd ever thought about changing was my ears. I told them I'd rather have my ears fixed than to have new breasts, and they said, well, they'd fix that too. So I said, OK. When they put the implants in they said: 'Do you want to see them?' and I said: 'No, I don't want to look at it. You put it in me, and it'll be out of sight, out of mind.' My theory was that if you think you've got something foreign inside you, you're just going to worry about it." She's 80 today, still living in Texas, working night shifts in a care home, and those first, experimental globes remain in her chest.

The 50-year history of breast implants had begun, a history of controversy and success. What no one knew back then was just how phenomenally popular breast augmentation surgery would become – the last available figures from the American Society for Aesthetic Plastic Surgery show it was the most popular form of cosmetic surgery in the US in 2010, with 318,123 augmentations performed. It is also the most popular cosmetic operation in the UK. While there are no overall figures for cosmetic surgery here, those collected by the British Association of Aesthetic Plastic Surgeons (BAAPS), which represent around a third of the market, show 9,418 women had breast augmentation in 2010, a rise of more than 10% from the previous year.

by Kira Cochrane, Guardian |  Read more:
Photograph: Stockbyte/Getty Images