Thursday, January 19, 2012

In Fight Over Piracy Bills, New Economy Rises Against Old


When the powerful world of old media mobilized to win passage of an online antipiracy bill, it marshaled the reliable giants of K Street — the United States Chamber of Commerce, the Recording Industry Association of America and, of course, the motion picture lobby, with its new chairman, former Senator Christopher J. Dodd, the Connecticut Democrat and an insider’s insider.

Yet on Wednesday this formidable old guard was forced to make way for the new as Web powerhouses backed by Internet activists rallied opposition to the legislation through Internet blackouts and cascading criticism, sending an unmistakable message to lawmakers grappling with new media issues: Don’t mess with the Internet.

As a result, the legislative battle over two once-obscure bills to combat the piracy of American movies, music, books and writing on the World Wide Web may prove to be a turning point for the way business is done in Washington. It represented a moment when the new economy rose up against the old.

“I think it is an important moment in the Capitol,” said Representative Zoe Lofgren, Democrat of California and a leading opponent of the legislation. “Too often, legislation is about competing business interests. This is way beyond that. This is individual citizens rising up.”

It appeared by Wednesday evening that Congress would follow Bank of America, Netflix and Verizon as the latest institution to change course in the face of a netizen revolt.

by Jonathan Weisman, NY Times |  Read more:

Wednesday, January 18, 2012

Tuesday, January 17, 2012

Overthrow of the Kingdom of Hawaii

[ed. On January 17, 1893.]

Liliuokalani was the last queen of the Hawaiian Islands. Her rule lasted from 1891 to 1895. She was born Lydia Paki Kamekeha Liliuokalani in 1838. Her parents were councillors to King Kamehameha III. Young Lydia attended the Royal School which was run by American missionaries. In 1862 she married John Owen Dominis but he died shortly after she ascended the throne.

Liliuokalani's brother, King David Kalakaua, ascended the throne in 1874. He gave much governing power to a cabinet composed of Americans. As a result, a new constitution was passed which gave voting rights to foreign residents but denied those rights to most native Hawaiians. Liliuokalani succeeded to the throne upon the death of her brother in 1891. When she attempted to restore some of the power the monarchy had lost during her brother's reign, she encountered a revolt by the American colonists who controlled most of Hawaii's economy. In 1893, U.S. Marines called in by the U.S. minister occupied the government buildings in Honolulu and deposed the queen. The colonists, led by Sanford Dole, applied for the annexation of the islands to the United States. Queen Liliuokalani appealed to President Grover Cleveland for reinstatement.

Ignoring President Cleveland's orders, Dole established a provisional government in Hawaii. His forces put down a revolt by the royalists and jailed many of the queen's supporters. In 1895 Queen Liliuokalani was placed under house arrest in Iolani Palace for eight months, after which she abdicated in return for the release of her jailed supporters. In 1898 the Hawaiian Islands were formally annexed to the United States. That same year Queen Liliuokalani composed the song "Aloha Oe" as a farewell to her country. After her release she lived as a private citizen at Washington Place (320 South Beretania Street) in Honolulu until her death in 1917.

Overthrow of the Hawaiian Government |  Read more:
Biography of Queen Liliuokalani via:

Bargaining for a Child’s Love

Economic malaise and political sloganeering have contributed to the increasingly loud conversation about the coming crisis of old-age care: the depletion of the Social Security trust fund, the ever rising cost of Medicare, the end of defined-benefit pensions, the stagnation of 401(k)’s. News accounts suggest that overstretched and insufficient public services are driving adult children “back” toward caring for dependent parents.

Such accounts often draw on a deeply sentimental view of the past. Once upon a time, the story line goes, family members cared for one another naturally within households, in an organic and unplanned process. But this portrait is too rosy. If we confront what old-age support once looked like — what actually happened when care was almost fully privatized, when the old depended on their families, without the bureaucratic structures and the (under)paid caregivers we take for granted — a different picture emerges.

For the past decade I have been researching cases of family conflict over old-age care in the decades before Social Security. I have found extraordinary testimony about the intimate management of family care: how the old negotiated with the young for what they called retirement, and the exertions of caregiving at a time when support by relatives was the only sustenance available for the old.

In that world, older people could not rely on habit or culture or nature if they wanted their children to support them when they became frail. In an America strongly identified with economic and physical mobility, parents had to offer inducements. Usually, the bait they used was the promise of an inheritance: stay and take care of me and your mother, and someday you will get the house and the farm or the store or the bank account.

But of course what was at stake was never just an economic bargain between rational actors. Older people negotiated with the young to receive love, to be cared for with affection, not just self-interest.

The bargains that were negotiated were often unstable and easily undone. Life expectancies were considerably lower than they are now, but even so, old age could easily stretch for decades. Of course, disease, injury, disability, dementia, insanity, incontinence — not to mention sudden death — were commonplace, too. Wills would be left unwritten, deeds unconveyed, promises unfulfilled, because of the onset of dementia or the meddling of siblings. Or property was conveyed too early, and then the older person would be at the mercy of a child who no longer “cared” — or who could not deal with the work of care.

Consider one story, drawn from a court case in New Jersey that ended in 1904. George H. Slack had been a carpenter and a contractor in Trenton, living in a house with his wife, their daughter, Ella Rees, and her husband and daughter.

by Hendrik Hartog, NY Times |  Read more:

Do Sports Build Character or Damage It?

Do sports build character? For those of us who claim to be educators, it's important to know. Physical-education teachers, coaches, boosters, most trustees, and the balance of alumni seem sure that they do. And so they push sports, sports, and more sports. As for professors, they often see sports as a diversion from the real business of education—empty, time-wasting, and claiming far too much of students' attention. It often seems that neither the boosters nor the bashers want to go too far in examining their assumptions about sports.

But in fact, sports are a complex issue, and it's clear that we as a culture don't really know how to think about them. Public confusion about performance-enhancing drugs, the dangers of concussions in football and of fighting in hockey, and the recent molestation scandal at Penn State suggest that it might be good to pull back and consider the question of athletics and education—of sports and character-building—a bit more closely than we generally do.

The first year I played high-school football, the coaches were united in their belief that drinking water on the practice field was dangerous. It made you cramp up, they told us. It made you sick to your stomach, they said. So at practice, which went on for two and a half hours, twice a day, during a roaring New England summer, we got no water. Players cramped up anyway; players got sick to their stomachs regardless. Players fell on their knees and began making soft, plaintive noises; they were helped to their feet, escorted to the locker room, and seen no more.

On the first day of double practice sessions, there were about 120 players—tough Irish and Italian kids and a few blacks—and by the end of the 12-day ordeal, there were 60 left. Some of us began without proper equipment. I started without cleats. But that was not a problem: Soon someone who wore your shoe size would quit, and then you could have theirs.

The coaches didn't cut anyone from the squad that year. Kids cut themselves. Guys with what appeared to be spectacular athletic talent would, after four days of double-session drills, walk hangdog into the coaches' locker room and hand over their pads. When I asked one of them why he quit, he said simply, "I couldn't take it."

Could I? There was no reason going in to think that I would be able to. I was buttery soft around the waist, nearsighted, not especially fast, and not agile at all. It turned out that underneath the soft exterior, I had some muscle, and that my lung capacity was well developed, probably from vicious bouts of asthma I'd had as a boy. But compared with those of my fellow ballplayers, my physical gifts were meager. What I had was a will that was anything but weak. It was a surprise to me, and to everyone who knew me, how ferociously I wanted to stay with the game.

Did I love the game? I surely liked it. I liked how, when I was deep in fatigue, I became a tougher, more daring person, even a reckless one. One night, scrimmaging, I went head-on with the star running back, a guy who outweighed me by 20 pounds and was far faster and stronger. I did what the coaches said: I squared up, got low (in football, the answer to every difficulty is to get low, or get lower), and planted him. I did that?, I asked myself. I liked being the guy who could do that—sometimes, though alas not often enough. The intensity of the game was inebriating. It conquered my grinding self-consciousness, brought me out of myself.

I liked the transforming aspect of the game: I came to the field one thing—a diffident guy with a slack body—and worked like a dog and so became something else—a guy with some physical prowess and more faith in himself. Mostly, I liked the whole process because it was so damned hard. I didn't think I could make it, and no one I knew did either. My parents were ready to console me if I came home bruised and dead weary and said that I was quitting. In time, one of the coaches confessed to me that he was sure I'd be gone in a few days. I had not succeeded in anything for a long time: I was a crappy student; socially I was close to a wash; my part-time job was scrubbing pans in a hospital kitchen; the first girl I liked in high school didn't like me; the second and the third followed her lead. But football was something I could do, though I was never going to be anything like a star. It was hard, it took some strength of will, and—clumsily, passionately—I could do it.

Over time, I came to understand that the objective of the game, on the deepest level, wasn't to score spectacular touchdowns or make bone-smashing tackles or block kicks. The game was much more about practice than about the Saturday-afternoon contests. And practice was about trying to do something over and over again, failing and failing, and then finally succeeding part way. Practice was about showing up and doing the same drills day after day and getting stronger and faster by tiny, tiny increments, and then discovering that by the end of the season you were effectively another person.

But mostly football was about those first days of double sessions when everyone who stuck with it did something he imagined was impossible, and so learned to recalibrate his instruments. In the future, what immediately looked impossible to us—what said Back Off, Not for You—had to be looked at again and maybe attempted anyway.

by Mark Edmundson, Chronicle of Higher Education |  Read more:
Photo: Chris and Adrienne Scott

Monday, January 16, 2012

Massive Attack



Snoozing Hawaiian monk seal.
photo: markk

Talking Heads

The Rise of the New Groupthink

Solitude is out of fashion. Our companies, our schools and our culture are in thrall to an idea I call the New Groupthink, which holds that creativity and achievement come from an oddly gregarious place. Most of us now work in teams, in offices without walls, for managers who prize people skills above all. Lone geniuses are out. Collaboration is in.

But there’s a problem with this view. Research strongly suggests that people are more creative when they enjoy privacy and freedom from interruption. And the most spectacularly creative people in many fields are often introverted, according to studies by the psychologists Mihaly Csikszentmihalyi and Gregory Feist. They’re extroverted enough to exchange and advance ideas, but see themselves as independent and individualistic. They’re not joiners by nature.

One explanation for these findings is that introverts are comfortable working alone — and solitude is a catalyst to innovation. As the influential psychologist Hans Eysenck observed, introversion fosters creativity by “concentrating the mind on the tasks in hand, and preventing the dissipation of energy on social and sexual matters unrelated to work.” In other words, a person sitting quietly under a tree in the backyard, while everyone else is clinking glasses on the patio, is more likely to have an apple land on his head. (Newton was one of the world’s great introverts: William Wordsworth described him as “A mind for ever/ Voyaging through strange seas of Thought, alone.”)

Solitude has long been associated with creativity and transcendence. “Without great solitude, no serious work is possible,” Picasso said. A central narrative of many religions is the seeker — Moses, Jesus, Buddha — who goes off by himself and brings profound insights back to the community.

Culturally, we’re often so dazzled by charisma that we overlook the quiet part of the creative process. Consider Apple. In the wake of Steve Jobs’s death, we’ve seen a profusion of myths about the company’s success. Most focus on Mr. Jobs’s supernatural magnetism and tend to ignore the other crucial figure in Apple’s creation: a kindly, introverted engineering wizard, Steve Wozniak, who toiled alone on a beloved invention, the personal computer.

Rewind to March 1975: Mr. Wozniak believes the world would be a better place if everyone had a user-friendly computer. This seems a distant dream — most computers are still the size of minivans, and many times as pricey. But Mr. Wozniak meets a simpatico band of engineers that call themselves the Homebrew Computer Club. The Homebrewers are excited about a primitive new machine called the Altair 8800. Mr. Wozniak is inspired, and immediately begins work on his own magical version of a computer. Three months later, he unveils his amazing creation for his friend, Steve Jobs. Mr. Wozniak wants to give his invention away free, but Mr. Jobs persuades him to co-found Apple Computer.

The story of Apple’s origin speaks to the power of collaboration. Mr. Wozniak wouldn’t have been catalyzed by the Altair but for the kindred spirits of Homebrew. And he’d never have started Apple without Mr. Jobs.

But it’s also a story of solo spirit. If you look at how Mr. Wozniak got the work done — the sheer hard work of creating something from nothing — he did it alone. Late at night, all by himself.

Intentionally so. In his memoir, Mr. Wozniak offers this guidance to aspiring inventors:

“Most inventors and engineers I’ve met are like me … they live in their heads. They’re almost like artists. In fact, the very best of them are artists. And artists work best alone …. I’m going to give you some advice that might be hard to take. That advice is: Work alone… Not on a committee. Not on a team.”

by Susan Cain, NY Times |  Read more:
Illustration: Andy Rementer

Universum (Flammarion engraving). Woodcut.
via:

Ernest Ranglin


Why Black is Addictive

Towards the end of the last century, a friend of mine took a taxi to London Fashion Week. The driver gawped in puzzlement at the moving sea of people dressed head-to-toe in black, and asked: “What’s that, then? Some religious cult?”

He had a point. There is something bordering on the cultish in fashion’s devotion to the colour black—it’s the equivalent of white for Moonies or orange for Hare Krishnas. Since that taxi journey in the 1990s the wardrobes of the stylish have brightened up a bit, but although trends such as colour blocking or floral prints may float by on the surface current, underneath there is a deeper, darker tide that pulls us back towards black. Despite pronouncements at intervals by the fashion industry that red or pink or blue is the new black, the old black is still very much with us.

Visiting eBay, the auction website, confirms this. A search in “Clothes, Shoes and Accessories” for the word “black” yields more than 3m items—that’s twice as many as “blue”, and five or six times as many as “brown” or “grey”. This ratio remains more or less the same in winter and summer, and when you narrow the search to “women’s clothing”. (Black also predominates in men’s clothing, though there’s slightly more blue.) A pedant might argue that these are the clothes that people are trying to get rid of—certainly if they were all thrown away we’d be left with a very large, black mountain. But the website of the upmarket fashion retailer Net-a-Porter tells the same story, with black significantly more dominant in its wares, be it January or June.

What is it about, this infatuation with black? It’s a question I am often asked, since I wear black most of the time, and therefore one upon which I have spent much time reflecting. My friends and colleagues might say I wear little else, though it doesn’t feel like that to me—I wear colours sometimes, particularly in summer, but black is what I feel most comfortable in. Putting on black in the morning feels as natural as breathing. If I enter a clothes shop, I am drawn towards the rails of black. I will happily wear black to weddings as well as funerals. I own black sandals and black sundresses. I even wore black when I was nine months pregnant in a July heatwave. This habit of mine is an adult-onset condition, which developed when I spent a dangerously long time working at British Vogue magazine; I didn’t work in the fashion department, but I absorbed black osmotically. I know I’m far from alone in my preference for wearing black, so—for all those others who are asked why they wear so much black, as well as for myself—I’ll try to answer that question here for once and for all.

To do that means asking some other questions about black’s significance in our society generally. How is it that black can betoken both oppression (the Nazis and Fascists) and also the rebellion of youth (punks and goths)? How can it be the distinctive feature of religious garments (nuns, priests, Hassidic Jews), and also of rubber and bondage fetishists? Why is it the uniform of dons and anorexics alike, of waiters and witches, of judges and suicide-bombers? No colour performs so many duties, in so many fields of clothing—smart, casual, uniform, anti-uniform—as black does. It is uniquely versatile and flexible. How, exactly, does my friend and ally pull that off?

by Rebecca Willis, Intelligent Life |  Read more:
Fashion Photography by Sean Gleason

Universal Flu Vaccine Could Be Available by 2013

Annual flu shots might soon become a thing of the past, and threats such as avian and swine flu might disappear with them as a vaccine touted as the "holy grail" of flu treatment could be ready for human trials next year.

That's earlier than the National Institutes of Health estimated in 2010, when they said a universal vaccine could be five years off. By targeting the parts of the virus that rarely mutate, researchers believe they can develop a vaccine similar to the mumps or measles shot—people will be vaccinated as children and then receive boosters later.

That differs from the current '60s-era technology, according to Joseph Kim, head of Inovio Pharmaceuticals, which is working on the universal vaccine. Each year, the seasonal flu vaccine targets three or four strains that researchers believe will be the most common that year. Previous seasons' vaccines have no effect on future strains of the virus, because it mutates quickly. The seasonal vaccine also offers no protection against outbreaks, such as 2009's H1N1 swine flu. A universal vaccine would offer protection against all forms of the virus.

"It's like putting up a tent over your immune system that protects against rapidly mutating viruses," Kim says. At least two other companies are working on a similar vaccine. In late 2010, Inovio earned a $3.1 million grant from the National Institutes of Health to work on the vaccine.

"It's a completely different paradigm than how [the vaccines] are made seasonably every year," Kim says.

by Jason Koebler, US News |  Read more:

The Hacker is Watching


Melissa wondered why her goof-off sister was IM'ing from the next room instead of just padding over—she wasn't usually that lazy—so she walked over to see what was up. Suzy just shrugged. She had no idea what her sister was talking about. Yeah, the IM had come from her account, but she hadn't sent it. Honest.

That night, Suzy's 20-year-old friend Nila Westwood got the same note, the same attachment. Unlike Melissa, she opened it, expecting, say, a video of some guy stapling his lip to his chin on YouTube. She waited. Nothing. When she called her friend to see what she'd missed, things actually got freaky: Suzy'd never sent a thing. The girls pieced together the clues and agreed: Suzy's AOL account had been hacked. For the next couple of weeks, the girls remained watchful for malware, insidious software capable of wreaking all sorts of havoc. But with no sign of trouble on their machines—no slow performance, no deleted files, no alerts from antivirus programs—they pretty much forgot about it.

A month passed. Suzy, Melissa, and Nila went about their lives online and off. They chatted with friends, posted pictures, and when they were tired, stretched out on their beds to rest. But at some point, each of them looked up and noticed the same strange thing: the tiny light beside their webcam glowing. At first they figured it was some kind of malfunction, but when it happened repeatedly—the light flicking on, then off—the girls felt a chill. One by one, they gazed fearfully into the lenses, wondering if someone was watching and if, perhaps now, they were looking into the eye of something scary after all. Nila, for one, wasn't taking any chances. She peeled off a sticker and stuck it on the lens.

by David Kushner, GQ |  Read more:
Photographs by Jason Madara

Sunday, January 15, 2012

In Defense of Hippies

Progressives and mainstream Democratic pundits disagree with each other about many issues at the heart of the Occupy Wall Street protests, but with few exceptions they are joined in their contempt for drum circles, free hugs, and other behavior in Zuccotti Park that smacks of hippie culture.

In a post for the Daily Beast Michelle Goldberg lamented, “Drum circles and clusters of earnest incense-burning meditators ensure that stereotypes about the hippie left remain alive.” At Esquire, Charles Pierce worried that few could “see past all the dreadlocks and hear…over the drum circles.” Michael Smerconish asked on the MSNBC show Hardball if middle Americans “in their Barcalounger” could relate to drum circles. The New Republic’s Alex Klein chimed in, “In the course of my Friday afternoon occupation, I saw two drum circles, four dogs, two saxophones, three babies....Wall Street survived.” And the host of MSNBC’s Up, Chris Hayes (editor at large of the Nation), recently reassured his guests Naomi Klein and Van Jones that although he supported the political agenda of the protest he wasn’t going to “beat the drum” or “give you a free hug,” to knowing laughter.

Yet it is precisely the mystical utopian energy that most professional progressives so smugly dismiss that has aroused a salient, mass political consciousness on economic issues—something that had eluded even the most lucid progressives in the Obama era.

Since the mythology of the 1960s hangs over so much of the analysis of the Wall Street protests, it’s worth reviewing what actually happened then. Media legend lumps sixties radicals and hippies together, but from the very beginning most leaders on the left looked at the hippie culture as, at best, a distraction and, at worst, a saboteur of pragmatic progressive politics. Hippies saw most radicals as delusional and often dangerously angry control freaks. Bad vibes.

Not that there is anything magic about the word “hippie.” Over the years it has been distorted by parody, propaganda, self-hatred, and, from its earliest stirrings, commercialism. In some contemporary contexts it is used merely to refer to people living in the past and/or those who are very stoned.

by Danny Goldberg, Dissent |  Read more:
Image: Woodstock, 1969 (Wikimedia Commons)

Curtis Wilson Cost, Gathering Darkness
via:

Friday, January 13, 2012

Do We Really Want Immortality?

Suppose you had a chance to question an ancient Greek or Roman -- or any of our distant ancestors, for that matter. Let's say you asked them to list the qualities of a deity.

It's a pretty good bet that many of the "god-like" traits he or she described might seem trivial nowadays.

After all, we think little of flying through the air. We fill pitch-dark areas with sudden lavish light, by exerting a mere twitch of a finger. Average folks routinely send messages or observe events taking place far across the globe. Copious and detailed information about the universe is readily available through crystal tubes many of us keep on our desks and command like genies. Some modern citizens can even hurl lightning, if we choose to annoy our neighbors and the electric company.

Few of us deem these powers to be miraculous, because they've been acquired by nearly everyone in prosperous nations. After all, nobody respects a gift if everybody has it. And yet, these are some of the very traits that earlier generations associated with divine beings.

Even so, we remain mortal. Our obsession with that fate is as intense as it was in the time of Gilgamesh. Perhaps more, since we overcame so many other obstacles that thwarted our ancestors.

Will our descendants conquer the last barriers standing between humanity and Olympian glory? Or may we encounter hurdles too daunting even for our brilliant, arrogant, ingenious and ever-persevering species?

There can be no better topic for this contemplation -- the last in a series commissioned for iPlanet -- about our future in the coming millennium. Essay number one cast perspective on our accomplishments during the Twentieth Century and the second dealt with near-term dilemmas we may face in the twenty-first. Now let's take a long view, exploring the possibility that our great grandchildren will be "great" in every sense of the word... and have problems to match.

by David Brin, Sentient Developments |  Read more:

Who Pinched My Ride?


I used to stay up late watching the film of my bicycle being stolen. It’s amazing what you notice on the 38th replay of a surveillance tape, running the grainy recording backward and forward, pausing and advancing. Sometimes I’d back the tape up to before the 17 minutes that changed my life. All the way back to the part where I still had a bicycle.

Rewinding—past all the New Yorkers striding backward toward lunch; past the Algonquin and Royalton hotels inhaling crowds and the door of the Harvard Club admitting well-fed members; past the New York Yacht Club looming impassively like a beached galleon; past all the finery and civility of West 44th Street—you come to the beginning. You come to him.

The thief. There he is. Caught, if only on tape. (...)

I want my bike back. So do we all. With the rise of the bicycle age has come a rise in bicycle robbery: FBI statistics claim that 204,000 bicycles were stolen nationwide in 2010, but those are only the documented thefts. Transportation Alternatives, a bicycle advocacy group in New York City, estimates the unreported thefts at four or five times that—more than a million bikes a year. New York alone probably sees more than 100,000 bikes stolen annually. Whether in big biking cities like San Francisco and Portland, Oregon, or in sport-loving suburbs and small towns, theft is “one of the biggest reasons people don’t ride bikes,” Noah Budnick, deputy director of Transportation Alternatives, told me. Although bike commuting has increased by 100 percent in New York City during the past seven years, the lack of secure bike parking was ranked alongside bad drivers and traffic as a primary deterrent to riding more. It’s all about the (stolen) bike; even Lance Armstrong had his custom time-trial Trek nicked from the team van in 2009 after a race in California. Not every bike is that precious, but according to figures from the FBI and the National Bike Registry, the value of stolen bikes is as much as $350 million a year.

That’s a lot of bike. Stolen bicycles have become a solvent in America’s underground economy, a currency in the world of drug addicts and petty thieves. Bikes are portable and easily converted to cash, and they usually vanish without a trace—in some places, only 5 percent are even reported stolen. Stealing one is routinely treated as a misdemeanor, even though, in the age of electronic derailleurs and $5,000 coffee-shop rides, many bike thefts easily surpass the fiscal definition of felony, which varies by state but is typically under the thousand-dollar mark. Yet police departments are reluctant to pull officers from robberies or murder investigations to hunt bike thieves. Even when they do, DAs rarely prosecute the thieves the police bring in.

by Patrick Symmes, Outside |  Read more:
Photo: Jake Stangel