Thursday, January 19, 2012
Soccer's Heavy Boredom
Soccer is boring. One of the misconceptions non-soccer fans have about soccer fans is that we don't know this. The classic Simpsons parody of a soccer match — "Fast kickin'! Low scorin'! And ties? You bet!" — hangs on the joke that the game puts Americans to sleep while somehow, bafflingly, driving foreigners wild with excitement. Calling the game for Springfield TV, Kent Brockman practically grinds his teeth with frustration: "Halfback passes to the center … back to the wing … back to the center. Center holds it. Holds it. [Huge sigh.] Holds it." One booth over, the Spanish commentator is going nuts: "Halfback passes to the center! Back to the wing! Back to the center! Center holds it! Holds it!! HOLDS IT!!!"
It's a great comedy bit, but it's not really accurate as a depiction of soccer culture. Soccer fans know soccer is boring. Soccer fans have seen more soccer than anyone. We're aware that it can be a chore. Fire up Twitter during the average Stoke City-Wigan match and you'll find us making jokes about gouging out our own eyes with wire hangers, about the players forgetting where the goals are, about what would happen if we released a pride of lions onto the pitch. (Answer: The game would still finish 0-0.) When Ricky Gervais recorded his "David Brent on Football Management" clip for the BBC during the first run of The Office, he snuck in a similar dig at the tedium of some of Liverpool's greatest teams:
It's not easy passing the ball back to the goalkeeper every single time you get it. For ninety minutes. Do you think that Alan Hansen or Mark Lawrenson would have had the careers they had if they'd had the skills but not the discipline? If they didn't have the concentration?
Translation: Those guys were good. Now please, God, someone release the lions.
So why do soccer fans do this? Assuming we follow sports for something like entertainment, what do we get out of a game for which the potential for tedium is so high that some of its most famous inspirational quotes are simply about not being dull?
I keep thinking about this question lately, maybe because I've been finding myself drawn to more and more boring games. This past weekend, I sat through the slow cudgeling death of Liverpool-Stoke. The final score was 0-0, but the final emotional score was -5. During Swansea's deliriously fun 3-2 upset of Arsenal on Sunday, I kept switching over to Athletic Bilbao's mundane 3-0 win over Levante. Why am I doing this? I thought, as Fernando Amorebieta whuffed in a gloomy header and Levante pinned themselves into their own half. But I kept checking back.
There are two reasons, basically, why soccer lends itself to spectatorial boredom. One is that the game is mercilessly hard to play at a high level. (You know, what with the whole "maneuver a small ball via precisely coordinated spontaneous group movement with 10 other people on a huge field while 11 guys try to knock it away from you, and oh, by the way, you can't use your arms and hands" element.) The other is that the gameplay almost never stops — it's a near-continuous flow for 45-plus minutes at a stretch, with only very occasional resets. Combine those two factors and you have a game that's uniquely adapted for long periods of play where, say, the first team's winger goes airborne to bring down a goal kick, but he jumps a little too soon, so the ball kind of kachunks off one side of his face, then the second team's fullback gets control of it, and he sees his attacking midfielder lurking unmarked in the center of the pitch, so he kludges the ball 20 yards upfield, but by the time it gets there the first team's holding midfielder has already closed him down and gone in for a rough tackle, and while the first team's attacking midfielder is rolling around on the ground the second team's right back runs onto the loose ball, only he's being harassed by two defenders, so he tries to knock it ahead and slip through them, but one of them gets a foot to it, so the ball sproings up in the air … etc., etc., etc. Both teams have carefully worked-out tactical plans that influence everything they're trying to do. But the gameplay is so relentless that it can't help but go through these periodic bouts of semi-decomposition.
But — and here's the obvious answer to the "Why are we doing this?" question — those same two qualities, difficulty and fluidity, also mean that soccer is uniquely adapted to produce moments of awesome visual beauty. Variables converge. Players discover solutions to problems it would be impossible to summarize without math. The ball sproings up in the air … and comes down in just such a way that Dennis Bergkamp can pull off a reverse-pirouette flick that spins the ball around the defender and back into his own path … or Thierry Henry can three-touch a 40-yard pass in the air before lining it up and scoring a weak-foot roundhouse … or Zlatan Ibrahimovic can stutter-fake his way through an entire defense. In sports, pure chaos is boring. Soccer gives players more chaos to contend with than any other major sport. So there's something uniquely thrilling about the moments when they manage to impose their own order on it.
by Brian Phillips, Grantland | Read more:
Photo: Michael Steele/Getty Images
How to Build a Dog
It's an unusually balmy mid-February afternoon in New York City, but the lobby of the Hotel Pennsylvania is teeming with fur coats.
The wearers are attendees of what is undoubtedly the world's elite canine mixer, one that takes place each year on the eve of the Westminster Kennel Club dog show. Tomorrow the nation's top dogs from 173 breeds will compete for glory across the street at Madison Square Garden. But today is more akin to a four-legged meet-and-greet, as owners shuffle through the check-in line at the competition's official lodgings. A basset hound aims a droopy eye across a luggage cart at a wired-up terrier. A pair of muscled Rhodesian ridgebacks, with matching leather leashes, pause for a brief hello with a fluffy Pyrenean shepherd. Outside the gift shop a Tibetan mastiff with paws the size of human hands goes nose to nose with a snuffling pug.
The variety on display in the hotel lobby—a dizzying array of body sizes, ear shapes, nose lengths, and barking habits—is what makes dog lovers such obstinate partisans. For reasons both practical and whimsical, man's best friend has been artificially evolved into the most diverse animal on the planet—a staggering achievement, given that most of the 350 to 400 dog breeds in existence have been around for only a couple hundred years. The breeders fast-forwarded the normal pace of evolution by combining traits from disparate dogs and accentuating them by breeding those offspring with the largest hints of the desired attributes. To create a dog well suited for cornering badgers, for instance, it is thought that German hunters in the 18th and 19th centuries brought together some combination of hounds—the basset, a native of France, being the likely suspect—and terriers, producing a new variation on the theme of dog with stubby legs and a rounded body that enabled it to chase its prey into the mouth of a burrow: hence the dachshund, or "badger dog" in German. (A rival, flimsier history of the breed has it dating back, in some form, to ancient Egypt.) Pliable skin served as a defense mechanism, allowing the dog to endure sharp-toothed bites without significant damage. A long and sturdy tail helped hunters to retrieve it from an animal's lair, badger in its mouth.
The breeders gave no thought, of course, to the fact that while coaxing such weird new dogs into existence, they were also tinkering with the genes that determine canine anatomy in the first place. Scientists since have assumed that underneath the morphological diversity of dogs lay an equivalent amount of genetic diversity. A recent explosion in canine genomic research, however, has led to a surprising, and opposite, conclusion: The vast mosaic of dog shapes, colors, and sizes is decided largely by changes in a mere handful of gene regions. The difference between the dachshund's diminutive body and the Rottweiler's massive one hangs on the sequence of a single gene. The disparity between the dachshund's stumpy legs—known officially as disproportionate dwarfism, or chondrodysplasia—and a greyhound's sleek ones is determined by another one.
by Evan Ratliff, National Geographic | Read more:
The Secrets Apple Keeps
Apple employees know something big is afoot when the carpenters appear in their office building. New walls are quickly erected. Doors are added and new security protocols put into place. Windows that once were transparent are now frosted. Other rooms have no windows at all. They are called lockdown rooms: No information goes in or out without a reason.
The hubbub is disconcerting for employees. Quite likely you have no idea what is going on, and it's not like you're going to ask. If it hasn't been disclosed to you, then it's literally none of your business. What's more, your badge, which got you into particular areas before the new construction, no longer works in those places. All you can surmise is that a new, highly secretive project is under way, and you are not in the know. End of story.
Secrecy takes two basic forms at Apple -- external and internal. There is the obvious kind, the secrecy that Apple uses as a way of keeping its products and practices hidden from competitors and the rest of the outside world. This cloaking device is the easier of the two types for the rank and file to understand because many companies try to keep their innovations under wraps. Internal secrecy, as evidenced by those mysterious walls and off-limits areas, is tougher to stomach. Yet the link between secrecy and productivity is one way that Apple challenges long-held management truths and the notion of transparency as a corporate virtue.
All companies have secrets, of course. The difference is that at Apple everything is a secret. The company understands, by the way, that it takes things a little far; there is a hint of a sense of humor about its loose-lips-sink-ships mentality: A T‑shirt for sale in the company store, which is open to the public at 1 Infinite Loop, reads: I VISITED THE APPLE CAMPUS. BUT THAT'S ALL I'M ALLOWED TO SAY.
Apple's airy physical surroundings belie its secretive core. From above, it appears that an oval football stadium could be plopped down inside Infinite Loop. Through the doors of the buildings, in the core of the loop, is a sunny, green courtyard with volleyball courts, grassy lawns, and outdoor seating for lunch. The splendid central cafeteria, Caffe Macs, features separate stations for fresh sushi, salad, and desserts and teems with Apple employees. They pay for their meals, by the way, unlike at Google, but the food is quite good and reasonably priced. The appearance is collegiate, but good luck auditing a class. Unlike Google's famously and ridiculously named "Googleplex," where a visitor can roam the inner courtyards and slip into an open door as employees come and go, Apple's buildings are airtight. Employees can be spotted on the volleyball courts from time to time. More typically, visitors gaping into the courtyard will see a campus in constant motion. Apple employees scurry from building to building for meetings that start and end on time.
For new recruits, keeping secrets begins even before they learn which building they'll be working in. Many employees are hired into so‑called dummy positions, roles that aren't explained in detail until after they join the company. "They wouldn't tell me what it was," remembered a former engineer who had been a graduate student before joining Apple. "I knew it was related to the iPod, but not what the job was." Others do know but won't say, a realization that hits the newbies on their first day of work at new-employee orientation.
"You sit down, and you start with the usual roundtable of who is doing what," recalled Bob Borchers, a product marketing executive in the early days of the iPhone. "And half the folks can't tell you what they're doing, because it's a secret project that they've gotten hired for."
by Adam Lashinsky, Fortune | Read more:
Illustration: Tavis Coburn
In Fight Over Piracy Bills, New Economy Rises Against Old
When the powerful world of old media mobilized to win passage of an online antipiracy bill, it marshaled the reliable giants of K Street — the United States Chamber of Commerce, the Recording Industry Association of America and, of course, the motion picture lobby, with its new chairman, former Senator Christopher J. Dodd, the Connecticut Democrat and an insider’s insider.
Yet on Wednesday this formidable old guard was forced to make way for the new as Web powerhouses backed by Internet activists rallied opposition to the legislation through Internet blackouts and cascading criticism, sending an unmistakable message to lawmakers grappling with new media issues: Don’t mess with the Internet.
As a result, the legislative battle over two once-obscure bills to combat the piracy of American movies, music, books and writing on the World Wide Web may prove to be a turning point for the way business is done in Washington. It represented a moment when the new economy rose up against the old.
“I think it is an important moment in the Capitol,” said Representative Zoe Lofgren, Democrat of California and an important opponent of the legislation. “Too often, legislation is about competing business interests. This is way beyond that. This is individual citizens rising up.”
It appeared by Wednesday evening that Congress would follow Bank of America, Netflix and Verizon as the latest institution to change course in the face of a netizen revolt.
by Jonathan Weisman, NY Times | Read more:
Wednesday, January 18, 2012
Stop SOPA (House) and the Protect IP Act (Senate)
Tuesday, January 17, 2012
Overthrow of the Kingdom of Hawaii
[ed. On January 17, 1893.]
Liliuokalani was the last queen of the Hawaiian Islands. Her rule lasted from 1891 to 1895. She was born Lydia Paki Kamekeha Liliuokalani in 1838. Her parents were councillors to King Kamehameha III. Young Lydia attended the Royal School which was run by American missionaries. In 1862 she married John Owen Dominis but he died shortly after she ascended the throne.
Liliuokalani's brother, King David Kalakaua, ascended the throne in 1874. He gave much governing power to a cabinet composed of Americans. As a result, a new constitution was passed which gave voting rights to foreign residents but denied those rights to most native Hawaiians. Liliuokalani succeeded to the throne upon the death of her brother in 1891. When she attempted to restore some of the monarchy's power that had been lost during her brother's reign, she met with a revolt by the American colonists who controlled most of Hawaii's economy. In 1893, U.S. Marines called in by the U.S. minister occupied the government buildings in Honolulu and deposed the queen. The colonists, led by Sanford Dole, applied for annexation of the islands to the United States. Queen Liliuokalani appealed to U.S. President Grover Cleveland for reinstatement.
Ignoring President Cleveland's orders, Dole established a provisional government in Hawaii. His forces put down a revolt by the royalists and jailed many of the queen's supporters. In 1895 Queen Liliuokalani was placed under house arrest in Iolani Palace for eight months, after which she abdicated in return for the release of her jailed supporters. In 1898 the Hawaiian Islands were formally annexed to the United States. That same year Queen Liliuokalani composed the song "Aloha Oe" as a farewell to her country. After her release she lived as a private citizen at Washington Place (320 South Beretania Street) in Honolulu until her death in 1917.
Overthrow of the Hawaiian Government | Read more:
Biography of Queen Liliuokalani via:
Bargaining for a Child’s Love
Economic malaise and political sloganeering have contributed to the increasingly loud conversation about the coming crisis of old-age care: the depletion of the Social Security trust fund, the ever rising cost of Medicare, the end of defined-benefit pensions, the stagnation of 401(k)’s. News accounts suggest that overstretched and insufficient public services are driving adult children “back” toward caring for dependent parents.
Such accounts often draw on a deeply sentimental view of the past. Once upon a time, the story line goes, family members cared for one another naturally within households, in an organic and unplanned process. But this portrait is too rosy. If we confront what old-age support once looked like — what actually happened when care was almost fully privatized, when the old depended on their families, without the bureaucratic structures and the (under)paid caregivers we take for granted — a different picture emerges.
For the past decade I have been researching cases of family conflict over old-age care in the decades before Social Security. I have found extraordinary testimony about the intimate management of family care: how the old negotiated with the young for what they called retirement, and the exertions of caregiving at a time when support by relatives was the only sustenance available for the old.
In that world, older people could not rely on habit or culture or nature if they wanted their children to support them when they became frail. In an America strongly identified with economic and physical mobility, parents had to offer inducements. Usually, the bait they used was the promise of an inheritance: stay and take care of me and your mother, and someday you will get the house and the farm or the store or the bank account.
But of course what was at stake was never just an economic bargain between rational actors. Older people negotiated with the young to receive love, to be cared for with affection, not just self-interest.
The bargains that were negotiated were often unstable and easily undone. Life expectancies were considerably lower than they are now, but even so, old age could easily stretch for decades. Of course, disease, injury, disability, dementia, insanity, incontinence — not to mention sudden death — were commonplace, too. Wills would be left unwritten, deeds unconveyed, promises unfulfilled, because of the onset of dementia or the meddling of siblings. Or property was conveyed too early, and then the older person would be at the mercy of a child who no longer “cared” — or who could not deal with the work of care.
Consider one story, drawn from a court case in New Jersey that ended in 1904. George H. Slack had been a carpenter and a contractor in Trenton, living in a house with his wife, their daughter, Ella Rees, and her husband and daughter.
by Hendrik Hartog, NY Times | Read more:
Do Sports Build Character or Damage It?
Do sports build character? For those of us who claim to be educators, it's important to know. Physical-education teachers, coaches, boosters, most trustees, and the balance of alumni seem sure that they do. And so they push sports, sports, and more sports. As for professors, they often see sports as a diversion from the real business of education—empty, time-wasting, and claiming far too much of students' attention. It often seems that neither the boosters nor the bashers want to go too far in examining their assumptions about sports.
But in fact, sports are a complex issue, and it's clear that we as a culture don't really know how to think about them. Public confusion about performance-enhancing drugs, the dangers of concussions in football and of fighting in hockey, and the recent molestation scandal at Penn State suggest that it might be good to pull back and consider the question of athletics and education—of sports and character-building—a bit more closely than we generally do.
The first year I played high-school football, the coaches were united in their belief that drinking water on the practice field was dangerous. It made you cramp up, they told us. It made you sick to your stomach, they said. So at practice, which went on for two and a half hours, twice a day, during a roaring New England summer, we got no water. Players cramped up anyway; players got sick to their stomachs regardless. Players fell on their knees and began making soft, plaintive noises; they were helped to their feet, escorted to the locker room, and seen no more.
On the first day of double practice sessions, there were about 120 players—tough Irish and Italian kids and a few blacks—and by the end of the 12-day ordeal, there were 60 left. Some of us began without proper equipment. I started without cleats. But that was not a problem: Soon someone who wore your shoe size would quit, and then you could have theirs.
The coaches didn't cut anyone from the squad that year. Kids cut themselves. Guys with what appeared to be spectacular athletic talent would, after four days of double-session drills, walk hangdog into the coaches' locker room and hand over their pads. When I asked one of them why he quit, he said simply, "I couldn't take it."
Could I? There was no reason going in to think that I would be able to. I was buttery soft around the waist, nearsighted, not especially fast, and not agile at all. It turned out that underneath the soft exterior, I had some muscle, and that my lung capacity was well developed, probably from vicious bouts of asthma I'd had as a boy. But compared with those of my fellow ballplayers, my physical gifts were meager. What I had was a will that was anything but weak. It was a surprise to me, and to everyone who knew me, how ferociously I wanted to stay with the game.
Did I love the game? I surely liked it. I liked how, when I was deep in fatigue, I became a tougher, more daring person, even a reckless one. One night, scrimmaging, I went head-on with the star running back, a guy who outweighed me by 20 pounds and was far faster and stronger. I did what the coaches said: I squared up, got low (in football, the answer to every difficulty is to get low, or get lower), and planted him. I did that?, I asked myself. I liked being the guy who could do that—sometimes, though alas not often enough. The intensity of the game was inebriating. It conquered my grinding self-consciousness, brought me out of myself.
I liked the transforming aspect of the game: I came to the field one thing—a diffident guy with a slack body—and worked like a dog and so became something else—a guy with some physical prowess and more faith in himself. Mostly, I liked the whole process because it was so damned hard. I didn't think I could make it, and no one I knew did either. My parents were ready to console me if I came home bruised and dead weary and said that I was quitting. In time, one of the coaches confessed to me that he was sure I'd be gone in a few days. I had not succeeded in anything for a long time: I was a crappy student; socially I was close to a wash; my part-time job was scrubbing pans in a hospital kitchen; the first girl I liked in high school didn't like me; the second and the third followed her lead. But football was something I could do, though I was never going to be anything like a star. It was hard, it took some strength of will, and—clumsily, passionately—I could do it.
Over time, I came to understand that the objective of the game, on the deepest level, wasn't to score spectacular touchdowns or make bone-smashing tackles or block kicks. The game was much more about practice than about the Saturday-afternoon contests. And practice was about trying to do something over and over again, failing and failing, and then finally succeeding part way. Practice was about showing up and doing the same drills day after day and getting stronger and faster by tiny, tiny increments, and then discovering that by the end of the season you were effectively another person.
But mostly football was about those first days of double sessions when everyone who stuck with it did something he imagined was impossible, and so learned to recalibrate his instruments. In the future, what immediately looked impossible to us—what said Back Off, Not for You—had to be looked at again and maybe attempted anyway.
by Mark Edmundson, Chronicle of Higher Education | Read more:
Photo: Chris and Adrienne Scott
Monday, January 16, 2012
The Rise of the New Groupthink
Solitude is out of fashion. Our companies, our schools and our culture are in thrall to an idea I call the New Groupthink, which holds that creativity and achievement come from an oddly gregarious place. Most of us now work in teams, in offices without walls, for managers who prize people skills above all. Lone geniuses are out. Collaboration is in.
But there’s a problem with this view. Research strongly suggests that people are more creative when they enjoy privacy and freedom from interruption. And the most spectacularly creative people in many fields are often introverted, according to studies by the psychologists Mihaly Csikszentmihalyi and Gregory Feist. They’re extroverted enough to exchange and advance ideas, but see themselves as independent and individualistic. They’re not joiners by nature.
One explanation for these findings is that introverts are comfortable working alone — and solitude is a catalyst to innovation. As the influential psychologist Hans Eysenck observed, introversion fosters creativity by “concentrating the mind on the tasks in hand, and preventing the dissipation of energy on social and sexual matters unrelated to work.” In other words, a person sitting quietly under a tree in the backyard, while everyone else is clinking glasses on the patio, is more likely to have an apple land on his head. (Newton was one of the world’s great introverts: William Wordsworth described him as “A mind for ever/ Voyaging through strange seas of Thought, alone.”)
Solitude has long been associated with creativity and transcendence. “Without great solitude, no serious work is possible,” Picasso said. A central narrative of many religions is the seeker — Moses, Jesus, Buddha — who goes off by himself and brings profound insights back to the community.
Culturally, we’re often so dazzled by charisma that we overlook the quiet part of the creative process. Consider Apple. In the wake of Steve Jobs’s death, we’ve seen a profusion of myths about the company’s success. Most focus on Mr. Jobs’s supernatural magnetism and tend to ignore the other crucial figure in Apple’s creation: a kindly, introverted engineering wizard, Steve Wozniak, who toiled alone on a beloved invention, the personal computer.
Rewind to March 1975: Mr. Wozniak believes the world would be a better place if everyone had a user-friendly computer. This seems a distant dream — most computers are still the size of minivans, and many times as pricey. But Mr. Wozniak meets a simpatico band of engineers who call themselves the Homebrew Computer Club. The Homebrewers are excited about a primitive new machine called the Altair 8800. Mr. Wozniak is inspired, and immediately begins work on his own magical version of a computer. Three months later, he unveils his amazing creation for his friend, Steve Jobs. Mr. Wozniak wants to give his invention away free, but Mr. Jobs persuades him to co-found Apple Computer.
The story of Apple’s origin speaks to the power of collaboration. Mr. Wozniak wouldn’t have been catalyzed by the Altair but for the kindred spirits of Homebrew. And he’d never have started Apple without Mr. Jobs.
But it’s also a story of solo spirit. If you look at how Mr. Wozniak got the work done — the sheer hard work of creating something from nothing — he did it alone. Late at night, all by himself.
Intentionally so. In his memoir, Mr. Wozniak offers this guidance to aspiring inventors:
“Most inventors and engineers I’ve met are like me … they live in their heads. They’re almost like artists. In fact, the very best of them are artists. And artists work best alone …. I’m going to give you some advice that might be hard to take. That advice is: Work alone… Not on a committee. Not on a team.”
by Susan Cain, NY Times | Read more:
Illustration: Andy Rementer
Why Black is Addictive
Towards the end of the last century, a friend of mine took a taxi to London Fashion Week. The driver gawped in puzzlement at the moving sea of people dressed head-to-toe in black, and asked: “What’s that, then? Some religious cult?”
He had a point. There is something bordering on the cultish in fashion’s devotion to the colour black—it’s the equivalent of white for Moonies or orange for Hare Krishnas. Since that taxi journey in the 1990s the wardrobes of the stylish have brightened up a bit, but although trends such as colour blocking or floral prints may float by on the surface current, underneath there is a deeper, darker tide that pulls us back towards black. Despite pronouncements at intervals by the fashion industry that red or pink or blue is the new black, the old black is still very much with us.
Visiting eBay, the auction website, confirms this. A search in “Clothes, Shoes and Accessories” for the word “black” yields more than 3m items—that’s twice as many as “blue”, and five or six times as many as “brown” or “grey”. This ratio remains more or less the same in winter and summer, and when you narrow the search to “women’s clothing”. (Black also predominates in men’s clothing, though there’s slightly more blue.) A pedant might argue that these are the clothes that people are trying to get rid of—certainly if they were all thrown away we’d be left with a very large, black mountain. But the website of the upmarket fashion retailer Net-a-Porter tells the same story, with black significantly more dominant in its wares, be it January or June.
What is it about, this infatuation with black? It’s a question I am often asked, since I wear black most of the time, and therefore one upon which I have spent much time reflecting. My friends and colleagues might say I wear little else, though it doesn’t feel like that to me—I wear colours sometimes, particularly in summer, but black is what I feel most comfortable in. Putting on black in the morning feels as natural as breathing. If I enter a clothes shop, I am drawn towards the rails of black. I will happily wear black to weddings as well as funerals. I own black sandals and black sundresses. I even wore black when I was nine months pregnant in a July heatwave. This habit of mine is an adult-onset condition, which developed when I spent a dangerously long time working at British Vogue magazine; I didn’t work in the fashion department, but I absorbed black osmotically. I know I’m far from alone in my preference for wearing black, so—for all those others who are asked why they wear so much black, as well as for myself—I’ll try to answer that question here for once and for all.
To do that means asking some other questions about black’s significance in our society generally. How is it that black can betoken both oppression (the Nazis and Fascists) and also the rebellion of youth (punks and goths)? How can it be the distinctive feature of religious garments (nuns, priests, Hassidic Jews), and also of rubber and bondage fetishists? Why is it the uniform of dons and anorexics alike, of waiters and witches, of judges and suicide-bombers? No colour performs so many duties, in so many fields of clothing—smart, casual, uniform, anti-uniform—as black does. It is uniquely versatile and flexible. How, exactly, does my friend and ally pull that off?
by Rebecca Willis, Intelligent Life | Read more:
Fashion Photography by Sean Gleason
Universal Flu Vaccine Could Be Available by 2013
Annual flu shots might soon become a thing of the past, and threats such as avian and swine flu might disappear with them as a vaccine touted as the "holy grail" of flu treatment could be ready for human trials next year.
That's earlier than the National Institutes of Health estimated in 2010, when they said a universal vaccine could be five years off. By targeting the parts of the virus that rarely mutate, researchers believe they can develop a vaccine similar to the mumps or measles shot—people will be vaccinated as children and then receive boosters later.
That differs from the current '60s-era technology, according to Joseph Kim, head of Inovio Pharmaceuticals, which is working on the universal vaccine. Each year, the seasonal flu vaccine targets three or four strains that researchers believe will be the most common that year. Previous seasons' vaccines have no effect on future strains of the virus, because it mutates quickly. The seasonal vaccine also offers no protection against outbreaks, such as 2009's H1N1 swine flu. A universal vaccine would offer protection against all forms of the virus.
"It's like putting up a tent over your immune system that protects against rapidly mutating viruses," Kim says. At least two other companies are working on a similar vaccine. In late 2010, Inovio earned a $3.1 million grant from the National Institutes of Health to work on the vaccine.
"It's a completely different paradigm than how [the vaccines] are made seasonably every year," Kim says.
by Jason Koebler, US News | Read more: