Friday, September 16, 2011

Does Science Back up Alaska's Policy of Killing Grizzly Bears?

by Rick Sinnott

Four years ago the Alaska Legislature offered Gov. Sarah Palin and the Alaska Department of Fish and Game a special deal: $400,000 to “educate” voters on predator control. The money -- spent mostly on a video, glossy brochures and public presentations -- was meant to persuade and reassure Alaskans that predator control is essential and effective.

Firmly convinced he’s doing the right thing, the new director of the Division of Wildlife Conservation at Fish and Game, Corey Rossi, is taking predator control to new levels. For the first time since statehood, Alaska has targeted grizzly bears for large-scale population reductions, not by hunters but by agents of the state.

Through the publicity campaign, Rossi, Gov. Sean Parnell and the Alaska Legislature would like you to believe that scientific experts on predator and prey populations -- particularly the professional wildlife biologists and researchers with the Alaska Department of Fish and Game -- unanimously support killing bears to increase numbers of moose and caribou.

But some of those experts have questioned the efficacy and advisability of reducing numbers of grizzly bears in a peer-reviewed article in the latest edition of the Journal of Wildlife Management.
------
Wolves have long been hunted, trapped, poisoned, and shot from aircraft to increase prey populations. Because public hunting and trapping in Alaska are unable to reduce wolf populations and keep them low, in the past decade the state has increasingly relied on predator control agents. Now Alaska’s bears are being tossed into the predator-control arena.

This is not a new phenomenon either. What is new is that Alaska is breaking a 50-year hiatus on state-sponsored bear control. Before statehood, many Alaskans regarded grizzly bears as dangerous vermin. The celebrated brown bears of Kodiak Island were nominated for eradication because they ate salmon and cattle. This was during the Dark Ages of fish and game management, when bounties were paid not just for wolves but also for harbor seals, bald eagles, and Dolly Varden char.

Following statehood, Alaska’s new wildlife managers attempted to raise the status of wolves, bears, eagles and other predators from varmints to valued species. However, moose and caribou hunters have always outnumbered wolf and bear hunters, and hunter tolerance of wolves and bears dipped after moose and caribou populations declined beginning in the late 1960s and early 1970s. Wildlife managers attribute those declines to a combination of severe winters, predation, and high hunter harvests. You can’t do much about the weather -- and moose and caribou were still in high demand as meat and trophies -- so many hunters and some wildlife managers returned to the earlier paradigm, demanding fewer wild predators.

Predator control was facilitated and accelerated in 1994 when the Alaska Legislature enacted the intensive management law. In my opinion, this law ignores a much broader public interest in wildlife resources. The state’s constitution mandates making all wildlife, not only moose and caribou, “available for maximum use consistent with the public interest” and conserving wildlife according to “the sustained yield principle, subject to preferences among beneficial uses.” This suggests that the state’s constitutional convention recognized the value of all Alaska’s wildlife and anticipated it would be managed holistically for all Alaskans. I've yet to find anything in the constitution about Alaska becoming the world’s largest game ranch. Nevertheless, the intensive management law required the Alaska Board of Game to elevate human consumption of wild animals over other beneficial uses, such as conserving natural diversity, tourism, or the satisfaction of knowing some corner of the world is not completely dominated by humans.

Before the legislature’s intervention, managing Alaska’s wildlife was like fixing grandpa’s gold watch. It entailed routine fine-tuning and replacement of springs, sprockets and cogs, the regulatory moving parts required to precisely apportion the resources of a complex world. The intensive management law removed essentially all the tools from the toolbox except one. The legislature expects Fish and Game to fix the watch with a hammer.
------
The authors are well aware that both black and grizzly bears can be effective predators on moose calves, but it remains unclear whether reducing grizzly bear populations will increase calf survival. A calf eaten by a grizzly may have been just as likely to be killed by another predator, or disease, or accident, or an inexperienced mother. The point is that controversial and potentially destructive programs to control predators should be fiscally and scientifically justifiable. Fish and Game has spent millions of dollars -- in the field, in meetings, in public relations, and in the courtroom -- to implement and defend predator control. Are we harvesting millions of dollars' worth of additional moose and caribou?

Predator control is seldom warranted ecologically; more often it is politically driven. The authors were unable to find any place in Alaska in the past three decades where regulations were tightened when moose or caribou populations rebounded. Too many moose or caribou can damage critical winter ranges, throwing their populations into a tailspin, but some people fail to understand that you can have too much of a good thing. Because some hunters keep demanding more moose and more caribou, predator control doesn’t appear to have a political exit strategy. “Success” is a moving target because some hunters are never satisfied with the current availability of moose or caribou.

Read more:

It's Not Plagiarism. It's 'Repurposing.'

by Kenneth Goldsmith

In 1969 the conceptual artist Douglas Huebler wrote, "The world is full of objects, more or less interesting; I do not wish to add any more." I've come to embrace Huebler's idea, though it might be retooled as: "The world is full of texts, more or less interesting; I do not wish to add any more."

It seems an appropriate response to a new condition in writing: With an unprecedented amount of available text, our problem is not needing to write more of it; instead, we must learn to negotiate the vast quantity that exists. How I make my way through this thicket of information—how I manage it, parse it, organize and distribute it—is what distinguishes my writing from yours.

The prominent literary critic Marjorie Perloff has recently begun using the term "unoriginal genius" to describe this tendency emerging in literature. Her idea is that, because of changes brought on by technology and the Internet, our notion of the genius—a romantic, isolated figure—is outdated. An updated notion of genius would have to center around one's mastery of information and its dissemination. Perloff has coined another term, "moving information," to signify both the act of pushing language around as well as the act of being emotionally moved by that process. She posits that today's writer resembles more a programmer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine.
------
In 2007 Jonathan Lethem published a pro-plagiarism, plagiarized essay in Harper's titled, "The Ecstasy of Influence: A Plagiarism." It's a lengthy defense and history of how ideas in literature have been shared, riffed, culled, reused, recycled, swiped, stolen, quoted, lifted, duplicated, gifted, appropriated, mimicked, and pirated for as long as literature has existed. Lethem reminds us of how gift economies, open-source cultures, and public commons have been vital for the creation of new works, with themes from older works forming the basis for new ones. Echoing the cries of free-culture advocates such as Lawrence Lessig and Cory Doctorow, he eloquently rails against copyright law as a threat to the lifeblood of creativity. From Martin Luther King Jr.'s sermons to Muddy Waters's blues tunes, he showcases the rich fruits of shared culture. He even cites examples of what he had assumed were his own "original" thoughts, only later to realize—usually by Googling—that he had unconsciously absorbed someone else's ideas that he then claimed as his own.

It's a great essay. Too bad he didn't "write" it. The punchline? Nearly every word and idea was borrowed from somewhere else—either appropriated in its entirety or rewritten by Lethem. His essay is an example of "patchwriting," a way of weaving together various shards of other people's words into a tonally cohesive whole. It's a trick that students use all the time, rephrasing, say, a Wikipedia entry into their own words. And if they're caught, it's trouble: In academia, patchwriting is considered an offense equal to that of plagiarism. If Lethem had submitted this as a senior thesis or dissertation chapter, he'd be shown the door. Yet few would argue that he didn't construct a brilliant work of art—as well as writing a pointed essay—entirely in the words of others. It's the way in which he conceptualized and executed his writing machine—surgically choosing what to borrow, arranging those words in a skillful way—that wins us over. Lethem's piece is a self-reflexive, demonstrative work of unoriginal genius.

Lethem's provocation heralds a trend among younger writers who take his exercise one step further, boldly appropriating the work of others without citation and disposing of the artful, seamless integration of Lethem's patchwriting. For them, the act of writing is literally moving language from one place to another, proclaiming that context is the new content. While pastiche and collage have long been part and parcel of writing, the rise of the Internet has pushed plagiaristic intensity to extreme levels.

Read more:

Locusts, Cilantro, Elvis Presley


As a young man studying in Amsterdam, Vincent van Gogh on August 18, 1877, wrote to his brother Theo, “I breakfasted on a piece of dry bread and a glass of beer—that is what Dickens advises for those who are on the point of committing suicide, as being a good way to keep them, at least for some time, from their purpose.”

Cornbread, hot biscuits, wheat bread, and fried chicken were among the foods that Mark Twain said couldn’t be cooked north of the Mason-Dixon line.

“As if I swallowed a baby,” said William Makepeace Thackeray about eating his first oyster.

Puréed applesauce—the first food eaten in outer space, by John Glenn in 1962. Shrimp cocktail, macaroni and cheese, candy-coated peanuts, Metamucil wafers—among what he ate thirty-six years later aboard the space shuttle Discovery.

Tomato, potato, corn, beans, zucchini, squash, avocado, bell pepper, chili, and pineapple are among the foods that Christopher Columbus brought back to the Old World. Onion, garlic, wheat, barley, olives, and lettuce are among the foods he introduced to the New.

About cilantro in a dish, Julia Child said, “I would pick it out if I saw it and throw it on the floor.”

Thirty to sixty million—the estimate of buffalo in the United States in the early 1800s. 1,200—the estimate some ninety years later.

“If you’re just going to sit there and stare at me, I’m going to bed,” Elvis Presley said, breaking an awkward silence when the Beatles visited him on August 27, 1965. As midnight snacks for his guests, he requested broiled chicken-livers wrapped in bacon and sweet-and-sour meatballs.

Read more:

Post-Grunge, Seattle Rocks On

Nirvana at Beehive Records in Seattle on Sept. 16, 1991, for the release of “Nevermind.”

by William Yardley and Sean Patrick Farrell

Kurt Cobain spoke of his band’s breakthrough single at a concert here that turned out to be one of Nirvana’s final performances in the United States.

“This song made Seattle the most livable city in America,” Mr. Cobain told the audience.

Then he ripped into the opening chords of “Smells Like Teen Spirit,” an anthem of indifference polished just enough to give it popular appeal.

Now, 20 years after Nirvana soared from obscurity to superstardom and the Seattle scene was anointed as rock relevant, a new exhibition, a film and a tribute concert planned for the anniversary make it clear how different things really are here now.

Seattle has become even more livable since Mr. Cobain’s dry declaration, way back in January 1994. The city still rocks, and its rockers still ache, but more gently now. Flannel? Sure. Screaming? There is less of it from this new stage.

Once mostly boom, bust and Boeing, Seattle rebuilt itself on high technology, global commerce and well-educated newcomers making waterproof peace with the weather. The Pacific Northwest, long a mysterious corner of the country, stepped out of isolation and into cliché: coffee, computers and Kurt, the voice and face of grunge.

“The times for Seattle to be sort of a misty forgotten land are over,” said Jacob McMurray, the curator of “Nirvana: Taking Punk to the Masses,” the exhibition at the Experience Music Project here.

The anniversary events are canonizing — and continuing to commercialize — the moment when, some people say, alternative music fully broke through to the mainstream. Yet many of the dark clubs where Nirvana and bands like Soundgarden, Alice in Chains, Pearl Jam, Mudhoney and the Melvins built followings are gone.

Read more:

Thursday, September 15, 2011


Nate Rivera (Japan) - Portfolio
via:

Cell Phone Etiquette for Morons

by Beth Mann

So you have a cell phone? Okay, well good for you. I do too! Fancy, isn't it? But remember, there are some rules to follow when using that spiffy telecommunication device of yours in public:
1. You're not special because you have a cell phone. Small children and homeless people have cell phones. There are probably pets out there with cellular devices. Remember that when you're walking down the street barking orders like you're Donald Trump and thinking people are impressed. We're not.

2. Using a cell phone in a theater is the height of rudeness. Don't even dare convince yourself otherwise just because other people are doing it. People also pick their nose and urinate in their pants in public. Wanna follow that lead too?
That glow from your cell phone is extremely distracting to those around you. God forbid you simply try to be present and enjoy the show instead of recording crappy video that no one will ever watch.

3. Using your cell phone excessively in the following places is also rude, rude, rude: 
  • Public transportation
  • Restaurants
  • Libraries (Come on...are you serious?)
  • Church (See above.)
  • In a grocery store line (You're too close to me. I can't run from your inanity.)
  • The beach (Is anything sacred? Can you just be in nature for ten damn minutes without a phone glued to your face?)
  • A date
  • A museum
4. Annoying cell phone rings showcase your shallow personality. Just go with something simple. No one needs to know about your love of Rihanna's "Umbrella," you know what I mean? Keep that a secret. And don't let it ring incessantly if you're not prepared to answer it. Turn the damn thing off and spare us Toby Keith or whatever weird shit you're into.
Read more:

Sally Oldfield


Top Chef, Old Master


by Michelle Legro

They called him “fat boy,” this seventeen-year-old apprentice in the studio of Florentine painter Verrocchio who would receive care packages from his stepfather, a pastry chef. The bastard son of a Florentine notary and a lady of Vinci, the boy was given a taste for marzipans and sugars from a very young age by his doting stepfather. The apprentice would receive the packages and devour them so quickly—crapulando, it was called, or guzzling—that the master felt the need to punish him, instructing the boy to paint an angel in the corner of a Baptism of Christ, a mediocre painting which hangs in the Uffizi because it includes the first work of Leonardo da Vinci.

After three years as an apprentice, twenty-year-old Leonardo took a job as a cook at the Tavern of the Three Snails near the Ponte Vecchio, working during the day on the few commissions his master sent his way and slinging polenta in the evenings. Polenta was the restaurant’s signature dish, a tasteless hash of meats and corn porridge. The other cooks at the Three Snails cared little about the quality of the food they served, and when, in the spring of 1473, a poisoning sickened and killed the majority of the tavern’s cooking staff, Leonardo was put in charge of the kitchen. He changed the menu completely, serving up delicate portions of carved polenta arranged beautifully on the plate. However, like most tavern clientele, the patrons preferred their meals in huge, messy portions. Upset with the change in management, they ran Leonardo out of a job.

Much like a modern struggling artist, Leonardo da Vinci was in his daily life a line cook, tavern keeper, and chef-for-hire. “My painting and my sculpture will stand comparison with that of any other artist,” he wrote in a humble introduction to Ludovico Sforza, the future Duke of Milan, by way of a job application. “I am supreme at telling riddles and tying knots. And I make cakes that are without compare.”

Sforza took him on not as a cook, painter, or sculptor, but as a lute player and after-dinner entertainer. Leonardo attempted to show his lord his new inventions for fortifications, catapults, and ladders, but Sforza paid little attention until the lute player fashioned his inventions out of marzipan and jelly. Sforza charged the young man with refurbishing his kitchen, a task which would consume the life of Leonardo and the entire Sforza court.

Five hundred years before Modernist Cuisine’s exhaustive look at molecular gastronomy, The Kitchen Notebooks of Leonardo da Vinci envisioned a culinary world as studio and laboratory, where food was to be prepared efficiently, beautifully, and ingeniously. Unfortunately, Italian food of the late fifteenth century had less to do with the luxurious feasts of Ancient Rome and more to do with the rustic tastes of the Goths, whose dishes included meats and birds for those who could afford them, and an endless parade of porridge and gruel for those who could not. Leonardo was horrified by much of the food that was served to him, both at court and at home, and he included in his notebooks a running list of dishes that he hated but that his own servant insisted on serving him: jellied goat, hemp bread, white mosquito pudding, inedible turnips, and eel balls—of which he notes, “this dish if eaten often can cause madness.”

The notebooks, which include a history of Leonardo’s tenure as chef at the Sforza court, are primarily a collection of recipes (cabbage jam, snail soup), wayward thoughts (“Would porridge balls in gold-leaf attract My Lord’s attention?”), dining etiquette (“On the Unseemly Behaviors at My Lord’s Table”), household tips (“On Ridding your Kitchens of Pestilential Flies”), and household inventions (“The Machines I Have Yet to Design for my Kitchen”).

Read more:

Andrew B. Myers
via:

Kramer O'Neill
via:

What if the Secret to Success Is Failure?

[ed. An interesting examination of school curricula built around character development as well as academic achievement.]

by Paul Tough

Cohen and Fierst told me that they also see many Riverdale parents who, while pushing their children to excel, also inadvertently shield them from exactly the kind of experience that can lead to character growth. As Fierst put it: “Our kids don’t put up with a lot of suffering. They don’t have a threshold for it. They’re protected against it quite a bit. And when they do get uncomfortable, we hear from their parents. We try to talk to parents about having to sort of make it O.K. for there to be challenge, because that’s where learning happens.”

Cohen said that in the middle school, “if a kid is a C student, and their parents think that they’re all-A’s, we do get a lot of pushback: ‘What are you talking about? This is a great paper!’ We have parents calling in and saying, for their kids, ‘Can’t you just give them two more days on this paper?’ Overindulging kids, with the intention of giving them everything and being loving, but at the expense of their character — that’s huge in our population. I think that’s one of the biggest problems we have at Riverdale.”

This is a problem, of course, for all parents, not just affluent ones. It is a central paradox of contemporary parenting, in fact: we have an acute, almost biological impulse to provide for our children, to give them everything they want and need, to protect them from dangers and discomforts both large and small. And yet we all know — on some level, at least — that what kids need more than anything is a little hardship: some challenge, some deprivation that they can overcome, even if just to prove to themselves that they can. As a parent, you struggle with these thorny questions every day, and if you make the right call even half the time, you’re lucky. But it’s one thing to acknowledge this dilemma in the privacy of your own home; it’s quite another to have it addressed in public, at a school where you send your kids at great expense.

Read more:

My Superpower Is Being Alone Forever


by Joe Berkowitz and Joanna Neborsky

It’s pretty hard to reverse engineer a meet-cute. These things either happen or they don’t. If you were really serious about it, you could probably arrange for, say, an errant shopping cart to go charging off in someone's direction and then you could rush up behind it saying, "Sorry, sorry!" and that’s how you'd meet, but then you’d have to live with yourself for the next 50 years or so, knowing that, basically, you're Elmer Fudd. Sometimes when a radiant single lady comes floating along the sidewalk like a dream, I think about stopping her. But I never would. It just seems as intrusive as a catcall—or an errant shopping cart. I might as well be passing out handbills for a shady-sounding sample sale. So instead I say nothing and then she’s gone. We won’t be accidental seatmates at a dinner party later. It’s a missed non-connection, a moment less significant than if we’d been on line together at Whole Foods buying the same artisanal sherbet. How-we-met stories are overrated, anyway.

Read more:

Little Kids Rock

by David Bornstein

With school underway, I asked my eight-year-old son this week if he had any interest in learning guitar. He said he’d prefer the piano. I was pleased, but hesitant. I had my own stint with after-school piano lessons at age eight — plinking out notes from classical pieces that were foreign to me. My progress was agonizingly slow and I gave up within months.

Music education hasn’t changed fundamentally since the 1970s. Students are still taught to read notation so they can recite compositions that they would never listen to on their MP3 players or play with friends. The four “streams” in music education — orchestra, chorus, marching band and jazz band — have remained constant for four decades, while a third generation is growing up listening to rock and pop music. And my experience as an eight-year-old is all too common. Many children quit before making progress with an instrument, then regret it as adults. Others play violin or trumpet for the school orchestra or band, then drop the instrument after graduating from high school.

This is a loss for all. Playing music enriches life. That’s why so many adults wish that they could play an instrument, particularly guitar or piano, which are ideally suited for playing with others. The question is: Why do schools teach music in a way that turns off so many young people rather than igniting their imagination? Adolescents and teenagers are crazy about popular music. At a time when educators are desperate to engage students and improve school cultures, can we do a better job of harnessing the power of music to get kids excited about school?

The experience of an organization called Little Kids Rock suggests the answer is a resounding yes — provided we change the way music is taught. Little Kids Rock has helped revitalize music programs in over a thousand public schools and served 150,000 children, most of them from low-income families. The organization has distributed 30,000 free instruments, primarily guitars, and trained 1,500 teachers to run music classes in which students quickly experience the joys of playing their favorite songs, performing in bands, and composing their own music. Along the way, the organization is working to institute a fifth stream in American music education: popular music — or what it calls “contemporary band.”

“Students truly experience just about immediate success in Little Kids Rock,” explained Melanie Faulkner, supervisor of elementary music for Hillsborough County Public Schools, in Tampa, Fla., where 14,000 students in 83 schools are served by the program. “The children feel like they’re right there making real music. And the success spills over into other areas of school.”

The key to Little Kids Rock is that it teaches children to play music the way many musicians learn to play it — not by notation, but by listening, imitation and meaningful experimentation. “The knowledge you need to get started playing rock music is very limited,” explains Dave Wish, the founder of Little Kids Rock. “In high school, my friend Paul taught me a couple of chords and, boom, my life was changed forever.”

“Making music is as much a physical act as it is a cognitive act,” he adds. “We don’t begin with theory when we want to teach a child to play tee-ball. We just bring the kid up to the tee, give them a bat, and let them swing.”

On the first day of class, Little Kids Rock teachers place guitars in the hands of their students and get them practicing chords that will enable them to play thousands of songs. (Many simple lessons are freely available online.) The kids decide what songs they want to learn and the class is off and running. Their progress is remarkable. Within a year, eight- and nine-year-olds are playing electric guitar, bass guitar, drums and keyboards, and giving concerts, even performing their own songs. And the effect is predictable: the children can’t get enough of it.

Read more:

Wednesday, September 14, 2011


Composition around White, 1959, by Charles Sheeler
via:
Susan Rios - Peaceful Hours
via:

The Opposite of

The opposite of love is not hate, it's indifference.
The opposite of art is not ugliness, it's indifference.
The opposite of faith is not heresy, it's indifference.
And the opposite of life is not death, it's indifference.
 

- Elie Wiesel

The Shame of College Sports

by Taylor Branch

“I’m not hiding,” Sonny Vaccaro told a closed hearing at the Willard Hotel in Washington, D.C., in 2001. “We want to put our materials on the bodies of your athletes, and the best way to do that is buy your school. Or buy your coach.”

Vaccaro’s audience, the members of the Knight Commission on Intercollegiate Athletics, bristled. These were eminent reformers—among them the president of the National Collegiate Athletic Association, two former heads of the U.S. Olympic Committee, and several university presidents and chancellors. The Knight Foundation, a nonprofit that takes an interest in college athletics as part of its concern with civic life, had tasked them with saving college sports from runaway commercialism as embodied by the likes of Vaccaro, who, since signing his pioneering shoe contract with Michael Jordan in 1984, had built sponsorship empires successively at Nike, Adidas, and Reebok. Not all the members could hide their scorn for the “sneaker pimp” of schoolyard hustle, who boasted of writing checks for millions to everybody in higher education.

“Why,” asked Bryce Jordan, the president emeritus of Penn State, “should a university be an advertising medium for your industry?”

Vaccaro did not blink. “They shouldn’t, sir,” he replied. “You sold your souls, and you’re going to continue selling them. You can be very moral and righteous in asking me that question, sir,” Vaccaro added with irrepressible good cheer, “but there’s not one of you in this room that’s going to turn down any of our money. You’re going to take it. I can only offer it.”

William Friday, a former president of North Carolina’s university system, still winces at the memory. “Boy, the silence that fell in that room,” he recalled recently. “I never will forget it.” Friday, who founded and co-chaired two of the three Knight Foundation sports initiatives over the past 20 years, called Vaccaro “the worst of all” the witnesses ever to come before the panel.

But what Vaccaro said in 2001 was true then, and it’s true now: corporations offer money so they can profit from the glory of college athletes, and the universities grab it. In 2010, despite the faltering economy, a single college athletic league, the football-crazed Southeastern Conference (SEC), became the first to crack the billion-dollar barrier in athletic receipts. The Big Ten pursued closely at $905 million. That money comes from a combination of ticket sales, concession sales, merchandise, licensing fees, and other sources—but the great bulk of it comes from television contracts.

Educators are in thrall to their athletic departments because of these television riches and because they respect the political furies that can burst from a locker room. “There’s fear,” Friday told me when I visited him on the University of North Carolina campus in Chapel Hill last fall. As we spoke, two giant construction cranes towered nearby over the university’s Kenan Stadium, working on the latest $77 million renovation. (The University of Michigan spent almost four times that much to expand its Big House.) Friday insisted that for the networks, paying huge sums to universities was a bargain. “We do every little thing for them,” he said. “We furnish the theater, the actors, the lights, the music, and the audience for a drama measured neatly in time slots. They bring the camera and turn it on.” Friday, a weathered idealist at 91, laments the control universities have ceded in pursuit of this money. If television wants to broadcast football from here on a Thursday night, he said, “we shut down the university at 3 o’clock to accommodate the crowds.” He longed for a campus identity more centered in an academic mission.

The United States is the only country in the world that hosts big-time sports at institutions of higher learning. This should not, in and of itself, be controversial. College athletics are rooted in the classical ideal of Mens sana in corpore sano—a sound mind in a sound body—and who would argue with that? College sports are deeply inscribed in the culture of our nation. Half a million young men and women play competitive intercollegiate sports each year. Millions of spectators flock into football stadiums each Saturday in the fall, and tens of millions more watch on television. The March Madness basketball tournament each spring has become a major national event, with upwards of 80 million watching it on television and talking about the games around the office water cooler. ESPN has spawned ESPNU, a channel dedicated to college sports, and Fox Sports and other cable outlets are developing channels exclusively to cover sports from specific regions or divisions.

With so many people paying for tickets and watching on television, college sports has become Very Big Business. According to various reports, the football teams at Texas, Florida, Georgia, Michigan, and Penn State—to name just a few big-revenue football schools—each earn between $40 million and $80 million in profits a year, even after paying coaches multimillion-dollar salaries. When you combine so much money with such high, almost tribal, stakes—football boosters are famously rabid in their zeal to have their alma mater win—corruption is likely to follow.

Read more:

I Survived Target's Missoni Disaster

by DG Strong

Like approximately 356 of my Facebook friends, I spent Tuesday morning driving from Target to Target looking for Missoni. Missoni! Missoni! Are you sick of hearing the word yet? In the last day, various media outlets have been going mad about Target's Missoni disaster. When the megastore chain announced it would be selling the beloved brand's clothes, fans went crazy -- a little too crazy. Buyers crashed the Target website, and there were reports of stampedes and assorted other frenzies. And I should know, because I witnessed Missoni Madness firsthand.

I'm not even a particular fan of the Missoni aesthetic, but Target has been running the groovy spy-woman commercial for it so incessantly that I'd become practically hypnotized into thinking I really needed some new bath towels and a sweater (autumn is almost here!). I'd be helping the economy, after all -- God Bless America, blah blah blah. Also: Target sells those movie-theater-boxes of candy and I was completely out of Lemonheads.

I spent Monday night looking at the Missoni-for-Target look book and had settled on the items I wanted. No, needed. And I had what I considered an inspired battle plan for Nashville's various Target locations sketched out on a Post-it note: hit the more "Country" Target first for the menswear (figuring farmers in bargello knit cardigans was probably an unusual combination) and then, if necessary, hit the "Soccer Mom" Target for the bath towels (figuring moms would be busy in school drop-off lanes offloading the Cassidys and Calebs of America). I wasn't even going to bother with the "Fancy Urban" Target (the one with the Starbucks inside); every skinny jeans'ed hipster girl within a 15-mile radius of the place would be in line there for a melamine bowl and a tote bag.

So I set my alarm for 7 a.m. and by 8 o'clock on the nose, I was the sole car in the parking lot of Country Target. Could it be that my plan was unfolding perfectly? Would I just waltz in, get exactly two black-and-white Famiglia Wavy bath towels, two black-and-white Famiglia Wavy hand towels, and one black-and-white men's cardigan? Alas, no. Country Target had apparently missed the memo about the upcoming flame-stitch feeding frenzy and not all of the stuff was out yet. A few bowls here and there, a scarf. No towels. No menswear. Worrisome.

Read more: