Tuesday, November 20, 2012

New MySpace Seems Too Good to be True


A funny thing happened after my drive home from a Los Angeles press junket where MySpace executives Tim Vanderhook, Chris Vanderhook, and Justin Timberlake gave members of the media a detailed tour of the yet-to-be-unveiled site: I changed my mind.

While dictating detailed notes into my iPhone during the drive, I decided that the second coming of MySpace is like an extremely beautiful woman who also possesses the intelligence of a scholar — too much to absorb.

If you can have too much of a good thing, the reincarnated MySpace is that thing, I reasoned.

But when I sat down to write this story and actually started exploring MySpace and its 53 million tracks, I got lost in the experience. Suddenly, the words of the executive brothers from earlier in the day came back to me.

“You give users a couple of days and they become hooked,” CEO Tim said. He was responding to my query as to whether MySpace was too convoluted, too complicated.

He’s right. I’ve spent a few hours with the site. I think I’m hooked.

“The Internet just became boring,” COO Chris said to a room of eight reporters (and several handlers), all of whom were hoping to hear more from Mr. Sexy-Back. “There was nothing fun anymore … I want to make it fun to use MySpace.”

It is fun, and so I have to amend my conclusion to this: Wrapped in a pretty package and equipped with brains to match, MySpace feels too good to be true. It’s not. No joke.

Gushing aside, there’s a full review to be had, not all so glowing, so let’s get to it.

The profile

Log on to MySpace and you’ll find a design so noticeably different from anything else you’ve encountered that it will be hard to look away.

Designed for artists and their fans, the new MySpace, said every executive and product manager I talked to, is not a redesign. It’s a new product with a new purpose and a design meant to evoke emotion. MySpace wants to draw people into relationships with creatives and the content they produce.

“The standout feature is the design. No doubt,” Chris said. “We really changed the level of expectations of consumers about what design is for a website.”

by Jennifer Van Grove, Venture Beat |  Read more:
Image: Unknown

Monday, November 19, 2012


Denis Peploe Still Life with Fish 1970
via:

Lennart Olson, Röd signal / Red light 1970

Deadhead


The first memory I have of the Grateful Dead is of a classmate in sixth grade telling me he’d gone to see them with his older sister. He reported that the band consisted of a bunch of hairy old guys in baggy clothes sitting on a stage eating spaghetti. It occurred to me later that he might have made this up, or that his sister had perhaps said something about “noodling.” I’ve since concluded that this would have been the band’s fall, 1980, stand at Radio City Music Hall, when the Dead, most definitely hairy and baggily clothed (though none of them yet over forty), opened each night with an acoustic set, during which a few of them sat on stools. I’ve never found anything in the literature regarding spaghetti.

Otherwise, I thought of the Dead at that time, if I thought of them at all, as some kind of malevolent cult, or, at least, a heavy-metal outfit, like Black Sabbath. A kid saw the iconography around—the skulls and skeletons—and imagined dark, angry noise. When I was thirteen, I bought an album of greatest hits, “Skeletons from the Closet,” and discovered that I’d been wrong. Many of the songs were delicate acoustic numbers with rustic harmonies and bouncy, if obscure, lyrics. There was some country, some folk, some blues, a Chuck Berry rocker. The lead singer, or one of them, had a delicate tenor. No Ozzy Osbourne, this guy. Maybe they really were just hippies who ate spaghetti onstage. It didn’t seem like much. Give me the guy who bites heads off bats. Give me “War Pigs.”  (...)

It is very easy, and in many circles compulsory, to make fun of the Dead. “What does a Deadhead say when the drugs wear off? ‘This music sucks.’ ” The Dead, more than any band of their stature, have legions of haters—real hostility—as typified by Dave Marsh’s remark, in Playboy, that they were “the worst band in creation.”

What’s to hate? Even the fanatic can admit to a few things. The Dead were musically self-indulgent, and yet, to some ears, harmonically shallow. They played one- and two-chord jams that went on for twenty or thirty minutes. One live version of “Dark Star,” a modal vamp based on the A mixolydian scale, with two short verses and no bridge, clocked in at forty-eight minutes. (Oh, to have been in Rotterdam!) Even their straightforward songs could go on for ten or twelve minutes. Pop-craft buffs, punkers, and anyone steeped in the orthodoxy of concision tend to plug their ears to the noodling, while jazz buffs often find it unsophisticated and aimless. The Dead’s sense of time was not always crisp. It’s been said that the two drummers, in the eighties, sounded like sneakers in a dryer. For those attracted to the showy side of rock, the Dead were always an unsightly ensemble, whose ugliness went undiminished in middle age—which happened to coincide with the dawn of MTV. They were generally without sex appeal. Bob Weir, their showman and heartthrob, might be said to be an exception, but he spent much of the eighties performing in short cutoff jean shorts and lavender tank tops—a sight even more troubling, I’d submit, than that of Garcia circa 1984, drooling on his microphone as he fought off the nods. Even the high-tech light shows of later years and the spaceship twinkle of their amplifiers could not compensate for a lumpy stage presence. They could be sloppy, unrehearsed. They forgot lyrics, sang out of key, delivered rank harmonies, missed notes, blew takeoffs and landings, and laid down clams by the dozen. Their lyrics were often fruity—hippie poetry about roses and bells and dew. They resisted irony. They were apolitical. They bombed at the big gigs. They unleashed those multicolored dancing bears.

Most objectionable, perhaps, were the Deadheads, that travelling gang of phony vagabonds. As unironic as the Dead may have been, Deadheads were more so. Not for them the arch framings and jagged epiphanies of punk. They dispensed bromides about peace and fellowship as they laid waste to parking lots and town squares. Many came by the stereotypes honestly: airheads and druggies, smelling of patchouli and pot, hairy, hypocritical, pious, ingenuous, and uncritical in the extreme. They danced their flappy Snoopy dance and foisted their hissy bootlegs on roommates and friends, clearing dance floors and common rooms. The obnoxious ones came in many varieties: The frat boys in their Teva sandals and tie-dyed T-shirts, rolling their shoulders to the easy lilt of “Franklin’s Tower.” The so-called spinners, dervishes in prairie skirts and bare feet. The earnest acoustic strummers of “Uncle John’s Band,” the school-bus collective known as the Rainbow Family, the gaunt junkies shuffling around their vans like the Sleestaks in “Land of the Lost”—they came for the party, more than for the band. Sometimes they didn’t even bother to go in to the show. They bought into the idea, which grew flimsier each year, that following a rock band from football stadium to football stadium, fairground to fairground, constituted adventure of the Kerouac kind.

This is not to say that adventures were not had. At a certain point, later in the band’s career, the Dead became, especially on the East Coast, a token of entitlement squandered or lightly worn. Consider the preppy Deadhead, in his new Jetta, and his counterpart, the Jewish Deadhead, with his boxes of blank Maxells. In “Perspectives on the Grateful Dead,” a volume of scholarly writings published in 1999, one author, in an essay called “Why Are There So Many Jewish Deadheads?,” attempts to explain the affinity in terms of the Diaspora’s search for spiritual meaning (neshama) and community (chevra). The goyish trustafarians lacked that excuse. At any rate, they all quailed in the presence of the biker Deadheads, the leather-vested roughnecks crying out for “U.S. Blues,” but were heartened, in absentia, to have seen them there. The tough guys seasoned the scene with authenticity and menace.

The Dead’s reputation and press coverage have always fixated on the culture that sprouted up around the band, and that then began to choke it, like a weed. When the Dead stopped touring, many of the fans moved on to other travelling carnivals—often to the so-called jam bands that had drawn inspiration and a music-industry approach (though not quite a musical vocabulary) from the Dead. This, too, was often taken to be a kind of indictment: the Dead are sometimes damned by the company their fans keep. The conflation of the Dead with, say, the Dave Matthews Band—incongruous as the two may be musically—can really smart.

There is a silent minority, though, of otherwise unobjectionable aesthetes who, as “Grateful Dead” has become a historical record, rather than a living creative enterprise, have found themselves rekindling a fascination with the band’s recorded legacy. These are the tapeheads, the geeks, the throngs of workaday Phil Schaaps, who approach the band’s body of work with the intensity and the attention to detail that one might bring to birding, baseball, or the Talmud. They may be brain surgeons, lawyers, bartenders, or even punk-rock musicians. Really, it shouldn’t matter what they do, or what they smell like, or whether they can still take a toke without keeling over. It’s the music, and not the parking lot, that’s got them by the throat.

by Nick Paumgarten, New Yorker |  Read more:
Photo: Robert Altman/Michael Ochs Archives/Getty

Making Cents


I'm sure each generation of musicians feels they've lived through a time of tremendous change, but the shifts I've witnessed in my relatively short music career-- from morphing formats to dissolving business models-- do seem extraordinary. The first album I made was originally released on LP only, in 1988-- and my next will likely only be pressed on LP again. But in between, the music industry seems to have done everything it could to screw up that simple model of exchange; today it is no longer possible for most of us to earn even a modest wage through our recordings.

Not that I am naively nostalgic for the old days-- we weren't paid for that first album, either. (The record label we were signed to at the time, Rough Trade, declared bankruptcy before cutting us even one royalty check.) But the ways in which musicians are screwed have changed qualitatively, from individualized swindles to systemic ones. And with those changes, a potential end-run around the industry's problems seems less and less possible, even for bands who have managed to hold on to 100% of their rights and royalties, as we have.

Consider Pandora and Spotify, the streaming music services that are becoming ever more integrated into our daily listening habits. My BMI royalty check arrived recently, reporting songwriting earnings from the first quarter of 2012, and I was glad to see that our music is being listened to via these services. Galaxie 500's "Tugboat", for example, was played 7,800 times on Pandora that quarter, for which its three songwriters were paid a collective total of 21 cents, or seven cents each. Spotify pays better: For the 5,960 times "Tugboat" was played there, Galaxie 500's songwriters went collectively into triple digits: $1.05 (35 cents each).

To put this into perspective: Since we own our own recordings, by my calculation it would take songwriting royalties for roughly 312,000 plays on Pandora to earn us the profit of one--one-- LP sale. (On Spotify, one LP is equivalent to 47,680 plays.)
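
To make that arithmetic concrete, here is a minimal sketch (Python, my own illustration rather than anything from the piece) that reproduces the break-even figures from the royalty numbers quoted above. The per-play rates come straight from the article; the roughly $8.40 of profit per LP is not stated directly, it is simply what the 312,000-play figure implies.

```python
# Back-of-the-envelope check of the streaming-vs-LP arithmetic above.
# Per-play figures come from the article; the ~$8.40 LP profit is
# implied by (not stated in) the 312,000-play Pandora figure.

pandora_rate = 0.21 / 7800    # songwriting royalty per Pandora play, USD
spotify_rate = 1.05 / 5960    # songwriting royalty per Spotify play, USD
implied_lp_profit = 312_000 * pandora_rate   # ~$8.40 per LP sale

def plays_to_match_one_lp(per_play_rate: float, lp_profit: float = implied_lp_profit) -> int:
    """How many streams produce the same royalty income as one LP sale."""
    return round(lp_profit / per_play_rate)

if __name__ == "__main__":
    print(f"Implied profit per LP: ${implied_lp_profit:.2f}")
    print(f"Pandora plays per LP:  {plays_to_match_one_lp(pandora_rate):,}")  # ~312,000
    print(f"Spotify plays per LP:  {plays_to_match_one_lp(spotify_rate):,}")  # ~47,680
```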

Or to put it in historical perspective: The "Tugboat" 7" single, Galaxie 500's very first release, cost us $980.22 for 1,000 copies-- including shipping! (Naomi kept the receipts)-- or 98 cents each. I no longer remember what we sold them for, but obviously it was easy to turn at least a couple bucks' profit on each. Which means we earned more from every one of those 7"s we sold than from the song's recent 13,760 plays on Pandora and Spotify. Here's yet another way to look at it: Pressing 1,000 singles in 1988 gave us the earning potential of more than 13 million streams in 2012. (And people say the internet is a bonanza for young bands...) (...)

Which gets to the heart of the problem. When I started making records, the model of economic exchange was exceedingly simple: make something, price it for more than it costs to manufacture, and sell it if you can. It was industrial capitalism, on a 7" scale. The model now seems closer to financial speculation. Pandora and Spotify are not selling goods; they are selling access, a piece of the action. Sign on, and we'll all benefit. (I'm struck by the way that even crowd-sourcing mimics this "investment" model of contemporary capitalism: You buy in to what doesn't yet exist.)

But here's the rub: Pandora and Spotify are not earning any income from their services, either. In the first quarter of 2012, Pandora-- the same company that paid Galaxie 500 a total of $1.21 for their use of "Tugboat"-- reported a net loss of more than $20 million. As for Spotify, their latest annual report revealed a loss in 2011 of $56 million.

Leaving aside why these companies are bothering to chisel hundredths of a cent from already ridiculously low "royalties," or paying lobbyists to work a bill through Congress that would lower those rates even further-- let's instead ask a question they themselves might consider relevant: Why are they in business at all?

by Damon Krukowski, Pitchfork |  Read more:
Illustration: Unknown

The Twerps



[ed. See also: Who Killed the Twinkie?]
via:

Living the Four Star Life

Then-defense secretary Robert M. Gates stopped bagging his leaves when he moved into a small Washington military enclave in 2007. His next-door neighbor was Mike Mullen, the chairman of the Joint Chiefs of Staff at the time, who had a chef, a personal valet and — not lost on Gates — troops to tend his property.

Gates may have been the civilian leader of the world’s largest military, but his position did not come with household staff. So, he often joked, he disposed of his leaves by blowing them onto the chairman’s lawn.

“I was often jealous because he had four enlisted people helping him all the time,” Gates said in response to a question after a speech Thursday. He wryly complained to his wife that “Mullen’s got guys over there who are fixing meals for him, and I’m shoving something into the microwave. And I’m his boss.”

Of the many facts that have come to light in the scandal involving former CIA director David H. Petraeus, among the most curious was that during his days as a four-star general, he was once escorted by 28 police motorcycles as he traveled from his Central Command headquarters in Tampa to socialite Jill Kelley’s mansion. Although most of his trips did not involve a presidential-size convoy, the scandal has prompted new scrutiny of the imperial trappings that come with a senior general’s lifestyle.

The commanders who lead the nation’s military services and those who oversee troops around the world enjoy an array of perquisites befitting a billionaire, including executive jets, palatial homes, drivers, security guards and aides to carry their bags, press their uniforms and track their schedules in 10-minute increments. Their food is prepared by gourmet chefs. If they want music with their dinner parties, their staff can summon a string quartet or a choir.

The elite regional commanders who preside over large swaths of the planet don’t have to settle for Gulfstream V jets. They each have a C-40, the military equivalent of a Boeing 737, some of which are configured with beds.

Since Petraeus’s resignation, many have strained to understand how such a celebrated general could have behaved so badly. Some have speculated that an exhausting decade of war impaired his judgment. Others wondered if Petraeus was never the Boy Scout he appeared to be. But Gates, who still possesses a modest Kansan’s bemusement at Washington excess, has floated another theory.

“There is something about a sense of entitlement and of having great power that skews people’s judgment,” Gates said last week.  (...)

“You can become completely disconnected from the way people live in the regular world — and even from the modest lifestyle of others in the military,” Barno said. “When that happens, it’s not necessarily healthy either for the military or the country.”

Although American generals have long enjoyed many perks — in World War II and in Vietnam, some dined on china set atop linen tablecloths — the amenities afforded to today’s military leaders are more lavish than anyone else in government enjoys, save for the president.

by Rajiv Chandrasekaran and Greg Jaffe, Washington Post |  Read more:
Photo credit: Not indicated

Sunday, November 18, 2012

When the Nerds Go Marching In


The Obama campaign's technologists were tense and tired. It was game day and everything was going wrong.

Josh Thayer, the lead engineer of Narwhal, had just been informed that they'd lost another one of the services powering their software. That was bad: Narwhal was the code name for the data platform that underpinned the campaign and let it track voters and volunteers. If it broke, so would everything else.

They were talking with people at Amazon Web Services, but all they knew was that they had packet loss. Earlier that day, they lost their databases, their East Coast servers, and their memcache clusters. Thayer was ready to kill Nick Hatch, a DevOps engineer who was the official bearer of bad news. Another of their vendors, PalominoDB, was fixing databases, but needed to rebuild the replicas. It was going to take time, Hatch said. They didn't have time.

They'd been working 14-hour days, six or seven days a week, trying to reelect the president, and now everything had been broken at just the wrong time. It was like someone had written a Murphy's Law algorithm and deployed it at scale.

And that was the point. "Game day" was October 21. The election was still 17 days away, and this was a live action role playing (LARPing!) exercise that the campaign's chief technology officer, Harper Reed, was inflicting on his team. "We worked through every possible disaster situation," Reed said. "We did three actual all-day sessions of destroying everything we had built."

Hatch was playing the role of dungeon master, calling out devilishly complex scenarios that were designed to test each and every piece of their system as they entered the exponential traffic-growth phase of the election. Mark Trammell, an engineer who Reed hired after he left Twitter, saw a couple game days. He said they reminded him of his time in the Navy. "You ran firefighting drills over and over and over, to make sure that you not just know what you're doing," he said, "but you're calm because you know you can handle your shit."

The team had elite and, for tech, senior talent -- by which I mean that most of them were in their 30s -- from Twitter, Google, Facebook, Craigslist, Quora, and some of Chicago's own software companies such as Orbitz and Threadless, where Reed had been CTO. But even these people, maybe *especially* these people, knew enough about technology not to trust it. "I think the Republicans fucked up in the hubris department," Reed told me. "I know we had the best technology team I've ever worked with, but we didn't know if it would work. I was incredibly confident it would work. I was betting a lot on it. We had time. We had resources. We had done what we thought would work, and it still could have broken. Something could have happened."

In fact, the day after the October 21 game day, Amazon services -- on which the whole campaign's tech presence was built -- went down. "We didn't have any downtime because we had done that scenario already," Reed said. Hurricane Sandy hit on another game day, October 29, threatening the campaign's whole East Coast infrastructure. "We created a hot backup of all our applications to US-west in preparation for US-east to go down hard," Reed said.

"We knew what to do," Reed maintained, no matter what the scenario was. "We had a runbook that said if this happens, you do this, this, and this. They did not do that with Orca."

THE NEW CHICAGO MACHINE vs. THE GRAND OLD PARTY

Orca was supposed to be the Republican answer to Obama's perceived tech advantage. In the days leading up to the election, the Romney campaign pushed its (not-so) secret weapon as the answer to the Democrats' vaunted ground game. Orca was going to allow volunteers at polling places to update the Romney camp's database of voters in real time as people cast their ballots. That would supposedly allow them to deploy resources more efficiently and wring every last vote out of Florida, Ohio, and the other battleground states. The product got its name, a Romney spokesperson told NPR, because orcas are the only known predator of the one-tusked narwhal.

The billing the Republicans gave the tool confused almost everyone inside the Obama campaign. Narwhal wasn't an app for a smartphone. It was the architecture of the campaign's sophisticated data operation. Narwhal unified what Obama for America knew about voters, canvassers, event-goers, and phone-bankers, and it did it in real time. From the descriptions of the Romney camp's software that were available then and now, Orca was not even in the same category as Narwhal. It was like touting the iPad as a Facebook killer, or comparing a GPS device to an engine. And besides, in the scheme of a campaign, a digitized strike list is cool, but it's not, like, a gamechanger. It's just a nice thing to have.

by Alexis C. Madrigal, The Atlantic |  Read more:
Photo by Daniel X. O'Neil

As Boom Lures App Creators, Tough Part Is Making a Living


Shawn and Stephanie Grimes spent much of the last two years pursuing their dream of doing research and development for Apple, the world’s most successful corporation.

But they did not actually have jobs at Apple. It was freelance work that came with nothing in the way of a regular income, health insurance or retirement plan. Instead, the Grimeses tried to prepare by willingly, even eagerly, throwing overboard just about everything they could.

They sold one of their cars, gave some possessions to relatives and sold others in a yard sale, rented out their six-bedroom house and stayed with family for a while. They even cashed in Mr. Grimes’s 401(k).

“We didn’t lose any sleep over it,” said Mr. Grimes, 32. “I’ll retire when I die.”

The couple’s chosen field is so new it did not even exist a few years ago: writing software applications for mobile devices like the iPhone or iPad. Even as unemployment remained stubbornly high and the economy struggled to emerge from the recession’s shadow, the ranks of computer software engineers, including app writers, increased nearly 8 percent in 2010 to more than a million, according to the latest available government data for that category. These software engineers now outnumber farmers and have almost caught up with lawyers.

Much as the Web set off the dot-com boom 15 years ago, apps have inspired a new class of entrepreneurs. These innovators have turned cellphones and tablets into tools for discovering, organizing and controlling the world, spawning a multibillion-dollar industry virtually overnight. The iPhone and iPad have about 700,000 apps, from Instagram to Angry Birds.

Yet with the American economy yielding few good opportunities in recent years, there is debate about how real, and lasting, the rise in app employment might be.

Despite the rumors of hordes of hip programmers starting million-dollar businesses from their kitchen tables, only a small minority of developers actually make a living by creating their own apps, according to surveys and experts. The Grimeses began their venture with high hopes, but their apps, most of them for toddlers, did not come quickly enough or sell fast enough.

And programming is not a skill that just anyone can learn. While people already employed in tech jobs have added app writing to their résumés, the profession offers few options to most unemployed, underemployed and discouraged workers.

One success story is Ethan Nicholas, who earned more than $1 million in 2009 after writing a game for the iPhone. But he says the app writing world has experienced tectonic shifts since then.

“Can someone drop everything and start writing apps? Sure,” said Mr. Nicholas, 34, who quit his job to write apps after iShoot, an artillery game, became a sensation. “Can they start writing good apps? Not often, no. I got lucky with iShoot, because back then a decent app could still be successful. But competition is fierce nowadays, and decent isn’t good enough.”

The boom in apps comes as economists are debating the changing nature of work, which technology is reshaping at an accelerating speed. The upheaval, in some ways echoing the mechanization of agriculture a century ago, began its latest turbulent phase with the migration of tech manufacturing to places like China. Now service and even white-collar jobs, like file clerks and data entry specialists or office support staff and mechanical drafters, are disappearing.

“Technology is always destroying jobs and always creating jobs, but in recent years the destruction has been happening faster than the creation,” said Erik Brynjolfsson, an economist and director of the M.I.T. Center for Digital Business.

by David Streitfeld, NY Times |  Read more:
Photo: Daniel Rosenbaum for The New York Times

Brandi Carlile



Pt. Reyes, CA by Alex Fradkin
via:

Maybe the Web's Not a Place to Stick Your Ads

"Steve Jobs hates the internet." So jokes a contact of mine whenever he laments what he regards as Apple's relatively paltry investment in web advertising. The point that person -- who once had a stake in that investment -- is trying to make is not that Mr. Jobs is actually a closet Luddite but that Apple, one of the world's strongest brands, isn't as experimental as it should be and, as such, isn't contributing enough to the gold rush that is the digital-advertising business.

That's one way to look at it. Another is that regardless of what it lays out on ads, Apple has a greater online presence than most brands that spend many times what it does. Consider that in December, Apple sites had the 10th-best traffic figures on the web. Those sites, which grabbed more unique visitors than many of the most popular sites where Apple would place its own ads -- including The New York Times, NBC Universal and ESPN -- are destinations. Plus, there's the endless gusher of Apple-obsessed jabbering on any number of blogs and social networks. Oh, and Apple did manage to lay out $32 million in measured media online in 2007, more than double the amount it spent the year before and four times its 2005 outlay.

Look closely at the disappointment that an advanced marketer in 2008 wouldn't be willing to spend more than that to spray its brand all over an Internet already saturated by it and you'll see very clearly some misperceptions plaguing the marketing business today. First, there's the basic mistake that marketing is synonymous with advertising. Then, there's the underexamined assumption so popular in marketing circles of all kinds that when it comes to helping companies create brands or move product the Internet's greatest use is as an ad medium.

Are we having the right conversation?

What you're about to read is not an argument for making over web marketing as a factory for destination websites or for making every brand a content player. Not every brand has as much natural pull as Apple and, anyway, there have already been high-profile flubs in the if-you-build-a-content-channel-they-will-come department (Bud.TV, anyone?). This, however, is a call to give some thought to a question that's not asked enough about the Internet: Should it even be viewed as an ad medium? After all, in some quarters of the broader marketing world, the habit of looking at advertising as the most important tool in the marketers' toolbox is undergoing intense interrogation. Consider the growth of the word-of-mouth marketing business, premised on the notion that it's people, not corporations, who help other people make consumer decisions. Or look at the growing importance put on public relations and customer-relationship management both in marketing circles and even in the c-suite.

The same conversation should be going on around the Internet. Trends like those listed suggest the possibility of a post-advertising age, a not-too-distant future where consumers will no longer be treated as subjects to be brainwashed with endless repetitions of whatever messaging some focus group liked. That world isn't about hidden persuasion, but about transparency and dialogue and at its center is that supreme force of consumer empowerment, the Internet. But when you look at how the media and marketing business packages the Internet -- as just more space to be bought and sold -- you have to worry that the history of mass media is just trying to repeat itself. Rarely a fortnight goes by without some new bullish forecast for ad growth that works to stoke digital exuberance within media owners that often drowns out critical thinking about the medium itself.

Here's the issue: The internet is too often viewed as inventory, as a place where brands pay for the privilege of being adjacent to content, like prime-time TV and glossy magazines, relics of the pre-blog days when getting into the media game actually required infrastructure and distribution. The presumed power of that adjacency has provided the groundwork for the media industry for decades and long ago calcified into an auspicious economic reality that the big media companies are trying to take with them to the digital future. For the media seller, ads and ad revenue might be all that's left.

by Matthew Creamer, Ad Age |  Read more:
Image: The Guardian

That Was Fast

So, late Friday, we reported on how the Republican Study Committee (the conservative caucus of House Republicans) had put out a surprisingly awesome report about copyright reform. You can read that post to see the details. The report had been fully vetted and reviewed by the RSC before it was released. However, as soon as it was published, the MPAA and RIAA apparently went ballistic and hit the phones hard, demanding that the RSC take down the report. They succeeded. Even though the report had been fully vetted and approved by the RSC, executive director Paul S. Teller has now retracted it, sending out the following email to a wide list of folks this afternoon:
From: Teller, Paul
Sent: Saturday, November 17, 2012 04:11 PM
Subject: RSC Copyright PB

We at the RSC take pride in providing informative analysis of major policy issues and pending legislation that accounts for the range of perspectives held by RSC Members and within the conservative community. Yesterday you received a Policy Brief on copyright law that was published without adequate review within the RSC and failed to meet that standard. Copyright reform would have far-reaching impacts, so it is incredibly important that it be approached with all facts and viewpoints in hand. As the RSC’s Executive Director, I apologize and take full responsibility for this oversight. Enjoy the rest of your weekend and a meaningful Thanksgiving holiday....

Paul S. Teller
Executive Director
U.S. House Republican Study Committee
Paul.Teller@mail.house.gov
http://republicanstudycommittee.com
The idea that this was published "without adequate review" is silly. Stuff doesn't just randomly appear on the RSC website. Anything being posted there has gone through the same full review process. What happened, instead, was that the entertainment industry's lobbyists went crazy, and some in the GOP folded.

by Mike Masnick, TechDirt |  Read more:

Saturday, November 17, 2012


René Magritte. Three nudes in an interior, 1923
via:

David Foster Wallace: The Nature of Fun


[ed. Excerpt from DFW's posthumously published collection Both Flesh and Not on a writer's motivation.]

But it's still all a lot of fun. Don't get me wrong. As to the nature of that fun, I keep remembering this strange little story I heard in Sunday school when I was about the size of a fire hydrant. It takes place in China or Korea or someplace like that. It seems there was this old farmer outside a village in the hill country who worked his farm with only his son and his beloved horse. One day the horse, who was not only beloved but vital to the labour-intensive work on the farm, picked the lock on his corral or whatever and ran off into the hills. All the old farmer's friends came around to exclaim what bad luck this was. The farmer only shrugged and said, "Good luck, bad luck, who knows?" A couple of days later the beloved horse returned from the hills in the company of a whole priceless herd of wild horses, and the farmer's friends all come around to congratulate him on what good luck the horse's escape turned out to be. "Good luck, bad luck, who knows?" is all the farmer says in reply, shrugging. The farmer now strikes me as a bit Yiddish-sounding for an old Chinese farmer, but this is how I remember it. But so the farmer and his son set about breaking the wild horses, and one of the horses bucks the son off his back with such wild force that the son breaks his leg. And here come the friends to commiserate with the farmer and curse the bad luck that had ever brought these accursed wild horses on to his farm. The old farmer just shrugs and says: "Good luck, bad luck, who knows?" A few days later the Imperial Sino-Korean Army or something like that comes marching through the village, conscripting every able-bodied male between 10 and 60 for cannon-fodder for some hideously bloody conflict that's apparently brewing, but when they see the son's broken leg, they let him off on some sort of feudal 4-F, and instead of getting shanghaied the son stays on the farm with the old farmer. Good luck? Bad luck?

This is the sort of parabolic straw you cling to as you struggle with the issue of fun, as a writer. In the beginning, when you first start out trying to write fiction, the whole endeavour's about fun. You don't expect anybody else to read it. You're writing almost wholly to get yourself off. To enable your own fantasies and deviant logics and to escape or transform parts of yourself you don't like. And it works – and it's terrific fun. Then, if you have good luck and people seem to like what you do, and you actually get to get paid for it, and get to see your stuff professionally typeset and bound and blurbed and reviewed and even (once) being read on the AM subway by a pretty girl you don't even know, it seems to make it even more fun. For a while. Then things start to get complicated and confusing, not to mention scary. Now you feel like you're writing for other people, or at least you hope so. You're no longer writing just to get yourself off, which – since any kind of masturbation is lonely and hollow – is probably good. But what replaces the onanistic motive? You've found you very much enjoy having your writing liked by people, and you find you're extremely keen to have people like the new stuff you're doing. The motive of pure personal fun starts to get supplanted by the motive of being liked, of having pretty people you don't know like you and admire you and think you're a good writer. Onanism gives way to attempted seduction, as a motive.

Now, attempted seduction is hard work, and its fun is offset by a terrible fear of rejection. Whatever "ego" means, your ego has now gotten into the game. Or maybe "vanity" is a better word. Because you notice that a good deal of your writing has now become basically showing off, trying to get people to think you're good. This is understandable. You have a great deal of yourself on the line, now, writing – your vanity is at stake. You discover a tricky thing about fiction writing: a certain amount of vanity is necessary to be able to do it at all, but any vanity above that certain amount is lethal. At this point 90+% of the stuff you're writing is motivated and informed by an overwhelming need to be liked. This results in shitty fiction. And the shitty work must get fed to the wastebasket, less because of any sort of artistic integrity than simply because shitty work will make you disliked. At this point in the evolution of writerly fun, the very thing that's always motivated you to write is now also what's motivating you to feed your writing to the wastebasket. This is a paradox and a kind of double bind, and it can keep you stuck inside yourself for months or even years, during which you wail and gnash and rue your bad luck and wonder bitterly where all the fun of the thing could have gone.

The smart thing to say, I think, is that the way out of this bind is to work your way somehow back to your original motivation: fun. And, if you can find your way back to the fun, you will find that the hideously unfortunate double bind of the late vain period turns out really to have been good luck for you. Because the fun you work back to has been transfigured by the unpleasantness of vanity and fear, an unpleasantness you're now so anxious to avoid that the fun you rediscover is a way fuller and more large-hearted kind of fun. It has something to do with Work as Play. Or with the discovery that disciplined fun is more fun than impulsive or hedonistic fun. Or with figuring out that not all paradoxes have to be paralysing. Under fun's new administration, writing fiction becomes a way to go deep inside yourself and illuminate precisely the stuff you don't want to see or let anyone else see, and this stuff usually turns out (paradoxically) to be precisely the stuff all writers and readers share and respond to, feel. Fiction becomes a weird way to countenance yourself and to tell the truth instead of being a way to escape yourself or present yourself in a way you figure you will be maximally likeable. This process is complicated and confusing and scary, and also hard work, but it turns out to be the best fun there is.

by David Foster Wallace, The Guardian |  Read more:
Photograph: © Gary Hannabarger/Corbis

Our New $237/month Health Insurance Plan

As noted in past articles, I’ve had a pretty cozy health insurance situation up to this point. Growing up in Canada, I was blissfully unaware of the issue, since like virtually all other rich nations, that country provides universal healthcare for all citizens. I took advantage of that system for exactly two major health events: being born in the early 1970s, and a broken ankle after a bike accident in the late 1990s. Both times, the hospital got the job done well.

Moving to the United States, I found the choice of employer-offered health insurance plans confusing, so I just went with the cheapest one. Occasional gaps in coverage occurred as I hopped between employers throughout the early 2000s, but I didn’t notice since I was fortunate enough to have no occasion to visit a doctor during those years.

Then early retirement came and my wife was kind enough to throw me under the umbrella of coverage offered by her part-time employer for the last five years. Although I was grateful, I was not able to take advantage of the insurance outside of an annual visit to the doctor for a checkup. But it did help out greatly by paying most of the bill for the hospital birth of our son.

At last, she quit her part-time job, the free insurance ended, and we were forced to think for ourselves earlier this fall. So all of the health history above went into deciding how to cover ourselves for the rest of our adult lives, during which we will probably never be conventionally employed again.

The thing about insurance is that it is best enjoyed as a game of numbers and probabilities – not feared as a nightmare of imagined outcomes. As I noted long ago in Insurance: A tax on people who are bad at math?, there are only two situations in which I buy insurance:
  • If I am significantly riskier than the insurance company thinks I am, or
  • If the consequences of being uninsured would be too disastrous for me to handle, yet still have a reasonable chance of occurring
For car insurance, the choice is clear: my car is worth about $7,000 right now, so if I destroyed it in a crash, replacing it would not make a big dent in the ‘Stash. Plus, I’ve never been in an accident, and my car lives in a cozy garage and rarely gets used (meaning I am probably even less risky than the insurance company expects). So I don’t buy collision or comprehensive insurance.

Health insurance is different: medical care is expensive in the US, with lifetime costs for major conditions potentially reaching to a million dollars or more. On top of that, my young son is a wild card who is more likely than me to injure himself while playing, and I still have the slightly dangerous hobbies of mountain biking and snowboarding. We may even be slightly riskier than the insurance company estimates, making the choice to buy health insurance a positive one.

The next step was looking at our own health care spending over the 13 years we’ve lived in the US:
  • From 1999-2005, costs were negligible: less than one check-up per year each, with no treatments or prescriptions. They were covered by insurance, but even if paid out of pocket, this would have averaged to under $200 per year.
  • In 2006, the birth of the boy and related issues racked up a bill of about $20,000 (a routine surgical intervention was needed, quadrupling the cost), $4,500 of which we had to pay ourselves.
  • From 2006 to the present, we have averaged one doctor checkup each per year, plus one antibiotic prescription per year between us, which if paid out of pocket would have cost about $600 per year.
Total medical spending (mostly covered by insurance): about $25,000
Total premiums paid by employers to insurance companies on our behalf: about $100,000

Hey, there’s an unexpected result! We took a 12-year period which included the once-in-a-lifetime (for us) event of a hospital birth of a baby with added surgery, and it still ended up that the insurance premiums were about four times higher than the insurance benefits. This told me that I should probably shop carefully for insurance, in order to get something that protects me from those million-dollar illnesses, but does not attempt to pay for any hundred-dollar incidents, since the cost for that extra protection is clearly very high.

The next stop was an insurance comparison engine. We used ehealthinsurance.com* to do this search, which allowed me to see offerings from the companies that compete specifically in my area – sorted by price. I was pleased to note that prices drop rapidly as the annual deductible rises – meaning most health care expenses are statistically the lower cost ones, and the million-dollar illnesses are indeed very rare (otherwise the premiums would be different).

The winning plan for us was one called the “Saver80 United Health One” plan from United Healthcare, with a quoted price of $219/month** for the family (two 38-year-old adults and a 6-year-old boy). The price is low because it comes with a relatively whopping $10,000-per-person / $20k-per-family deductible, meaning we are very unlikely to ever use this coverage. But at the same time, covering $10-20k in the event of a catastrophe would not be a significant hardship for us, especially given that this is an unlikely event. Even if the expense were to reoccur annually for decades, we could adjust our lifestyle as needed, or earn more income, or get a job with insurance coverage, or make any number of other changes – assuming we even survived that long with such a serious condition. So it passes the test of putting a safe cap on expenses.
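
As a quick illustration of why a high-deductible plan wins for a family with this spending history, here is a small Python sketch of the total-cost comparison. The $219 premium and $10k per-person deductible are from the quote above; the lower-deductible comparison plan ($450/month with a $1,000 deductible) is a made-up placeholder, and the assumption that the plan pays 100% of everything past the deductible is a deliberate simplification.

```python
# Compare total annual cost (premiums + out-of-pocket) of two plans under
# two spending scenarios. The high-deductible plan's numbers come from the
# quote above; the low-deductible plan is purely hypothetical.

def annual_cost(monthly_premium: float, deductible: float, care_billed: float) -> float:
    """Total yearly cost, simplified: the plan pays everything past the deductible."""
    out_of_pocket = min(care_billed, deductible)
    return 12 * monthly_premium + out_of_pocket

plans = {
    "Saver80-style ($219/mo, $10k per-person deductible)": (219, 10_000),
    "Hypothetical low-deductible ($450/mo, $1k deductible)": (450, 1_000),
}

# A typical year (~$600 of care) vs. a bad year like the $20k hospital birth.
for scenario, billed in [("typical year", 600), ("bad year", 20_000)]:
    for name, (premium, deductible) in plans.items():
        print(f"{scenario:12s} | {name}: ${annual_cost(premium, deductible, billed):,.0f}")
```

The bad year naturally favors the low-deductible plan, but it happened once in 13 years; over a long run of typical years the cheaper premium dominates, and the occasional bad year costs at most the deductible, which is exactly the "safe cap on expenses" logic above.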

All plans these days also provide one free checkup (or “annual physical”) doctor visit per year, with no copay or deductible at all. The value of this alone is worth 10-15% of the annual premium of our new plan.

by MMM | Read more: