Wednesday, July 27, 2011

Portland and Brooklyn

by Adrianne Jeffries

On a cold day in late January, Paul LaRosa, an author and CBS producer, and his wife, Susan, were shopping for cheese at the Park Slope/Gowanus Indoor Winter Farmer’s Market at Third Avenue and Third Street when they struck up a conversation at one of the stands with a tall, clean-cut yoga instructor who had just returned from studying meditation in Thailand.

He had discovered the most marvelous cocoa there, he enthused, and offered them a tiny, wrapped sample of stone-ground, small batch “virgin” chocolate, which he sells in four flavors including Blueberry Lavender and Vanilla Rooibos.

“I had just seen Portlandia,” Mr. LaRosa told The Observer, referring to the indie sitcom. “And as this nice guy began telling us all the trouble he’d gone to to make this chocolate, my head went straight to the first episode, where a young couple cannot order the chicken on the menu without knowing the chicken’s name and whether it had any friends.

“In his eyes it wasn’t a simple chocolate bar, it was this whole thing, it was all wrapped up in Thailand and meditation and yoga and beautiful paper,” Mr. LaRosa went on. “This is a guy you could imagine would be a young Wall Street exec or something but he’s making artisanal chocolate bars in Brooklyn.”

Earlier that month, Brooklynites were passing around a clip of Brian Williams riffing on the ironic glasses frames, homemade beads, shared apartments and gourmet grilled cheeses of their home borough, and the New York Times’s marveling at them. “I’m leaving here to get to an artisanal market that just opened up today!” the anchorman snarked. “It’s a flash artisanal market! The newest thing!”

How often the Connecticut commuter actually gets to the better borough is unknown, but the bit killed. “It was dead on,” said Eric Cunningham, a Carroll Gardens-based comedian, who was inspired to start a website calling on Mr. Williams to run for president.

Heroic though it was, Mr. Williams’s intervention may have been too little too late. Brooklyn’s overwrought mustaches and handmade ice cream in upcycled cups are now well-established facts of life. It’s as if the tumor of hipster culture that formed when the cool kids moved to Williamsburg had metastasized into a cluster of cysts pressing down on parts of the borough’s brain. Around the militantly organic Park Slope Co-op, for example, or Brooklyn Flea in Fort Greene, where you can buy rings glued to typewriter keys as well as used, handmade, vegetable-dyed, vintage Oriental rugs for $1,000. Brooklyn is producing and consuming more of its own culture than ever before, giving rise to a sense of Brooklyn exceptionalism and a set of affectations that’s making the borough look more and more like Portland, Oregon.

“Would you like one of my cool little bags?” the chocolate vendor asked after Mrs. LaRosa bought a few bars to use for baking. No thanks, she said.

So it wasn’t until later, when he passed by again, that Mr. LaRosa noticed a sign above the bags. He took a picture because he was afraid he wouldn’t be believed: “Raaka’s packaging is designed by his friends and printed with soy inks on 100 percent postconsumer-recycled, chlorine-free, processed paper that was made from wind-generated energy.” He put the picture on his blog in a post titled “Brooklandia?”

Portland was “Brooklyn before Brooklyn was Brooklyn,” as NPR correspondent Ari Shapiro once quipped. His colleague Kurt Andersen, host of the public radio show Studio 360 and co-founder of Spy, put it more starkly: “Brooklyn without black people.”

Mr. Andersen co-founded the Portland Brooklyn Project, a “loose sister-cityish entity” to unite what the organization calls “creators of culture … with an interest in the connection between Portland and Brooklyn,” in 2010; it’s since changed hands. “Both suffered from an urban inferiority complex that during the last decade or so has become a superiority complex,” he explained in an email. “Brooklyn at its best today is in lots of ways probably like Manhattan at its best in the middle third of the 20th century, although with less hard-core, playing-for-keeps, drunken, druggy, up-all-night Bohemianism.”

I lived in Portland for two years after college. It’s a delightful place with plenty of drunken, druggy Bohemianism. But, dear Brooklyn, you do not want to go there.

Read more:

Banks Shared Clients’ Profits, but Not Losses

by Emily Lambert

JPMorgan Chase & Company has a proposition for the mutual funds and pension funds that oversee many Americans’ savings: Heads, we win together. Tails, you lose — alone.

Here is the deal: Funds lend some of their stocks and bonds to Wall Street, in return for cash that banks like JPMorgan then invest. If the trades do well, the bank takes a cut of the profits. If the trades do poorly, the funds absorb all of the losses.

The strategy is called securities lending, a practice that is thriving even though some investments linked to it were virtually wiped out during the financial panic of 2008. The trades were supposed to be safe, a way to make a little extra money at little risk.

JPMorgan customers, including public or corporate pension funds of I.B.M., New York State and the American Federation of Television and Radio Artists, ended up owing JPMorgan more than $500 million to cover the losses. But JPMorgan had protected itself on some of these investments and kept the millions of dollars in profit it earned before the trades went awry.

How JPMorgan won while its customers lost provides a glimpse into the ways Wall Street banks can, and often do, gain advantages over their customers. Today’s giant banks not only create and sell investment products, but also bet on those products, and sometimes against them, putting the banks’ interests at odds with those of their customers. The banks and their lobbyists also help fashion financial rules and regulations. And banks’ traders know what their customers are buying and selling, giving them a valuable edge.

Some of JPMorgan’s customers say they are disappointed with the bank. “They took 40 percent of our profits, and even that was O.K.,” said Jerry D. Davis, the chairman of the municipal employee pension fund in New Orleans, which lost about $340,000, enough to wipe out years of profits that it had earned through securities lending. “But then we started losing money, and they didn’t lose along with us.”
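
The lopsided arithmetic is easy to see in miniature. Here is a minimal sketch in Python of the payoff structure the article describes, using the 40 percent profit split Mr. Davis cites; the trade outcomes are invented for illustration, not actual JPMorgan figures.

def fund_payoff(trade_return, bank_profit_share=0.40):
    # Gains on the reinvested collateral are split with the bank;
    # losses fall entirely on the lending fund.
    if trade_return >= 0:
        return trade_return * (1 - bank_profit_share)
    return trade_return

for outcome in (100_000, -100_000):
    print(f"trade result {outcome:+,}: fund keeps {fund_payoff(outcome):+,.0f}")

A $100,000 gain nets the fund $60,000; a $100,000 loss costs it the full $100,000. Heads, we win together; tails, you lose alone.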

Read more:

British Women Golfers in the Rough

by Karen Crouse

No amount of sideways rain at Royal St. George’s could obscure the view that the sun is shining on the British empire. The British Open began with three men from the British Isles among the top four golfers in the world and ended with a fourth, Darren Clarke of Northern Ireland, clasping the champion’s Claret Jug.

Lost in all the buzz about how Britain’s talent is rising like clotted cream to the top of the world golf rankings was the fact that Royal St. George’s, like many celebrated British courses, is exclusionary. To celebrate the resurgence of British golf is to ignore that the women are lagging far behind their male counterparts.

Heading into the Women’s British Open this week at Carnoustie Golf Club, the highest-ranked women from the British Isles are Catriona Matthew of Scotland (36), Melissa Reid of England (39) and her compatriot Laura Davies (68).

The absence of Britons at or near the top is all the more glaring given Britain’s rich golfing history, one in which players of both sexes have figured prominently. In the 1920s, when the sport was ruled by amateurs, the women’s game was the domain of Joyce Wethered, who won five English Ladies Championships, four British Ladies Amateurs and the esteem of Bobby Jones, the legendary American star, who described her as the finest golfer, male or female, that he had ever seen.

Among professionals in Britain, the women’s standard-bearer is Davies, a four-time major winner from Coventry, England, who was the L.P.G.A.’s leading money winner in 1994 and the Player of the Year in 1996.

Those who play and follow the sport suggest golf’s patrician roots in Britain have constricted the women’s professional progress. Neil Squires, who covers golf for the Manchester Evening News, estimated that 90 percent of the country’s golfers are men. There remain clubs, he said, where women are invisible by design.

“Historically, there’s always been an issue with golf and all-male clubs,” Squires said, adding that until recently there was a sign displayed at Royal St. George’s that reflected the prevailing attitude.

“It read ‘No Women, No Dogs in the clubhouse,’” he added. “If you’re a woman wanting to take up golf or even a guy with daughters wanting to take up golf, would you take your daughter along to a place like that?”

Reid, 23, is a two-time winner in Europe who aspires to be the female version of Rory McIlroy, the 22-year-old wunderkind from Northern Ireland who rose to No. 4 in the rankings after his victory in the U.S. Open in June.

“The blokes are doing pretty good,” Reid said this month during the U.S. Women’s Open. “Can we reach that level of success? I think so. For it to happen, we need someone like myself to take the golf world by storm. That would make golf more attractive to young girls.”

Reid accepts there are obstacles she must overcome that McIlroy never had to hurdle. To get ready for Carnoustie, for example, Reid could practice at Holywell Golf Club in Wales, near where she lives, but not on any given Saturday.

“There are no women allowed on the course on Saturdays,” Reid said, adding, “Unfortunately, it’s just the way the world is.”

Reid, a willowy blonde who has gotten more press in England for her good looks than her game, added: “I completely understand golf tradition in Britain. I love the tradition but…” Her voice trailed off.

Read more:

Tuesday, July 26, 2011

Neverware Revives Old Computers to Power a Better Future


by Courtney Boyd Myers

“Do you think computers are meant to crash every three to four years?” I asked Jonathan Hefter, the CEO of Neverware, a start-up based in New York City.

He digs under his desk, laughing, and pulls out a pair of ladies’ pantyhose. “There are two things in this world planned for obsolescence. Computers and pantyhose. They are designed for the dump.”

As an undergrad at Wharton at the University of Pennsylvania, Hefter studied economics. Not wanting to go into finance after graduating in 2009, he spent a year tinkering in his parents’ lonely basement in Englewood, NJ. Though he had never taken a computer class, the concepts of networking came naturally to Hefter. While hashing out his dream of sustainable computing, he developed the world’s first “juicebox.”

In early 2010, he set up two technology pilots in area schools, which proved to himself and others that the concept could work. That spring, at the Kairos Society’s Annual Summit, he was approached by Polaris partner Peter Flint, who invited him to become a resident of Dogpatch Labs.

In May 2010, Hefter, now age 25, founded Neverware, a company that is akin to a fountain of youth for computers, and moved into NYC’s Dogpatch Labs. The company’s flagship product, the JuiceBox a100, is a single server appliance that, when added to a network, can power up to one hundred old desktops running Windows 7. Under Hefter’s desk is a ten-year-old Pentium III, your typical “general piece of crap computer,” with a missing hard drive. Using his college laptop as a monitor, Hefter demos Neverware’s power.

A Note to Readers

Hello.  I don't step out from this side of the curtain too often, but I wanted to make regular readers of this blog aware of a slight change in format going forward.  Yesterday I received a notice from Google regarding possible copyright infringement for something I had posted several weeks ago.  After reading the notice, examining the post and spending time on the chillingeffects.org website, I have to assume the issue involves reproduction of an original article at greater length than the originating source intended (although this is never explicitly stated).

I've posted articles like this before (not being clear on the finer details of copyright law), hoping to enhance continuity and readability, and to help readers avoid having to contend with different formats, typefaces and sometimes even inoperative urls.  Sources are always credited with a link at the end of each post so that interested readers can see an article in its original context - along with other stories, charts, tables, pictures, etc. that might not have been included due to my limited formatting abilities.

However, in terms of copyright liability those efforts appear to be insufficient.  So, from now on only a portion of an article will be displayed, along with a link to the "heart" of the article as it appears at its original source (although what constitutes the "heart" is also very much a subjective matter).  Hopefully this will keep everything safe and above board and not be too much of an inconvenience.

If anyone is interested, I encourage you to visit the chillingeffects.org website and examine the copyright FAQ in detail.  And, if there are any copyright lawyers in the Duck Soup community who can shed further light on this matter, please feel free to contact me.  Even with the format change, I hope you will continue to stop by and enjoy this blog.  Your visits mean a lot to me (and, I hope, the sources that contribute so much to our understanding of the world we live in).  Thank you.

markk

Monday, July 25, 2011

"I can't believe you symbolize peace when you're such a bitch."

Buddy Holly


The Kingdom and the Paywall

by Seth Mnookin

Two weeks ago, I went to the New York Times’ gleaming, modernist, Renzo Piano–designed headquarters on Eighth Avenue in Manhattan to discuss some good financial news with Arthur Sulzberger Jr., the paper’s publisher and the chairman of the New York Times Company. Good news has been in short supply in the world of dead-tree media, and for the Times in particular.

For much of Sulzberger’s nineteen-year tenure, the paper that his family has controlled for more than a century has been embroiled in one crisis or another, ranging from the Jayson Blair fiasco, which led to the overthrow of Howell Raines, the hard-charging editor who had been handpicked by Sulzberger, to the paper’s reporting on the phantom WMDs in Iraq, which some believed had even helped propel the U.S. into war.

Then there were the paper’s financial troubles, which appeared to have pushed it to the brink of extinction. For well over a decade, the Internet had been relentlessly consuming the paper’s business model. On the web, the saying went, information wants to be free; this left institutions like the Times, which invest huge sums in reporting the news, in an existential quandary. In the months after the collapse of the credit market in the fall of 2008, the company was forced to take drastic measures to stay afloat: In January 2009, it granted Mexican telecom mogul Carlos Slim Helú purchase warrants for 15.9 million shares of Times Company stock for the privilege of borrowing $250 million at essentially a junk-bond interest rate of 14 percent. Two months later, in a move redolent with uncomfortable symbolism, the company raised another $225 million through a sale-leaseback deal for its headquarters—which had been built only two years earlier and which, in its understated, environmentally conscious, progressive, user-friendly way, was supposed to be the emblem of the paper’s 21st-century identity. Add on double-digit declines in both circulation and ad pages and the trend lines looked increasingly clear: The New York Times was doomed.

But a funny thing happened on the way to the graveyard. Though the Times’ circulation dipped during the crash years, much of the lost revenue was made up for by doubling the newsstand price, from $1 to $2—evidence, the paper insisted, that its premium audience understood the value of a premium product. In March, after several years of planning and tens of millions in investments, the Times launched a digital-subscription plan—and the early signs were good. In fact, less than 48 hours before my interview, the Times announced it would finish paying back the Carlos Slim loan in full on August 15, three and a half years early. When they were released last week, the company’s second-quarter financial results showed an overall loss largely owing to the write-down of some regional papers, but they also contained a much more important piece of data: The digital-subscription plan—the famous “paywall”—was working better than anyone had dared to hope.

Meanwhile, a phone-hacking scandal was engulfing Rupert Murdoch and News Corp. This was not in itself relevant to the Times, but it carried its own symbolism. Murdoch had made a point, after his purchase of The Wall Street Journal, of suggesting that the Times was vulnerable. “Let the battle begin,” he wrote in a note to Sulzberger. Sulzberger would not be quite human if he didn’t take some satisfaction in his rival’s troubles, especially because an aggressively reported investigation the Times published in its magazine last September was critical in bringing the scandal to light.

The bottom line for the paywall is more than the bottom line: The Times has taken a do-or-die stand for hard-core, boots-on-the-ground journalism, for earnest civic purpose, for the primacy of content creators over aggregators, and may have brought itself back from the precipice. And if that does indeed end up being the case, there’s one unlikely person who deserves most of the credit: Arthur Ochs Sulzberger Jr.

Read more:

The Chart That Should Accompany All Discussions of the Debt Ceiling

by James Fallows


The chart is based on data from the Congressional Budget Office and the Center on Budget and Policy Priorities. Its significance is not partisan (who's "to blame" for the deficit) but intellectual. It demonstrates the utter incoherence of being very concerned about a structural federal deficit while ruling out of consideration the policy that was the largest single contributor to that deficit, namely the Bush-era tax cuts.

An additional significance of the chart: it identifies policy changes, the things over which Congress and the administration have some control, as opposed to largely external shocks, like the repercussions of the 9/11 attacks or the deep worldwide recession following the 2008 financial crisis. Those external events make a big difference in the deficit, and they are the major reason why deficits have increased faster in absolute terms during Obama's first two years than during the last two under Bush. (In a recession, tax revenues plunge, and government spending goes up, partly because of automatic programs like unemployment insurance and partly in a deliberate attempt to keep the recession from getting worse.) If you want, you could even put the spending for wars in Iraq and Afghanistan in this category: those were policy choices, but right or wrong they came in response to an external shock.

The point is that governments can respond to but not control external shocks. That's why we call them "shocks." Governments can control their policies. And the policy that did the most to magnify future deficits is the Bush-era tax cuts. You could argue that the stimulative effect of those cuts is worth it ("deficits don't matter," etc.). But you cannot logically argue that we absolutely must reduce deficits, but that we absolutely must also preserve every penny of those tax cuts. Which I believe precisely describes the House Republican position.

Read more:

The Problem With Memoirs

by Neil Genzlinger

A moment of silence, please, for the lost art of shutting up.


There was a time when you had to earn the right to draft a memoir, by accomplishing something noteworthy or having an extremely unusual experience or being such a brilliant writer that you could turn relatively ordinary occurrences into a snapshot of a broader historical moment. Anyone who didn’t fit one of those categories was obliged to keep quiet. Unremarkable lives went unremarked upon, the way God intended.

But then came our current age of oversharing, and all heck broke loose. These days, if you’re planning to browse the “memoir” listings on Amazon, make sure you’re in a comfortable chair, because that search term produces about 40,000 hits, or 60,000, or 160,000, depending on how you execute it.

Sure, the resulting list has authors who would be memoir-eligible under the old rules. But they are lost in a sea of people you’ve never heard of, writing uninterestingly about the unexceptional, apparently not realizing how commonplace their little wrinkle is or how many other people have already written about it. Memoirs have been disgorged by virtually everyone who has ever had cancer, been anorexic, battled depression, lost weight. By anyone who has ever taught an underprivileged child, adopted an underprivileged child or been an underprivileged child. By anyone who was raised in the ’60s, ’70s or ’80s, not to mention the ’50s, ’40s or ’30s. Owned a dog. Run a marathon. Found religion. Held a job.

So in a possibly futile effort to restore some standards to this absurdly bloated genre, here are a few guidelines for would-be memoirists, arrived at after reading four new memoirs. Three of the four did not need to be written, a ratio that probably applies to all memoirs published over the last two decades. Sorry to be so harsh, but this flood just has to be stopped. We don’t have that many trees left.

Read more:

Wrong Again

by Barry Ritholtz

The recession is well behind us now, and Wall Street seems to think this recovery should be all wrapped up.

Consider this: The federal non-farm jobs report for June was pretty awful. The private sector created 57,000 jobs. Federal, state and local governments cut 39,000 positions (the eighth straight monthly decrease in government employment). We picked up a mere 18,000 net new jobs.

Not a single forecaster in Bloomberg’s monthly survey of 85 Wall Street economists got it anywhere close to right. The most common reaction was “surprise.” That any professional can sincerely claim to be surprised by continued weakness — in employment, GDP or retail sales — was the only revelation.

Let’s put the number into context: In a nation of 307 million people with about 145 million workers, we have to gain about 150,000 new hires a month to maintain steady employment rates. So 18,000 new monthly jobs misses the mark by a wide margin.
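
Ritholtz's back-of-the-envelope math checks out, as a quick sketch in Python shows (the 150,000-a-month breakeven figure is his; everything else follows from the June numbers above):

breakeven = 150_000   # monthly hires needed just to keep pace with labor-force growth
private = 57_000      # private-sector jobs created in June
government = -39_000  # government positions cut

net_new_jobs = private + government          # 18,000
shortfall = breakeven - net_new_jobs         # 132,000
print(f"net new jobs: {net_new_jobs:,}")
print(f"shortfall: {shortfall:,}")
print(f"share of breakeven met: {net_new_jobs / breakeven:.0%}")  # 12%

June's gain covered barely an eighth of what a growing labor force requires.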

Why have analysts and economists on Wall Street gotten this so wrong? In a word: context. Most are looking at the wrong data set, using the post-World War II recession recoveries as their frame of reference.

History suggests the correct frame of reference is not the usual contraction-expansion cycles, but rather credit-crisis collapse and recovery. These are not your run-of-the-mill recessions. They are far rarer, more protracted and much more painful.

Fortunately, a few economists have figured this out and provide some insight into what we should expect. Among the most prescient are professors Carmen M. Reinhart and Kenneth S. Rogoff. Back in January 2008 (!), they published a paper warning that the U.S. subprime mortgage debacle was turning into a full-blown credit crisis. Looking at five previous financial crises — Japan (1992), Finland (1991), Sweden (1991), Norway (1987) and Spain (1977) — the professors warned that we should expect a prolonged slump. These other crises had a number of surprisingly consistent elements:

First, asset market collapses were prolonged and deep. Real housing prices declined an average of 35 percent over six years, while equity prices collapsed an average of 55 percent. Those numbers were stunningly close to what occurred in the U.S. crisis of 2007-09.

Second, they noted that the aftermaths of banking crises “are associated with profound declines in employment.” They found that following a crisis, the average increase in the unemployment rate was 7 percentage points over four years. U.S. unemployment climbed 6 percentage points (from about 4 percent to about 10 percent), while the broadest measure of joblessness gained more than 7 percentage points (from about 9 percent to about 16 percent). Again, they were right on the money.

Third, the professors warned that “government debt tends to explode, rising an average of 86 percent.” Surprisingly, the primary cause is not the costs of bailing out the banking system, but the “inevitable collapse in tax revenues that governments suffer in the wake of deep and prolonged contractions.” They also warned that “ambitious countercyclical fiscal policies aimed at mitigating the downturn” also tend to be costly.

Hmmm, plummeting tax revenues just as the government tries to stimulate the economy . . . does any of this sound familiar? It should.

Read more:
image credit:  Rhett Maxwell, Creative Commons

Smash the Ceiling

by James Surowiecki

In the past few years, the U.S. economy has been beset by the subprime meltdown, skyrocketing oil prices, the Eurozone debt crisis, and even the Tohoku earthquake. Now it’s staring at a new problem—a failure to raise the debt ceiling, which would almost certainly throw the economy back into recession. Unlike those other problems, however, this one would be wholly of our own making. If the economy suffers as a result, it’ll be what a soccer fan might call the biggest own goal in history.

The truth is that the United States doesn’t need, and shouldn’t have, a debt ceiling. Every other democratic country, with the exception of Denmark, does fine without one. There’s no debt limit in the Constitution. And, if Congress really wants to hold down government debt, it already has a way to do so that doesn’t risk economic chaos—namely, the annual budgeting process. The only reason we need to lift the debt ceiling, after all, is to pay for spending that Congress has already authorized. If the debt ceiling isn’t raised, we’ll face an absurd scenario in which Congress will have ordered the President to execute two laws that are flatly at odds with each other. If he obeys the debt ceiling, he cannot spend the money that Congress has told him to spend, which is why most government functions will be shut down. Yet if he spends the money as Congress has authorized him to, he’ll end up violating the debt ceiling.

As it happens, the debt ceiling, which was adopted in 1917, did have a purpose once—it was a way for Congress to keep the President accountable. Congress used to exercise only loose control over the government budget, and the President was able to borrow money and spend money with little legislative oversight. But this hasn’t been the case since 1974; Congress now passes comprehensive budget resolutions that detail exactly how the government will tax and spend, and the Treasury Department borrows only the money that Congress allows it to. (It’s why TARP, for instance, required Congress to pass a law authorizing the Treasury to act.) This makes the debt ceiling an anachronism. These days, the debt limit actually makes the President less accountable to Congress, not more: if the ceiling isn’t raised, it’s President Obama who will be deciding which bills get paid and which don’t, with no say from Congress.

Read more:

Sunday, July 24, 2011

Harold Melvin and the Bluenotes


[with Teddy Pendergrass]

Raking In Hip-Hop Millions and Snorting Your Way To Ruin

by Gus Garcia-Roberts

When Scott Storch was 8 years old, he was dizzied by a soccer cleat to the head. His mom did not take such injuries in stride. She had been apoplectic when Scott lost his baby teeth in a living-room dive five years earlier, leaving him with a Leon Spinks grin. "I was an overly worrisome mother," admits Joyce Yolanda Storch, who goes mainly by her middle name. "I was overbearing to a fault."

Mom banned Scotty from participating in sports. Instead, she enrolled him in piano classes at Candil Jacaranda Montessori in Plantation, about 15 minutes from their Sunrise home. An old jazz pianist named Jack Keller taught him. A singer herself, Yolanda stopped taking weekday gigs so she could drive Scott to the lessons and scraped together enough cash to buy him a baby grand.

The scrawny, creative kid wasn't much of an athlete anyway. But it turns out he was a virtuoso on the keys. By age 12, he was landing paid gigs. As an adult, he parlayed that ability into studio production, eventually becoming one of hip-hop's elite beatmakers. He laid backdrops for nearly every rap or R&B superstar of the past decade, including Jay-Z, Beyoncé, Dr. Dre, Lil Wayne, and 50 Cent.

In 2006, when he was 33, his fee hit six figures per beat, which he could produce in 15 minutes. The money turned the Sunrise kid into a Palm Island Lothario. Hip-hop’s blinged-out white boy lived in an expansive villa in the Miami Beach enclave, kept more than a dozen exotic vehicles — including a $1.7 million sports car — and docked a $20 million yacht.

So Yolanda, who raised Scott and his brother Matthew after she divorced their father in 1983, has reason to cling to the fact that she introduced Scott to the piano. It's the consolation prize of her life. "It's not that I want to toot my own horn, but I was always very supportive of his music," she says. "It's just too bad that everything went sour."

She perches gingerly on a bottomed-out wooden patio chair outside the modest two-bedroom red-brick home she shares with her 88-year-old father, Julius. The years have battered Yolanda’s former starlet looks, but she’s still a handsome woman, instantly identifiable as Scott’s mother by her ghostly fair skin, blue eyes, and prominent jaw. Keeping large eyeglasses atop a nest of bleached hair, she wears pink slippers, gray sweatpants, and a T-shirt bearing a cartoon bird saying, “How about a Christmas goose?” A burned-out Doral Ultra Light 100 is wedged between her fingers.

Yolanda is, to put it one way, quirky. A Catholic convert of Lithuanian-Jewish descent, she's obsessed with all things Italian. Especially Al Pacino. She calls the abstract prospect of meeting the actor "the reason I get up in the morning."

For her and her gifted son, nothing has turned out the way it should have. She watched Scott blow his fortune in spectacular, infamous fashion, giving millions of dollars in diamonds and cars to his girlfriends, who included America’s holy trinity of floozydom: Paris Hilton, Lindsay Lohan, and Kim Kardashian.

In the meantime, Yolanda, who cares full-time for her partially blind father, waited in this $81,000 house for her son to remember her. Instead, Scott descended into a cocaine binge that crashed his career, propelled him into massive financial litigation and bankruptcy, and sent him to rehab.

Read more:
Clare Woods

The Meaninglessness of "Terrorism"

by Glenn Greenwald

For much of the day yesterday, the featured headline on The New York Times online front page strongly suggested that Muslims were responsible for the attacks in Oslo; that led to definitive statements on the BBC and elsewhere that Muslims were the culprits. The Washington Post's Jennifer Rubin wrote a whole column based on the assertion that Muslims were responsible, one that, as James Fallows notes, remains at the Post with no corrections or updates. The morning statement issued by President Obama — "It's a reminder that the entire international community holds a stake in preventing this kind of terror from occurring" and "we have to work cooperatively together both on intelligence and in terms of prevention of these kinds of horrible attacks" — appeared to assume, though (to its credit) did not overtly state, that the perpetrator was an international terrorist group.

But now it turns out that the alleged perpetrator wasn't from an international Muslim extremist group at all, but was rather a right-wing Norwegian nationalist with a history of anti-Muslim commentary and an affection for Muslim-hating blogs such as Pam Geller's Atlas Shrugs, Daniel Pipes, and Robert Spencer's Jihad Watch. Despite that, The New York Times is still working hard to pin some form of blame, even ultimate blame, on Muslim radicals (h/t sysprog):

Terrorism specialists said that even if the authorities ultimately ruled out Islamic terrorism as the cause of Friday’s assaults, other kinds of groups or individuals were mimicking Al Qaeda's brutality and multiple attacks.

"If it does turn out to be someone with more political motivations, it shows these groups are learning from what they see from Al Qaeda," said Brian Fishman, a counterterrorism researcher at the New America Foundation in Washington.


Al Qaeda is always to blame, even when it isn't, even when it's allegedly the work of a Nordic, Muslim-hating, right-wing European nationalist. Of course, before Al Qaeda, nobody ever thought to detonate bombs in government buildings or go on indiscriminate, politically motivated shooting rampages. The NYT speculates that ammonium nitrate fertilizer may have been used to make the bomb because the suspect, Anders Behring Breivik, owned a farming-related business and thus could have access to that material; of course, nobody would have ever thought of using that substance to make a massive bomb had it not been for Al Qaeda. So all this proves once again what a menacing threat radical Islam is.

Read more:
Boats and birds by Alicque
via: