Monday, December 2, 2013

The Big Sleep

One evening in late May, four senior employees of Merck, the pharmaceutical company, sat in the bar of a Hilton Hotel in Rockville, Maryland, wearing metal lapel pins stamped with the word “team.” They were in a state of exhausted overpreparedness. The next morning, they were to drive a few miles to the headquarters of the Food and Drug Administration and attend a meeting that would decide the future of suvorexant, a new sleeping pill that the company had been developing for a decade. Merck’s team hoped to persuade a committee of seventeen, composed largely of neurologists, that suvorexant was safe and effective. The committee, which would also hear the views of F.D.A. scientists, would deliver a recommendation to the agency. If the government approved suvorexant—whose mechanism, inspired partly by research into narcoleptic dogs, is unlike anything on the market—it would be launched within a year. Some industry analysts had described it as a possible blockbuster, a term usually reserved for drugs with annual earnings of a billion dollars. Merck had not created a blockbuster since 2007, when it launched Januvia, a diabetes drug. The company was impatient. A factory in Las Piedras, Puerto Rico, was ready to start production.

David Michelson, who runs Merck’s clinical research in neuroscience, said of suvorexant, “It’s huge. It’s a major product.” He was sitting perfectly still in his chair; his hair flopped a little over his forehead. He looked as if he were waiting in an airport for a very late flight.

For months, in rooms across Merck’s archipelago of mismatched buildings north of Philadelphia, Michelson had taken part in role-playing rehearsals for the F.D.A. meeting. The focus had been on readying Joe Herring, another Merck neuroscientist; he would be the primary speaker, having run the later clinical trials of suvorexant. Herring, a straight-backed, athletic-looking man in his fifties, had just gone up to his room, for an early night. “Joe had to find a way to be authentic,” Michelson recalled. “He had to find a way to engage with the audience without becoming too informal.” During the meeting, Herring would have access to a library of twenty-one hundred and seventy PowerPoint slides.

The Merck team was frustrated. The F.D.A. had just shown them the draft of a presentation, titled “Suvorexant Safety,” that would be delivered by Ronald Farkas, an F.D.A. neuroscientist who had reviewed thousands of pages of Merck data. In a relentless PowerPoint sequence, Farkas made suvorexant sound disquieting, almost gothic. He noted suicidal thoughts among trial participants, and the risk of next-day sleepiness. He quoted from Merck’s patient notes: “Shortly after sleep onset, the patient had a dream that something dark approached her. The patient woke up several times and felt unable to move her arms and legs and unable to speak. Several hours later, she found herself standing at the window without knowing how she got there.” A woman of sixty-eight lay down to sleep “and had a feeling as if shocked, then felt paralyzed and heard vivid sounds of people coming up the stairs, with a sense of violent intent.” A middle-aged man had a “feeling of shadow falling over his body, hunted by enemies, hearing extremely loud screams.”

An F.D.A. presentation that focusses on individual “adverse events”—and draws attention to patients feeling “hunted by enemies”—is discouraging to a drug’s sponsor. Michelson called the presentation “somewhat unusual,” and emitted a dry laugh.

Darryle Schoepp, the head of Merck’s neuroscience division, was at the other end of the table. During the human trials of suvorexant, he noted, it had been taken two hundred and seventy thousand times, and “every time you take a drug it’s an opportunity for something to happen that the user can report.” He added, “Go back to the early days of Ambien. I wonder how many patient days of data they had with Ambien.”

Ambien, which is now available generically as zolpidem, is one of America’s most popular drugs, and it played a role—silent or spoken—in many conversations that I had heard on visits to the Merck offices. Zolpidem was the cheap drug that suvorexant had to take on, if not unseat, in order to succeed in the sleep-medication market. In addition, rising public worry about risks associated with taking Ambien—ranging from amnesiac devouring of Pop-Tarts to premature death—had reduced the F.D.A.’s tolerance for side effects in sleep medications.

John Renger was also at the bar. A forty-four-year-old neuroscientist, he has a round face, cropped hair, and a neat goatee. He helped lead the company to the suvorexant molecule, and ran the first tests on rats, mice, dogs, and rhesus monkeys. He, too, was politely indignant about the F.D.A. “They’ve taken the emphasis off efficacy,” he said, adding, “They’re saying any residual effects are bad. But they’re not looking at the balance—‘What is the improvement in this mechanism?’ ”

The central nervous system is in an ever-adjusting balance between inhibition and excitation. Ambien, like alcohol or an anesthetic, triggers the brain’s main inhibitory system, which depends on binding between GABA—gamma-aminobutyric acid, a neurotransmitter—and GABA receptors on the surface of billions of neurons. GABA receptors can be found throughout the brain, and when they’re activated the brain slows. Ambien encourages the process by sticking to the receptors, holding open the door to the neurotransmitter. Suvorexant, which Merck describes as “rationally designed”—rather than stumbled upon, like most drugs—influences a more precise set of neurotransmitters and receptors. Orexin neurotransmitters, first identified fifteen years ago, promote wakefulness. When suvorexant is in the brain, orexin is less likely to reach orexin receptors. Instead of promoting general, stupefying brain inactivity, suvorexant aims at standing in the way of a keep-awake signal. This difference may or may not come to mean a lot to insomniacs, but Merck’s marketing is likely to encourage the perception that suvorexant ends the dance by turning off the music, whereas a drug like Ambien knocks the dancer senseless.

If the Merck scientists succeeded at the F.D.A., they would be the first to bring an orexin-related drug to market. “It’s an amazing achievement,” Richard Hargreaves, the fourth colleague at the Hilton, said. “Everyone should be really proud.” But, he added, “my worry is that a new mechanism is being evaluated on the science of an old mechanism.”

“With Ambien, you’ve got a drug that’s got basically only onset,” Renger said, dismissively. That is, it sends you to sleep but might not keep you asleep. “Suvorexant has the onset, but it has the great maintenance, especially in the last third of the night, where other drugs fail.” And even though suvorexant keeps working longer than Ambien, suvorexant patients don’t feel groggier afterward, as you might expect. Impassioned, Renger imagined himself addressing the F.D.A.: “Why aren’t you giving this a chance?”

“Drugs usually have some side effects,” Schoepp said. “It’s all benefit-risk.” He added, “There is some dose where suvorexant will be ultimately safe—because nothing will happen. If you go low enough, it becomes homeopathic.”

They stood to go to their rooms. Schoepp murmured, “I’d love to take it right now.”

by Ian Parker, New Yorker |  Read more:
Image: Kenji Aoki

Emma Thompson for Variety magazine
via:

Sunday, December 1, 2013


Seymour Templar, Social Lights 2011
via:

Far From My Tree

[ed. The flipside to Holding Them Closer.]

My eldest son is 20 years old, lives in a house crammed with seven scrabbly roommates, works part time in a restaurant kitchen, doesn’t drive, is a vegetarian, and has homemade tattoos etched into his thighs.

He’s firmly a musician – a drummer in a loud punk band, and he loves nothing more than to tour across North America, playing gigs in sketchy houses in Oakland, Calif., and south Chicago.

He appears to have only one pair of pants – dirty, black cutoff jeans, and his shirts are also of the ripped-off-arms variety. I’m not sure who has been ripping up all his clothes. Maybe there’s a wild dog living in his house.

I’m both proud of and horrified for my boy. His jaw is squarely set, and he’s acutely committed to what he wants to do. And that is to tour with his band in their black-panel van, crisscrossing borders, dodging death in dubious neighborhoods, sleeping on strangers’ couches, and eating vegetarian burritos.

As my children traveled through their teenage years, I emphasized to them: Find your passion and follow it. What I really meant was: Find your passion, but do it in the way I did it. That is, go to college first, get a liberal arts degree, meander through your 20s, and then supplement your undergraduate degree with graduate studies. All while wearing clean, intact clothing.

But what if, as Andrew Solomon so eloquently addresses in his masterpiece, “Far From the Tree,” your child ends up so very different from you? I read “Far From the Tree” because it speaks of children with disabilities (and my youngest son has Down syndrome), but I gained a deeper knowledge of all children who stray from their parents. If we face reality squarely, and give our children the space to be who they want to be, every single child should be different from his parents, and should be allowed and even encouraged to fall far from our trees.

My oldest boy does not show up to family events in his collared shirt and pressed pants. In fact, he rarely shows up at all. He doesn’t respond to calls from grandparents, although he will send thank you texts for birthday gifts, so he still has a sliver of decorum. He’s proudly anti-establishment, and my current lifestyle with my husband (and his stepfather) – living in the suburbs and driving a BMW – clearly disgusts him.

I watch my friends’ children embarking on their second year of college, most of them still living at home with their parents. They are clean-cut, unfailingly polite, sit quietly at dinner parties and patiently dole out answers to questions from adults. Inevitably, someone asks me, “What’s your son doing?” and then I feel a strange mix of pride and apology. “He’s living his life,” I say. “But what graduate program? What path is he taking?” “He’s not in a program,” I say. “He’s working and playing in a band.” They take a deep gulp of wine and look down at their expensive shoes.

by Sue Robins, NY Times |  Read more:
Image: uncredited

All Is Fair in Love and Twitter


Right in the center of South Park, a large, grassy oval near San Francisco’s financial district, there is a rinky-dink playground with slides, ladders and firefighter poles, all dented and dinged, connected by gritty brown pylons. Yet for many in Silicon Valley, this playground is hallowed ground. It was here, one breezy day in 2006, according to legend, that Jack Dorsey ordered burritos with two co-workers, scaled a slide and, in a black sweater and green beanie, like a geeked-out Moses on Mount Sinai, presented his idea for an Internet service that would allow users to update their current status and share what they were doing. “That playground right up there is where I first brought up the idea,” Dorsey, whose present-day uniform is a white Dior dress shirt and tailored dark blazer, told CNBC earlier this year.

In Silicon Valley, ideas are not in short supply. At every coffee shop, beer garden and technology conference, there are legions of start-up founders, like screenwriters clutching their scripts, desperate to show off an app or site that they believe will be the next big thing. Yet around 75 percent of start-ups fail. Usually it’s not simply because the ideas are bad (although some certainly are), but because of a multitude of other problems. They are either ahead of their time or too late. Some have too much money and collapse under their own weight; others have trouble raising the capital they need to survive. Still others implode because of the poor management and infighting of founders who have no experience actually running companies.

For the ones that make it, success often comes down to a lot of luck. YouTube was one of dozens of video-sharing sites in existence when it was purchased for $1.65 billion by Google. Instagram wasn’t the first app on iTunes to share photos, yet Facebook still paid $1 billion for it. Twitter wasn’t the first place to share a status online; it was certainly the luckiest. Celebrities joined the service, then a queen, presidents, news organizations and, of course, Justin Bieber. Seven years after it was founded, the company with a catchy name had more than 2,000 employees, more than 200 million active users and a market value estimated at $16 billion. When it makes its initial public offering, many of Twitter’s co-founders, employees and investors are going to become very, very rich. Evan Williams, a co-founder who financed the company out of his pocket during its first year, is expected to make more than $1 billion. Dorsey, the company’s executive chairman and putative mastermind, will make hundreds of millions.

But in Silicon Valley, luck can be a euphemism for something more sinister. Twitter wasn’t exactly conceived in a South Park playground, and it certainly wasn’t solely Dorsey’s idea. In fact, Dorsey forced out the man who was arguably Twitter’s most influential co-founder before the site took off, only to be quietly pushed out of the company himself later. (At which point, he secretly considered joining his biggest competitor.) But, as luck would have it, Dorsey was able to weave a story about Twitter that was so convincing that he could put himself back in power just as it was ready to become a mature company. And, perhaps luckiest of all, until now only a handful of people knew what really turned Twitter from a vague idea into a multibillion-dollar business.

Genesis stories tend to take on an outsize significance in Silicon Valley. Steve Jobs dropped out of Reed College, traveled the world, dated Joan Baez and helped create a revolutionary computing company. Mark Zuckerberg wrote the initial code for Facebook while ranking the attractiveness of girls in his Harvard dorm room. In the Valley, these tales are called “the Creation Myth” because, while based on a true story, they exclude all the turmoil and occasional back stabbing that comes with founding a tech company. And while all origin stories contain some exaggerations, Twitter’s is cobbled together from an uncommon number of them.

by Nick Bilton, NY Times |  Read more:
Image: Paul Sahre

The Empire Strikes Back

You can hardly turn on the television or open a newspaper without hearing about the nation’s impressive, much celebrated housing recovery. Home prices are rising! New construction has started! The crisis is over! Yet beneath the fanfare, a whole new get-rich-quick scheme is brewing.

Over the last year and a half, Wall Street hedge funds and private equity firms have quietly amassed an unprecedented rental empire, snapping up Queen Anne Victorians in Atlanta, brick-faced bungalows in Chicago, Spanish revivals in Phoenix. In total, these deep-pocketed investors have bought more than 200,000 cheap, mostly foreclosed houses in cities hardest hit by the economic meltdown.

Wall Street’s foreclosure crisis, which began in late 2007 and forced more than 10 million people from their homes, has created a paradoxical problem. Millions of evicted Americans need a safe place to live, even as millions of vacant, bank-owned houses are blighting neighborhoods and spurring a rise in crime. Lucky for us, Wall Street has devised a solution: It’s going to rent these foreclosed houses back to us. In the process, it’s devised a new form of securitization that could cause this whole plan to blow up -- again.

Since the buying frenzy began, no company has picked up more houses than the Blackstone Group, the largest private equity firm in the world. Using a subsidiary company, Invitation Homes, Blackstone has grabbed houses at foreclosure auctions, through local brokers, and in bulk purchases directly from banks the same way a regular person might stock up on toilet paper from Costco.

In one move, it bought 1,400 houses in Atlanta in a single day. As of November, Blackstone had spent $7.5 billion to buy 40,000 mostly foreclosed houses across the country. That’s a spending rate of $100 million a week since October 2012. It recently announced plans to take the business international, beginning in foreclosure-ravaged Spain.

Few outside the finance industry have heard of Blackstone. Yet today, it’s the largest owner of single-family rental homes in the nation -- and of a whole lot of other things, too. It owns part or all of the Hilton Hotel chain, Southern Cross Healthcare, Houghton Mifflin publishing house, the Weather Channel, Sea World, the arts and crafts chain Michael’s, Orangina, and dozens of other companies.

Blackstone manages more than $210 billion in assets, according to its 2012 Securities and Exchange Commission annual filing. It’s also a public company with a list of institutional owners that reads like a who’s who of companies recently implicated in lawsuits over the mortgage crisis, including Morgan Stanley, Citigroup, Deutsche Bank, UBS, Bank of America, Goldman Sachs, and of course JP Morgan Chase, which just settled a lawsuit with the Department of Justice over its risky and often illegal mortgage practices, agreeing to pay an unprecedented $13 billion fine.

In other words, if Blackstone makes money by capitalizing on the housing crisis, all these other Wall Street banks -- generally regarded as the main culprits in creating the conditions that led to the foreclosure crisis in the first place -- make money too.

An All-Cash Goliath

In neighborhoods across the country, many residents didn’t have to know what Blackstone was to realize that things were going seriously wrong.

Last year, Mark Alston, a real estate broker in Los Angeles, began noticing something strange happening. Home prices were rising. And they were rising fast -- up 20% between October 2012 and the same month this year. In a normal market, rising home prices would mean increased demand from homebuyers. But here was the unnerving thing: the homeownership rate was dropping, the first sign for Alston that the market was somehow out of whack.

The second sign was the buyers themselves.

“I went two years without selling to a black family, and that wasn’t for lack of trying,” says Alston, whose business is concentrated in inner-city neighborhoods where the majority of residents are African American and Hispanic. Instead, all his buyers -- every last one of them -- were besuited businessmen. And weirder yet, they were all paying in cash.

by Laura Gottesdiener, TomDispatch | Read more:
Image: via:

Jean-Pierre Ruel, banc de poissons 2009
via:

Dating Tips for Uptown Divorcées: Middle-Aged Millionaires Are Just Not That Into You

I was at my usual banquette table at Cipriani catching up with my dear friend and fellow gala charity chair, an impossibly blond and glamorous socialite. She looked up, over her grilled salmon and leeks. “Do you have anyone for my friend Leanne? Her divorce just became final.”

I recalled a lithe brunette who looked good in Lilly, making the rounds of the Hamptons charity cocktail circuit along with her pint-sized now-ex-husband.

“Is she realistic yet?” I asked.

“I think so.”

“Good.” I sipped my Bellini.

My friend and I, while an unlikely matchmaking duo, have been informally setting up divorced friends and “children of” on the Upper East Side for years, with solid results. We always say we should charge a commission for our dating service, but that temptingly profitable idea would be too déclassé.

Our biggest challenge, time and again, is matching up middle-aged divorcées in the “pre-realist” stage, who have not realized that they have a choice of sex, money or companionship—but not necessarily all three in the same package.

“How did she make out in the divorce?” I asked my friend.

“All I know,” she revealed, “is that the husband made her include her Birkins as part of the settlement.” She added: “At the current retail price.” Bien sûr!

“She most likely will want the money, then.” I paused, Rolodexing in my head the range of the newly wed and nearly dead. As I gave the hand signal for the check, I thought of a few years’ divorced friend who could use a chatelaine for his manor, and she was an ideal prospect.

“Oh yes, I think I have a good old-fashioned septuagenarian billionaire in Palm Beach for her. Not exactly scintillating, but his real estate portfolio has a personality all its own.”

“Perfect,” she said. “I’ll call her with the good news.”

A few years back, I co-wrote a fairly well-known relationship book for women called Closing The Deal; the premise was that two married men’s advice could help turn single women into deal closers. While we had no formal training as relationship experts, we just implicitly understood that if women understood men better, they’d have a better shot at closing the deal. Knowing your audience is always key, whether personally or professionally, and we offered advice on topics from hygiene to foreplay.

Where most rich divorcées fail is in assuming they can replace their husbands with a newer model pretty much like the old one. Sorry to say, this tends not to be the case. Most of the time, the divorced well-to-do male is not looking for his equal, but rather for a sexretary from the Midwest, preferably without an opinion. As one recently divorced hedge funder told me: “Being married to a smart, opinionated woman is work! Now I just want tits on a stick, a blonde wig and someone to tell me I’m great when I get home.”

Women who take a tough line often wind up lonelier for it. At a political fund-raiser, my wife Dana and I were chatting with a well-regarded financier’s ex-wife, who clearly exhibited pre-realistic dating tendencies. She laid out her requests like the Marshall Plan: “My age or younger. I won’t date a geezer. Rich—the richer the better. Sexy. Okay, let’s just cut to the chase: my ex if he had abs and a personality.”

“Don’t you think you shouldn’t have a list?” Dana asked innocently.

“That’s for other people,” she snapped.

She is still on the prowl.

by Richard Kirshenbaum, NY Observer | Read more:
Image: Brian Taylor

Holding Them Closer

Nearly 30 years ago, sociologist Robert Bellah and his team of co-authors in Habits of the Heart (1985) described the American parenting ideal as the production of independent children who “leave home,” both figuratively and literally. To never leave home, they wrote, violated the cardinal American virtue of self-reliance, contradicting self-understandings that individuals should “earn everything we get, accept no handouts or gifts, and free ourselves from our families of origin.” The essence of parenting was preparing children for just such a separation, reflecting the American belief that a meaningful life could be had only by breaking free from family and giving birth, in a sense, to oneself. “However painful the process of leaving home, for parents and for children, the really frightening thing for both would be the prospect of the child never leaving home.” Successful launching was the quest, and the empty nest, even though it required adjustment, the reward. If these were the habits of the parenting heart in the 1980s, American parents clearly have had a change of heart.

Consider these recent findings from the Culture of American Families Survey, conducted by the Institute for Advanced Studies in Culture. Two-thirds of American parents of school-age children now say they would “willingly support a 25-year-old child financially” if needed. Two-thirds say they would encourage a 25-year-old to move back home if he or she had difficulty affording housing. Parents still hope, of course, that their adult children will attain financial independence, but this aspiration is no stronger than the hope that children will retain “close ties with parents and family”—both are considered “essential” by about half of American parents. The quest for long-term connection with children has taken center stage. Parenting is still about formation, but its overriding concern has pivoted from formation to connection. One has only to consider parents’ responses to the statement “I hope to be best friends with my children when they are grown” to know something new is happening at home. Almost three-quarters of today’s parents of school-age children (72 percent) agree that they eventually want to be their children’s best friends; only 17 percent disagree. The successful formation and launching of children still matters; it is just that parents don’t want to launch them very far.

Ambiguous Adulthood

With this as their goal, it is no wonder American parents increasingly welcome twenty-something “boomerang children” back into “accordion families.” Compared to a generation ago, increasing numbers of adults in their twenties and thirties regularly call home to their parents, and regularly call their parents’ households “home.” American parents, meanwhile, more than parents in some nations, have felt the need to justify welcoming adult children back—as helping to finance a child’s schooling or the purchase of a separate residence, for example. But the practice has become so commonplace that justifying narratives are less and less necessary; the stigma attached to living with Mom and Dad is waning. Sociologist Katherine Newman suggests in The Accordion Family: Boomerang Kids, Anxious Parents, and the Private Toll of Global Competition (2012) that the very notions of “adulthood” and “independence” are increasingly ambiguous. Criteria such as residential independence, the creation of a new family, and economic autonomy have given way to something more elusive: You become an adult when you feel like one. It is the self-perception of autonomy and freedom that matters. Adulthood has become a subjective category.

Or the criteria may simply have changed, with young adults substituting personal autonomy (in their purchases, leisure pursuits, and lifestyles) and popular cultural knowledge (of the sort derived from exposure to popular media) for traditional signs of adulthood. Consider these findings from the Culture of American Families Survey. The typical older teen (16–19) has both a cell phone and a social networking account. She texts and talks with friends on her cell phone multiple times per day. She spends an hour or two daily on the Internet and streams videos several times a week. And she gets together with friends with no adult supervision about once a week. (It is important to note here that the Culture of American Families Survey is a nationally representative study of parents of school-age children. So these findings present parents’ understandings of what their children are doing.) What is more, the social backdrop for her semi-autonomous and plugged-in world often includes parents who themselves have positive attitudes toward the new technologies or, if not, at least accept them as the wave of the future, a wave their child cannot miss. So beyond the subjective feelings of adulthood, older teenagers’ very real freedom in consumption and leisure choices, their media connections with the world beyond the home, and the cultural knowledge that accumulates from these activities and links reinforce their perceptions that they deserve to be treated as adults.

Something unacknowledged in their rush to “adulthood” is not only their lack of economic independence but also their incompetence in practical matters. Older teens—legally “adults”—may mock their parents’ ignorance of the latest web trends or media celebrities, but they are often stumped by things their parents, at the same age, would have considered basic: changing a tire, replacing a button, ironing clothes, applying for a job, and the like. A recent analysis of such practical torpor by psychologists Joseph Allen and Claudia Worrell Allen goes so far as to suggest that 25 years of age is the new 15. In Escaping the Endless Adolescence: How We Can Help Our Teenagers Grow Up Before They Grow Old (2009), they note, “We’ve worked with macho teenage boys—high school seniors who were more than able to take their licks on an athletic field or jousting with peers—who were reduced to near paralysis when told to go to a shopping center on their own and approach store managers about possible job opportunities. So far removed and so beyond them did the adult world seem that these teens felt unable to enter it alone, even in the most rudimentary ways.” Other researchers have pointed to the incomplete development of the adolescent brain as the source of adolescent troubles, but the Allens instead highlight the insular nature of the adolescent world. Adolescents, they contend, grow up in a peer-dominated bubble, cut off from adult contacts, adult roles, and the adult world in general (other than their parents). This leaves little beyond their gadgets, studies, and peer-centered activities to serve as the basis for a broader sense of life’s meaning and purpose. What is more, the values absorbed from their media-defined world contrast markedly, the authors contend, with the traits valued by their parents and the adult world in general.

Overall, the Allens suggest, this adolescent bubble makes it harder for young people to engage in more meaningful pursuits or to make more significant contributions, leaving them to study, text, and tweet into the wee hours of the night. For their part, young adults defer plans for creating their own families, tangle fitfully with a challenging labor market, and rely on their parents’ financial and practical support until well into their twenties, if not beyond. Young adults from economically secure families sometimes opt for extended periods of self-discovery and vocational experimentation, relying on parents as security blankets for hard times. The more demanding route of competitive higher education, an ambitious career track, and the shouldering of vocational and family responsibilities is seen as something that can wait. But this leisurely stroll toward “leaving home” doesn’t preclude the embrace of an adult identity, even if it is framed in the language of “emerging adults” or “young adults.”

by Carl Desportes Bowman, Hedgehog Review |  Read more:
Image: Hedgehog Review

Saturday, November 30, 2013


Philippe Chancel. Desert Spirit
via:

Utagawa Hiroshige (1797-1858)
via:

Bowl with human feet, made in Egypt, c.3900-3650 BC (source).
via:
[ed. I used to have a replica of this (from the Met). I wonder what happened to it?]

How Could This Happen to Annie Leibovitz?

[ed. I missed this fascinating article when it first came out in 2009. According to the most recent entry for Annie in Wikipedia, her debt has since been restructured, although it's still not clear whether she has retained rights to her incredible catalog of work.]

Annie Leibovitz clearly hated what a lifetime-achievement award implied about her—that the best days of her 40-year career were behind her. “Photography is not something you retire from,” the 59-year-old Leibovitz said from the stage, accepting the honor from the International Center of Photography last May at Pier 60. She was turned out in a simple black dress and glasses, her long straight hair a little unruly, as usual. Photographers, she said, “live to a very old age” and “work until the end.” She noted that Lartigue lived to be 92, Steichen 93, and Cartier-Bresson 94. “Irving Penn is going to be 92 next month, and he’s still working.” Then her tone turned rueful. “Seriously, though, this really is a big deal,” she said, hoisting her Infinity Award statuette, her voice quavering to the point where it seemed she might cry. “It means so much to me, you know, especially right now. It’s, it’s a very sweet award to get right now. I’m having some tough times right now, so … ”

The 700 friends and colleagues who had come to share the evening with her knew about the “tough times.” Two vendors had sued her for more than $700,000 in unpaid bills, and in February, the New York Times ran a front-page story reporting that in order to secure a loan, Leibovitz had essentially pawned the copyrights to her entire catalogue of photographs. Even those who had known she was in trouble were shocked by the extent of it. Leibovitz was responsible for some of the world’s most iconic magazine covers—a naked John Lennon with Yoko Ono for Rolling Stone, Demi Moore, naked and pregnant, for Vanity Fair. She had moved from celebrity portraiture to fashion photography to edgier, more artistic pictures; some considered her the heir to Richard Avedon or Helmut Newton.

Despite being a compulsive perfectionist whose shoots cost a fortune to produce, Leibovitz was very much in demand. People spoke of a fabled “contract for life” from Condé Nast, thought to bring her as much as $5 million annually. (The estimate didn’t seem far-fetched; a decade ago, the Times reported that Condé Nast chairman Si Newhouse had instructed Vanity Fair editor Graydon Carter not to “nickel and dime” Leibovitz over the issue of an extra quarter-million dollars in her contract.) She was said to earn a day rate of $250,000 just to set foot in a studio for an advertising job for clients like Louis Vuitton. Over the years, Leibovitz had bought and sold a small fortune in real estate—a penthouse in Chelsea with a photo studio nearby, a sprawling townhouse in Greenwich Village, a compound in Rhinebeck once owned by the Astor family, and a Paris pied-à-terre overlooking the Seine. Virtually anyone (the Queen of England, for instance) would agree to be photographed by her, and she had a longtime relationship with the celebrated writer and intellectual Susan Sontag.

Lately, however, Leibovitz’s life had taken a decidedly dark turn. Her reference to “tough times” was significantly understated. In the past five years, Sontag and both of Leibovitz’s parents have died. Her debts now total a staggering $24 million, consolidated with one lender with whom she is engaged in a lawsuit and due in September. If she can’t meet that deadline, she may lose her homes and the rights to her life’s body of work.

Friends say Leibovitz has begun to think of herself less as a celebrity artist leading a charmed life and more as a single mother of three fighting to keep a roof over her head and food on her family’s table. It isn’t surprising, then, that she bristled at a lifetime-achievement award. The fear of no longer working is terrifying to her. She has to work. What remains mystifying is the simple question on everyone’s mind that night: How on earth could something like this have happened to Annie Leibovitz?

by Andrew Goldman, New York Magazine |  Read more:
Image: John Keatley/Redux

The Death and Life of Great Internet Cities


In its prime, the ancient neighborhood of Petsburg may have had as many as 10,000 homes. And though they are now abandoned, traces of their inhabitants still hide in the ruins. “I just LOVE attention!,” a resident named Gypsy once wrote in her long-forgotten journal. Another resident named Cosmo confessed, “I don’t mean to be a bad cat, but being good is very difficult.” Names of these neighbors linger on the walls and in guestbooks. There was Fuzzy, Tinker, Nipper, Spice, Boomer, Lady Sustina, Whisky, and countless others too, now scattered, if they're still anywhere at all.

Petsburg appears to have enjoyed a bustling commercial district during its heyday. There were shops where one could do everything from having their age in dog years calculated to adopting one of the community’s abandoned pets. The neighborhood library recorded the histories of ancient creatures, like, for instance, the Egyptian Mau (“To gaze upon this beautiful and engaging [cat] is an opportunity to view a living relic,” one historian wrote.) Meanwhile, its scientists painstakingly chronicled the stages of gerbil pregnancies (“12/23/98: Moonflash and Chequers could double as gourds! they are both expecting any time now,” a researcher observed.) Visitors traveled the neighborhood on Webrings, leaving their mark in each home’s guestbook. The local newspaper was the Petsburg Post, though no copy has survived the community’s complete collapse.

Petsburg was just one of the 40 neighborhoods that made up the metropolis of Geocities, which, in its 15 years of existence, housed some 38 million online residents. It was arguably the world’s first and last Internet city. Were it a physical place, it would have been by far the largest urban area in the world.

It was shuttered in 2009, but several archive groups and individuals have sought to preserve it on mirror sites like Reocites and Oocities, and in massive torrent files. A relic from the early days of the Internet, frozen and downloadable, the files tell the story of the city’s rise and downfall, the story of how we found our place online.

Geocities began in 1994, advertising an enticing 15 megabytes of free space to any homesteader looking to make their place on the Web. The World Wide Web was only a few years old when this digital Northwest Ordinance was issued, and so its users, often referred to as netizens, were necessarily having their first interactions with the Internet, learning to make a place for themselves in the newly discovered online world. Millions of netizens with little to no experience or understanding of how a webpage “should” look utilized the site’s built-in development tools to create clapboard homes spattered with stray GIFs, looping MIDI files, and busy backgrounds. It was the Internet's Wild West.

It is easy to dismiss these pages as a sort of outsider art. But outside of what? There was no such thing as a personal page before Geocities. And, in almost every meaningful sense of that word, there is no equivalent today. Consider a page from the Heartland neighborhood, where one resident wrote, “Hi! My name is Sherry, my husband is Richard. We have three children, Colleen, Alicia and James and we are out here in the desert of southern California.” Further down on the page is a link to “Richard’s Original Bedtime Stories”: “Once upon a time there was a mad scientist, and this mad scientist had a laboratory. In his lab the scientist had a shelf and on the shelf was a jar. In that jar was a pickle and in the pickle was DNA. This DNA was different, it was dinosaur DNA...”

Where today do families publish their homemade bedtime stories about giant pickles? Sites like these have simply disappeared.

Geocities was bought by Yahoo in 1999, during the height of its popularity. Then along came Myspace in 2003, Facebook in 2004, and Twitter in 2006. And by 2009, Petsburg, Heartland, and the rest of Geocities had been shuttered. The world had chosen the pre-fab aesthetics of social networks over the 15-megabyte tracts of open land offered by Geocities. Jacques Mattheij, the founder of the Geocities archive site Reocities, explained this choice to me: “The Geocities environment offered more freedom for expression. Don't like blue? Then Facebook probably isn't for you.”

Whatever we may ultimately make of our move towards sites like Facebook, it’s almost certainly the case that, for the average netizen, it was a movement away from online literacy. Instead of slogging through the HTML editor of Geocities—and coming to terms with how these tools can be used to express oneself in a digital space—we chose the sleek, standardized layouts of Facebook and Myspace.

“There's no way to look at the Facebook page and not know that you're on Facebook,” Jason Scott, who worked on a Geocities preservation project with Archive Team, told me. “In fact, it's hard to be on Facebook and even feel like people are contributing much beyond links and a paragraph of text.”

Online and before, we’ve always made these choices. The American west is littered with forgotten towns, single-economy communities that cropped up to mine gold, build railroads, or raise livestock as the country made its stubborn progress to the Pacific. Those that survived eventually lost their pioneer character.

by Joe Kloc, Daily Dot |  Read more:
Image: Jason Reed

Close the Store, It’s the Year’s Big Game in Alabama

[ed. See also: The Most Poisonous Rivalry in Sports. Postscript: It was Mayhem. Man, what a game.]

The Saturday evening Mass at St. Michael the Archangel Catholic Church here will be a cappella because the organist has football tickets. Weddings and funerals will be rarities. Car dealerships are expected to go dark. The J & M Bookstore will close at kickoff — and reopen after the game only if the home team wins.

The normal course of civil society in this state is transformed every year when Auburn University and the University of Alabama meet for what some believe is just a football game and what others see as a test of moral virtue. But the 78th matchup in what is now known as the Iron Bowl will be the first time the winner will grab the usual statewide bragging rights while simultaneously keeping its national title hopes alive and earning a spot in the Southeastern Conference championship game.

As a result, almost everything outside Jordan-Hare Stadium figures to sputter to a halt for a four-hour stretch on Saturday as top-ranked Alabama seeks an undefeated regular season, No. 4 Auburn enjoys its abrupt resurgence as a football power, and the state proves there are few limits to its infatuation with all things pigskin-related.

And the observances won’t end on Saturday. The Sunday sermon at the Auburn Church of Christ will be about humility because, as its sign along South College Street put it, “We’ll either have it or need it.”

“It’s gigantic. It’s for all the marbles,” said Eric Stamp, who owns a print shop in Auburn. “People change their Thanksgiving weekend plans to accommodate the Iron Bowl.” (For decades, the game was played in Birmingham, known for iron and steel production.)

The Alabama faithful concur. “Everyone knows going in that if your team loses, it will hurt you for decades. Just the mention of it in 25 years will cause certain people to retch in despair,” said Warren St. John, a former reporter for The New York Times whose book “Rammer Jammer Yellow Hammer,” documenting the zeal of Crimson Tide supporters, was once the textbook for a University of Alabama course about the culture surrounding Southern football.

by Alan Blinder, NY Times |  Read more:
Image: Dustin Chambers