Tuesday, May 14, 2013

Angelina Jolie Has Done Something Extraordinary

[ed. As have many other extraordinary women. By the way, mutations in the so-called 'cancer gene' BRCA1 can be detected with a blood test, and women with a family history of breast or ovarian cancer should consider getting tested.]

Of course, Angelina Jolie is not the first actress to have had a mastectomy, that most medical of terms referring to the removal of at least one of the anatomical attributes that actresses are expected to hoik up for the sake of their career. In fact, off the top of my head, I can name four: Christina Applegate, Olivia Newton John, Lynn Redgrave and Kathy Bates have all publicly discussed their mastectomies.

Nor is she the first to have a preventative double mastectomy: Sharon Osbourne (not an actress but very much a woman in the public eye) announced only last year that she had one after discovering, as she told Hello! magazine, that she had "the breast-cancer gene".

Yet while Jolie may not be the first, she has done something that is – by any standards – pretty extraordinary and brave, even on top of having a preventative double mastectomy. She is certainly the highest-profile woman to make such an announcement in a long time, and she is arguably the one with the most at stake. For a young, beautiful actress to announce that she has had her breasts removed is, as career moves go, somewhat akin to a handsome leading man announcing he is gay – and that either should count as a career risk at all is both disgusting and ridiculous. Ultimately, she has challenged not just her own public image but also the wearisome cliche of what makes a woman sexy, and how a woman considered to be sexy talks about her body.

Judging from her clear, calm and plain-speaking article in the New York Times discussing why she elected to undergo a double mastectomy, Jolie views publicising her decision as simply a matter of public service:

"I chose not to keep my story private because there are many women who do not know that they might be living under the shadow of cancer. It is my hope that they, too, will be able to get gene tested, and that if they have a high risk they, too, will know they have strong options," she writes, while acknowledging the issues of financial access that prevent too many women from getting tested and treated. (...)

That breasts do not exist just to turn on other people will not come as a surprise to any sentient adult human being. Nor, it should go without saying but sadly does not, do breasts make the woman. But brutal, mature reality does not generally have much of a place in the fantasy land where the myths of celebrities and public perception intermix. In fact, in this fantasy land of celebrity puffery and tabloid nonsense, Angelina Jolie was, only 24 hours ago, still, in the eyes of the media, the sex-crazed, blood-drinking, man-stealing seductress (albeit one with six children) that she has been pretty much since she entered the public eye decades ago. Indeed, only last weekend I read an article – and I'm using that term in the loosest sense – claiming that Jolie was so determined to have her wedding before Jennifer Aniston's that she and Brad Pitt had already booked a "romantic getaway honeymoon" for themselves. Now we know that, far from looking up "sexxxxxy hotels" on the internet while having mind-blowing sexy sex, Pitt and Jolie have actually been otherwise engaged at the Pink Lotus Breast Center, where Jolie underwent her double mastectomy. Rarely has the disjunct between celebrity gossip rubbish and the actual truth been so ridiculously exposed.

by Hadley Freeman, The Guardian |  Read more:
Photo: AP

Monday, May 13, 2013


Philip Barlow, Sea of Glass.
via:

Harry Benson, Little Rock, Arkansas, 1992.
via:

Why I Despise The Great Gatsby

The best advice I ever got about reading came from the critic and scholar Louis Menand. Back in 2005, I spent six months in Boston and, for the fun of it, sat in on a lit seminar he was teaching at Harvard. The week we were to read Gertrude Stein’s notoriously challenging Tender Buttons, one student raised her hand and asked—bravely, I thought—if Menand had any advice about how best to approach it. In response, he offered up the closest thing to a beatific smile I have ever seen on the face of a book critic. “With pleasure,” he replied.

I have read The Great Gatsby five times. The first was in high school; the second, in college. The third was in my mid-twenties, stuck in a remote bus depot in Peru with someone’s left-behind copy. The fourth was last month, in advance of seeing the new film adaptation; the fifth, last week. There are a small number of novels I return to again and again: Middlemarch, The Portrait of a Lady, Pride and Prejudice, maybe a half-dozen others. But Gatsby is in a class by itself. It is the only book I have read so often despite failing—in the face of real effort and sincere intentions—to derive almost any pleasure at all from the experience.

I know how I’m supposed to feel about Gatsby: that it is, in the words of the critic Jonathan Yardley, “the American masterwork.” Malcolm Cowley admired its “moral permanence.” T. S. Eliot called it “the first step that American fiction has taken since Henry James.” Lionel Trilling thought Fitzgerald had achieved in it “the ideal voice of the novelist.” That’s the received Gatsby: a linguistically elegant, intellectually bold, morally acute parable of our nation.

I am in thoroughgoing disagreement with all of this. I find Gatsby aesthetically overrated, psychologically vacant, and morally complacent; I think we kid ourselves about the lessons it contains. None of this would matter much to me if Gatsby were not also sacrosanct. Books being borderline irrelevant in America, one is generally free to dislike them—but not this book. So since we find ourselves, as we cyclically do here, in the middle of another massive Gatsby recrudescence, allow me to file a minority report.

The plot of The Great Gatsby, should you need a refresher, is easily told. Nick Carraway, an upstanding young man from the Midwest, moves to New York to seek his fortune in the bond business. He rents a cottage on Long Island, next to a mansion occupied by a man of mysterious origins but manifest wealth: Jay Gatsby, known far and wide for his extravagant parties. Gradually, we learn that Gatsby was born into poverty, and that everything he has acquired—fortune, mansion, entire persona—is designed to attract the attention of his first love: the beautiful Daisy, by chance Nick’s cousin. Daisy loved Gatsby but married Tom Buchanan, who is fabulously wealthy, fabulously unpleasant, and conducting an affair with a married working-class woman named Myrtle. Thanks to Nick, Gatsby and Daisy reunite, but she ultimately balks at the prospect of leaving Tom and, barreling back home in Gatsby’s car, kills Myrtle in a hit-and-run. Myrtle’s husband, believing that Gatsby was both the driver and her lover, tracks him to his mansion and shoots him. Finis, give or take some final reflections from Nick.

When this tale was published, in 1925, very few people aside from its author thought it was or would ever become an American classic. Unlike his first book—This Side of Paradise, which was hailed as the definitive novel of its era—The Great Gatsby emerged to mixed reviews and mediocre sales. Fewer than 24,000 copies were printed in Fitzgerald’s lifetime, and some were still sitting in a warehouse when he died, in 1940, at the age of 44. Five years later, the U.S. military distributed 150,000 copies to service members, and the book has never been out of print since. Untold millions of copies have sold, including 405,000 in the first three months of this year.

But sales figures don’t capture the contemporary Gatsby phenomenon. In recent years, the book has been reinvented as a much-admired experimental play (Gatz) and a Nintendo video game—“Grand Theft Auto, West Egg,” as the New York Times dubbed it. This Thursday, Stephen Colbert will host a Gatsby book club; the new movie opens Friday. (Read David Edelstein's review here.) If you need a place to take your date afterward and have $14,999 to spare, you can head to the Trump hotel, which is offering a glamorous “Great Gatsby Package”: three nights in a suite on Central Park West, a magnum of Champagne, cuff links and a tailored suit for men, and, “for the ladies, an Art Deco shagreen and onyx cuff, accompanied by a personal note from Ivanka Trump.” Car insurance is not included.

So Gatsby is on our minds, on our screens, on our credit cards, on top of the Amazon best-seller list. But even in quieter days, we never really forget Fitzgerald’s novel. It is, among other things, a pedagogical perennial, in part for obvious reasons. The book is short, easy to read, and full of low-hanging symbols, the most famous of which really do hang low over Long Island: the green light at the end of Daisy’s dock; the unblinking eyes of Dr. T. J. Eckleburg, that Jazz Age Dr. Zizmor. But the real appeal of the book, one assumes, is what it lets us teach young people about the political, moral, and social fabric of our nation. Which raises the question: To our students, and to ourselves, exactly what kind of Great Gatsby Package are we selling?

by Kathryn Schulz, Vulture |  Read more:
Image via: Wikipedia

Saska Pomeroy, Still Life With Cat, 2013.
via:

Jeremy Parnell, Big Chook (2005). Installed on Tamarama Beach in Sydney.
via:

Liz Brizzi. Mojave Sands, 2013
via:

The Rationalist Way of Death


Rationalists and secularists in the old plain style were very clear about death and dying, or at least they tried to be. “It’s just a nothing,” they would say: “the lights go out and then the curtain falls.” I won’t exist after I die, but then I didn’t exist before I was born, so what’s the big deal? It’s going to happen anyway, so just get over it. We are only forked animals after all, and when the time comes you should give my body to medical science, or burn it and use it as fertiliser; or why not eat it, if you’re hungry, or feed it to the pigs? And for goodness sake, don’t worry about how I died – whether peacefully or in pain – and don’t speculate about my last thoughts, my last sentiments or my last words. Why attach more importance to my dying moments than to any other part of my life? As for the business of seeing the body and saying goodbye, and the trouble and expense of coffins and flowers and funerals: what are they but relics of morbid superstitions that we should have got rid of centuries ago? So no fuss, please: the world belongs to youth and the future, not death and the past: go ahead and have a party if you must, with plenty to drink, but no speeches, nothing maudlin, no tears, nothing that might silence the laughter of children. And I beg you, no memorials of any kind: no stones, no plaques, no shrines, no park benches, no tree-plantings, no dedications: let the memory of who I was die with me.

In practice it has not always been so easy, and those of us who think of ourselves as CORPSES (Children of Rationalist Parents) may find ourselves seriously embarrassed when it comes to carrying out the wishes of our progenitors when they die. Bans on mourning and demands for oblivion are not going to have much effect when we are wracked with grief – when happiness is the last thing we want, when we find ourselves dwelling in remorse and remembrance and will not be comforted. Hence one of the most conspicuous elements in the transformation of rationalism in recent decades: the rise of a burgeoning service industry supplying secular celebrants for humanist funerals, to fill a ritualistic gap that earlier generations would not have wanted to acknowledge.

The decline of hardline rationalism about bereavement may be part of a global social trend towards blubbering sentimentality and public exhibitions of grief: Princess Diana and all that. But there could be something more serious behind it too: a suspicion that the no-nonsense approach to death advocated by pure-minded atheists bears a horrible resemblance to the attitudes that lie behind the great political crimes of the 20th century – Hiroshima and Nagasaki, the massified deaths of two world wars, the millions discarded as obstacles to progress in the Soviet Union and China, and of course the Nazi death camps.

If Holocaust stories are uniquely hard to bear, it is not because they describe suffering, death and humiliation on a bewildering scale, but because of the calculated impersonality and disinterested anonymity with which they were inflicted on their victims. (...)

As far as the old-style rationalists were concerned, any desire to ritualise death and remember the dead was a sign of a failure of nerve, and an inability to grow out of religious indoctrination – especially all that Christian stuff about personal survival, arraignment before a divine judge and consignment to heaven or hell. But in fact Christianity does not speak with one voice when it comes to death and dying. In the gospels of Matthew and Luke, Jesus issued a severe reprimand to a disciple who wanted to give his father a proper funeral: get back to work at once, he said, and “let the dead bury their dead.” The rebuke may seem like an enlightened anticipation of 20th-century rationalism, but it is also perfectly consistent with some main doctrines of Christianity: if the body is just a temporary home for an immortal soul, and a perpetual temptation to sin, then the sooner we shuffle it off the better.

The Egyptians, lacking the assurance of eternal life, had favoured mummification and entombment, at least for the ruling elite, while the Greeks and Romans preferred cremation and a good epitaph, and the Jews went in for speedy burials, usually in communal graves. But the Christians, with their confident expectation of a life after death, had no need for such pagan mumbo-jumbo.

by Jonathan Ree, Rationalist Association |  Read more:
Image: Jessica Chandler

Laptop U

When people refer to “higher education” in this country, they are talking about two systems. One is élite. It’s made up of selective schools that people can apply to—schools like Harvard, and also like U.C. Santa Cruz, Northeastern, Penn State, and Kenyon. All these institutions turn most applicants away, and all pursue a common, if vague, notion of what universities are meant to strive for. When colleges appear in movies, they are verdant, tree-draped quadrangles set amid Georgian or Gothic (or Georgian-Gothic) buildings. When brochures from these schools arrive in the mail, they often look the same. Chances are, you’ll find a Byronic young man reading “Cartesian Meditations” on a bench beneath an elm tree, or perhaps his romantic cousin, the New England boy of fall, a tousle-haired chap with a knapsack slung back on one shoulder. He is walking with a lovely, earnest young woman who apparently likes scarves, and probably Shelley. They are smiling. Everyone is smiling. The professors, who are wearing friendly, Rick Moranis-style glasses, smile, though they’re hard at work at a large table with an eager student, sharing a splayed book and gesturing as if weighing two big, wholesome orbs of fruit. Universities are special places, we believe: gardens where chosen people escape their normal lives to cultivate the Life of the Mind.

But that is not the kind of higher education most Americans know. The vast majority of people who get education beyond high school do so at community colleges and other regional and nonselective schools. Most who apply are accepted. The teachers there, not all of whom have doctorates or get research support, may seem restless and harried. Students may, too. Some attend school part time, juggling their academic work with family or full-time jobs, and so the dropout rate, and time-to-degree, runs higher than at élite institutions. Many campuses are funded on fumes, or are on thin ice with accreditation boards; there are few quadrangles involved. The coursework often prepares students for specific professions or required skills. If you want to be trained as a medical assistant, there is a track for that. If you want to learn to operate an infrared spectrometer, there is a course to show you how. This is the populist arm of higher education. It accounts for about eighty per cent of colleges in the United States.

It is also under extreme strain. In the mid-nineteen-sixties, two economists, William J. Baumol and William G. Bowen, diagnosed a “cost disease” in industries like education, and the theory continues to inform thinking about pressure in the system. Usually, as wages rise within an industry, productivity does, too. But a Harvard lecture hall still holds about the same number of students it held a century ago, and the usual means of increasing efficiency—implementing advances in technology, speeding the process up, doing more at once—haven’t seemed to apply when the goal is turning callow eighteen-year-olds into educated men and women. Although educators’ salaries have risen (more or less) in measure with the general economy over the past hundred years, their productivity hasn’t. The cost disease is thought to help explain why the price of education is on a rocket course, with no levelling in sight.

Bowen spent much of the seventies and eighties as the president of Princeton, after which he joined the Mellon Foundation. In a lecture series at Stanford last year, he argued that online education may provide a cure for the disease he diagnosed almost half a century ago. If overloaded institutions diverted their students to online education, it would reduce faculty, and associated expenses. Courses would become less jammed. Best of all, the élite and populist systems of higher education would finally begin to interlock gears and run as one: the best-endowed schools in the country could give something back to their nonexclusive cousins, streamlining their own teaching in the process. Struggling schools could use the online courses in their own programs, as San José State has, giving their students the benefit of a first-rate education. Everybody wins. At Harvard, I was told, repeatedly, “A rising tide lifts all boats.”

Does it, though? On the one hand, if schools like Harvard and Stanford become the Starbucks and Peet’s of higher education, offering sophisticated branded courses at the campus nearest you, bright students at all levels will have access. But very few of these students will ever have a chance to touch these distant shores. And touch, historically, has been a crucial part of élite education. At twenty, at Dartmouth, maybe, you’re sitting in a dormitory room at 1 a.m. sharing Chinese food with two kids wearing flip-flops and Target jeans; twenty-five years later, one of those kids is running a multibillion-dollar tech company and the other is chairing a Senate subcommittee. Access to “élite education” may be more about access to the élites than about access to the classroom teaching. Bill Clinton, a lower-middle-class kid out of Arkansas, might have received an equally distinguished education if he hadn’t gone to Georgetown, Oxford, and Yale, but he wouldn’t have been President.

Meanwhile, smaller institutions could be eclipsed, or reduced to dependencies of the standing powers. “As a country we are simply trying to support too many universities that are trying to be research institutions,” Stanford’s John Hennessy has argued. “Nationally we may not be able to afford as many research institutions going forward.” If élite universities were to carry the research burden of the whole system, less well-funded schools could be stripped down and streamlined. Instead of having to fuel a fleet of ships, you’d fuel the strongest ones, and let them tug the other boats along.

by Nathan Heller, New Yorker |  Read more:
Illustration by Leo Espinosa.

Jan Versnel - Graphic Designer, 1962
via:

Rebuilding the Shores, Increasing the Risks


This might be a good time to take a look at the most important environmental law that nobody has ever heard of.

The real estate industry fought that law bitterly in Congress, but lost, and it landed on Ronald Reagan’s desk in 1982. The president not only signed it, but did so with a rhetorical flourish, calling it a “triumph for natural resource conservation and federal fiscal responsibility.”

The law — the Coastal Barrier Resources Act — was intended to protect much of the American coastline, and it did so in a clever way that drew votes from the most conservative Republicans and the most liberal Democrats.

It is worth bringing up today because we are once again in an era when our coasts are at risk and our national coffers are strained. The $75 billion in damages from Hurricane Sandy, coming only seven years after the $80 billion from Hurricane Katrina, told us this much: We need a plan.

The climate is changing, the ocean is rising, more storms are coming, and millions of Americans are in harm’s way. The costs of making people whole after these storms are soaring. Without ideas that stand some chance of breaking the political gridlock in Washington, the situation will eventually become a national crisis.

The law that Reagan signed in 1982 might just offer a model of how to move forward. (...)

It should be obvious that the more people we move out of harm’s way in the reasonably near future, the better off we will ultimately be.

But we are doing the opposite, offering huge subsidies for coastal development. We proffer federally backed flood insurance at rates bearing no resemblance to the risks. Even more important, we go in after storms and write big checks so towns can put the roads, sewers and beach sand right back where they were.

We are, in other words, using the federal Treasury to shield people from the true risks that they are taking by building on the coasts. Coastal development has soared as a direct consequence, and this rush toward the sea is the biggest factor in the rising costs of storm bailouts.

So what was so clever about that 1982 law, and how can we learn from it?

Development pressure on the nation’s coasts was intense back then, but hundreds of miles of barrier islands and beaches were as yet unspoiled. Environmental groups would have loved a national ban on further coastal development, but conservatives would never have gone along with that.

Two Republicans, Senator John H. Chafee of Rhode Island and Representative Thomas B. Evans Jr. of Delaware, found the magic formula. Their bill simply declared that on sensitive coastlines that were then undeveloped, any future development would have to occur without federal subsidies.

In other words, no flood insurance and no fat checks after storms.

by Justin Gillis, NY Times |  Read more:
Image: Mario Tama/Getty Images

A North American's Guide to the Use and Abuse of the Modern PhD

You applied to the program, and you got in. Then you spent the next four, six, eight or more years stroking the capricious egos of professors, jockeying for position within your peer group and marking bad undergraduate essays for the minimum wage. You completed the research, the grant applications, the writing, the comprehensive exams, and finally the defence.

You got through it all, somehow – and now it's yours. You walked across the stage at a graduation ceremony, and an Important Person in a robe gave you the paper scroll that made it official. You are no longer a Mr. or a Ms. Now, you are a Doctor. You have a PhD.

A PhD isn't just something you've acquired; it's something you've become. It's part of who you are – and you're proud that you've transformed yourself in a way that's meaningful to you. Now that you can hold it in your hands, you feel you are someone special, and you want to tell the whole world.

But can you – or should you? And if so, how?

This is where it gets tricky. Indeed, knowing when it is professionally and socially acceptable to "use" your PhD – to call yourself Doctor, and to hope to be addressed as such in return – is a minefield where values, conventions and contexts intersect in fluid and intricate ways. And nowhere has the question ever been more perplexing than in North America today.

Ironically, this issue is often less troublesome in parts of Europe, Asia and Latin America. In many societies, scholarship and professional rank are highly respected – and terms of address are an art form, requiring subtlety and precision. It would be tantamount to an insult to fail to address any kind of doctor as Doctor.

But in North America – where traditions are discarded, hierarchies are flouted, and everything is supposed to be so much easier as a result – the rules surrounding the PhD designation are as clear as mud. Today's freshly minted scholars stand on shifting sands, and often have no idea when or where – or even if – it is acceptable to casually slip the title Dr. in front of their name.

Google "PhD etiquette" and you'll find a clutch of anxious academics who have turned to the internet for advice. Timidly yet earnestly they raise the issue in chat rooms and on bulletin boards, begging an opinion about the use of Doctor from anyone who cares to offer one. However, the responses are an unhelpful mishmash – ranging from you're fully entitled to it at all times to what kind of jerk would even ask such a question?

by Colin Eatock, 3 Quarks Daily |  Read more:
Image: uncredited

Sunday, May 12, 2013

Happy Mother's Day


I had coffee with awesome lady Nia Vardalos, writer and star of My Big Fat Greek Wedding. We talked about adoption and her new book, which she didn’t want to write: Instant Mom. I wrote about it here.
via:

Claire Malhoney
via:

source: unknown

Zuckerberg's FWD: Making Sure They Get It Right


Mark Zuckerberg built himself a political action committee called FWD.us, and it's diving headfirst into trying to change immigration policy as its first priority. It seems to have good goals, but it has already adopted some extremely polarizing tactics, so I've tried to collect my thoughts here, as informed by a roundtable conversation yesterday which included FWD.us President and co-founder Joe Green. Spoilers: I don't have a simple, easy "It sucks!" or "It's great!" conclusion about FWD.us, but hopefully I've put together enough perspective here to help inform the discussion, provide some specific areas of improvement for the PAC, and offer a useful starting point for the conversation within the tech community about how we'd like to be effective in driving policy, whether specifically about immigration or on any broader issue.

It's already clear that with FWD.us, the tech industry is going to have to reckon with exactly how real the realpolitik is going to get. If we're finally moving past our innocent, naive and idealistic lack of engagement with the actual dirty dealings of legislation, then let's try to figure out how to do it without losing our souls.

The Fundamentals

Mark Zuckerberg wrote an editorial in the Washington Post a few weeks ago announcing the launch of FWD.us, in concert with a list of prominent Silicon Valley supporters. (Post CEO/Chairman Donald Graham is on Facebook's board, hence the choice of platform.) Zuck started by listing top-tier tech execs like Reid Hoffman, Eric Schmidt and Marissa Mayer, moved on to VCs and investors who are well known within the industry, and concluded with former Facebookers Aditya Agarwal and Ruchi Sanghvi, who aren't big names in the industry but are actual immigrants, in contrast to most of the other backers. Shortly after launch, names like Bill Gates, Reed Hastings and Fred Wilson were added as they apparently became financial backers as well.

All those dollars are being spent to support an organization that's pretty small — half a dozen people in Silicon Valley and four people on the ground in DC. Adrian Chen's excellent look at FWD.us offers lots of good perspective on its functioning and funding, but this is an organization that seems to be built with a long-term mission in mind.

I've long wanted the tech industry to engage in a serious and effective way with the policy world. At the peak of the protests against SOPA and PIPA, my dream was that we might black out our sites in protest of torture as state policy rather than simply focusing on self-serving goals. And while we've thus far had limited avenues for participation such as the White House's innovative petition platform, we obviously haven't played in the serious realm of policy before, either with our attention and interest or with the greasing of palms that actually makes legislation happen in DC.

So if we've got a practical organization working on meaningful problems and that's what I've wanted the tech industry to do, why am I so concerned? Let's take a look.

by Anil Dash, Making Culture |  Read more:
Image: FWD.us

Saturday, May 11, 2013

Ronnie Earl and the Broadcasters


[ed. One of my all-time favorite acoustic guitar performances.]

Shaun O'Dell, We Do Not Advance Through Fixed Gradations, 2010
via:

Norman Rockwell: The Dugout (cover illustration for The Saturday Evening Post, September 4, 1948).
via: