Thursday, February 12, 2015


[ed. Quite a sunrise over Honolulu this morning. Photo: markk]

How One Stupid Tweet Blew Up Justine Sacco’s Life

As she made the long journey from New York to South Africa, to visit family during the holidays in 2013, Justine Sacco, 30 years old and the senior director of corporate communications at IAC, began tweeting acerbic little jokes about the indignities of travel. There was one about a fellow passenger on the flight from John F. Kennedy International Airport:

“ ‘Weird German Dude: You’re in First Class. It’s 2014. Get some deodorant.’ — Inner monologue as I inhale BO. Thank God for pharmaceuticals.”

Then, during her layover at Heathrow:

“Chilly — cucumber sandwiches — bad teeth. Back in London!”

And on Dec. 20, before the final leg of her trip to Cape Town:

“Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!”

She chuckled to herself as she pressed send on this last one, then wandered around Heathrow’s international terminal for half an hour, sporadically checking her phone. No one replied, which didn’t surprise her. She had only 170 Twitter followers.

Sacco boarded the plane. It was an 11-hour flight, so she slept. When the plane landed in Cape Town and was taxiing on the runway, she turned on her phone. Right away, she got a text from someone she hadn’t spoken to since high school: “I’m so sorry to see what’s happening.” Sacco looked at it, baffled.

Then another text: “You need to call me immediately.” It was from her best friend, Hannah. Then her phone exploded with more texts and alerts. And then it rang. It was Hannah. “You’re the No. 1 worldwide trend on Twitter right now,” she said.

Sacco’s Twitter feed had become a horror show. “In light of @JustineSacco disgusting racist tweet, I’m donating to @care today” and “How did @JustineSacco get a PR job?! Her level of racist ignorance belongs on Fox News. #AIDS can affect anyone!” and “I’m an IAC employee and I don’t want @JustineSacco doing any communications on our behalf ever again. Ever.” And then one from her employer, IAC, the corporate owner of The Daily Beast, OKCupid and Vimeo: “This is an outrageous, offensive comment. Employee in question currently unreachable on an intl flight.” The anger soon turned to excitement: “All I want for Christmas is to see @JustineSacco’s face when her plane lands and she checks her inbox/voicemail” and “Oh man, @JustineSacco is going to have the most painful phone-turning-on moment ever when her plane lands” and “We are about to watch this @JustineSacco bitch get fired. In REAL time. Before she even KNOWS she’s getting fired.”

The furor over Sacco’s tweet had become not just an ideological crusade against her perceived bigotry but also a form of idle entertainment. Her complete ignorance of her predicament for those 11 hours lent the episode both dramatic irony and a pleasing narrative arc. (...)

By the time Sacco had touched down, tens of thousands of angry tweets had been sent in response to her joke. Hannah, meanwhile, frantically deleted her friend’s tweet and her account — Sacco didn’t want to look — but it was far too late. “Sorry @JustineSacco,” wrote one Twitter user, “your tweet lives on forever.”

In the early days of Twitter, I was a keen shamer. When newspaper columnists made racist or homophobic statements, I joined the pile-on. Sometimes I led it. The journalist A. A. Gill once wrote a column about shooting a baboon on safari in Tanzania: “I’m told they can be tricky to shoot. They run up trees, hang on for grim life. They die hard, baboons. But not this one. A soft-nosed .357 blew his lungs out.” Gill did the deed because he “wanted to get a sense of what it might be like to kill someone, a stranger.”

I was among the first people to alert social media. (This was because Gill always gave my television documentaries bad reviews, so I tended to keep a vigilant eye on things he could be got for.) Within minutes, it was everywhere. Amid the hundreds of congratulatory messages I received, one stuck out: “Were you a bully at school?”

Still, in those early days, the collective fury felt righteous, powerful and effective. It felt as if hierarchies were being dismantled, as if justice were being democratized. As time passed, though, I watched these shame campaigns multiply, to the point that they targeted not just powerful institutions and public figures but really anyone perceived to have done something offensive. I also began to marvel at the disconnect between the severity of the crime and the gleeful savagery of the punishment. It almost felt as if shamings were now happening for their own sake, as if they were following a script.

Eventually I started to wonder about the recipients of our shamings, the real humans who were the virtual targets of these campaigns. So for the past two years, I’ve been interviewing individuals like Justine Sacco: everyday people pilloried brutally, most often for posting some poorly considered joke on social media. Whenever possible, I have met them in person, to truly grasp the emotional toll at the other end of our screens. The people I met were mostly unemployed, fired for their transgressions, and they seemed broken somehow — deeply confused and traumatized.

by Jon Ronson, NY Times |  Read more:
Image: Andrew B. Myers. Prop stylist: Sonia Rentsch

Anselm Kiefer (German, b. 1945), Geheimnis der Farne [Mystery of ferns], 2006
via:

Scorched Earth, 2200AD

I stare out the window from my tiny flat on the 300th floor, hermetically sealed in a soaring, climate-controlled high-rise, honeycombed with hundreds of dwellings just like mine, and survey the breathtaking vistas from my lofty perch more than half a mile above ground: the craftsman cottages with their well-tended lawns, the emerald green golf courses, the sun-washed aquamarine swimming pools and the multimillion-dollar mansions that hug the sweeping sands from Malibu to Palos Verdes. These images evoke feelings of deep nostalgia for a Los Angeles that doesn’t exist anymore, back in the halcyon days before my great-grandparents were born, when procreation wasn’t strictly regulated and billions of people roamed freely on Earth.

There are only about 500 million of us left, after the convulsive transformations caused by climate change severely diminished the planet’s carrying capacity, which is the maximum population size that the environment can sustain. Most of us now live in what the British scientist James Lovelock has called ‘lifeboats’ at the far reaches of the northern hemisphere, in places that were once Canada, China, Russia and the Scandinavian countries, where the climate remains marginally tolerable, shoehorned into cities created virtually overnight to accommodate millions of desperate refugees.

What I ‘see’ outside my window is an illusion, a soothing virtual imitation of a world that once was, summoned by impulses from my brain. Yet the harsh reality is unsettling. As far as the eye can see, what’s left of civilised society is sheathed in glass – the ribbons of highways ferrying the bullet trains that encircle megacities where millions cram into skyscrapers hundreds of stories high; the vast tracts of greenhouses covering chemically enhanced farms where fruits and vegetables are grown and livestock graze; and even the crowded subterranean villages artificially lit to mimic the experience of walking outside on a sunny, spring day. (...)

It seems like hubris to think we can somehow save ourselves through Lovelockian lifeboats strung across the landscape, given the extent of the damage some experts believe we will wreak. Climate models predict temperatures could rise by four degrees Celsius (7.2 degrees Fahrenheit) or more by the end of this century, a level that Kevin Anderson of the UK’s Tyndall Centre for Climate Change Research described as ‘incompatible with any reasonable characterisation of an organised, equitable and civilised global community’.

‘We will see temperatures higher than any known during human civilisation – temperatures that we are simply not adapted to,’ says Heidi Cullen, chief scientist for the nonprofit Climate Central in Princeton, and author of The Weather of the Future (2010). ‘With each passing year, our “new normal” is being locked in with the full impacts arriving towards the latter part of this century,’ she says. ‘It’s hard for us to imagine that large parts of the planet would be unlivable outdoors.’

An increase of seven degrees Fahrenheit would see mass migrations from some of the most humid places on Earth – the Amazon, parts of India, northern Australia. Rising sea levels of four feet or more and ferocious storms would flood coastal cities from Tokyo to Mumbai, and submerge low-lying areas such as Bangladesh and Florida, displacing millions. Earth’s most populated areas – that belt of land extending across central China, much of Europe, Africa, Australia, the US and Latin America – would be parched by this century’s end, drying up surface water and killing crops that hundreds of millions depend upon for survival. Nearly half the world’s population, almost 4 billion people, could be enduring severe water scarcity and starvation, numerous studies suggest.

Scorching heat waves and cataclysmic fires would spark food riots, famine and mass migrations of millions. An explosion in insects would trigger widespread outbreaks of typhus, cholera, yellow fever, dengue, malaria and a host of long-dormant or even novel pathogens, unleashing epidemics reminiscent of the Black Death, which killed as many as 200 million people in the 14th century. Once-teeming metropolises would become watery ghost towns: picture Manhattan, Tokyo, São Paulo underwater, reduced to sparsely populated colonies of hardy survivors who eke out vampire-like subterranean existences, emerging only at night when the temperatures dip into the low triple digits.

Worse yet, temperatures won’t conveniently stabilise at just seven degrees of warming – Earth’s climate won’t reach a new equilibrium for hundreds of years because of all the heat-trapping carbon dioxide that’s already been dumped into the environment. ‘We have only felt a fraction of the climate change from the gases already in the atmosphere,’ James Hansen, a leading climatologist and director of the Earth Institute at Columbia University, said recently. ‘Still more is in the pipeline because the climate system has enormous inertia and doesn’t change that quickly.’ The planet will continue to heat up, triggering feedback loops of runaway climate change, until we can kiss most of civilisation goodbye.

by Linda Marsa, Aeon |  Read more:
Image: Ed Freeman/Getty

Wednesday, February 11, 2015

Free Fall

It was close. With the full activation of the dark rhetorical skills he has honed over the past five years, Tiger Woods nearly changed the narrative of his ongoing saga again last week at Torrey Pines, despite what has been happening before our eyes.

It started Wednesday when, in explaining the alarming chunks and skulls and even shanks, Woods went back to old reliable: swing-change jargon. He spoke of “release patterns,” the search for “a consistent bottom,” and being “caught right dead in between” the teaching of deposed coach Sean Foley and new consultant Chris Como. All grounded in an audacious mantra, given the surreal and totally unprecedented nature of the shots in question: “I’ve been through it before.” Really?

The next day, after withdrawing on his 12th hole of the Farmers Insurance Open because of tightness in his back—his third WD in his last eight events—Woods used a hasty parking-lot press conference to invoke his other favorite fallback topic: injury. The ensuing pedantry regarding activated and deactivated glutes will probably have a longer shelf life than LeBron James’ “I’m taking my talents to South Beach.”

Then on Saturday, Notah Begay III shared on Golf Channel that he had exchanged texts with Woods, reporting that his friend did not consider his latest problems “a major concern.”

Thus did a 14-time major winner, still the most powerful man in golf, attempt to use his influence to deflect attention from what he doesn’t want others, and probably himself, to believe: that his game is on a cliff’s edge, teetering more toward retirement than resurgence.

Woods’ stubborn ability to stay unceasingly on message and concede nothing has long made him a difficult subject to present with any depth, and never more so than now. When he deems the topic positive, he gives little information. When he deems it negative, he gives none. It’s understandable for a relentlessly scrutinized athlete who wants to minimize the noise, but in the process, he basically dares journalists to call him a liar. Very few have gone there.

Woods’ PGA Tour peers are unwittingly in on the obfuscation. First, circling the wagons is part of the player code. Second, they know if they don’t, the man they least need as an enemy will be displeased. A parade of public euphemisms, led by Fred Couples’ dismissive “he’s fine,” ensues. Meanwhile, every close observer is wondering the same thing.

Is Tiger Woods done?

Even that question carries the caveat, of course, that professional golfers, who have the most ability to come back because of the nature of golf and the many years one can play, get the benefit of the doubt. The greater the champion, the greater the benefit.

But here’s the thing. At Torrey, as it had been at the Waste Management Phoenix Open the previous week and at the Hero World Challenge in December, what transpired was simply too graphic not to trust our eyes. The emperor has no clothes.

Sure, pros fall into chipping slumps, which are getting more prevalent and prolonged as agronomy allows fairway lies to get ever tighter. But it’s extremely rare for a touring pro to hit shots as badly, as often, as Woods has, the cumulative effect more alarming than the most wildly sprayed drives of Seve Ballesteros, Ian Baker-Finch and David Duval combined.

Plenty of evidence defined the state of affairs at Torrey. Couples, not even in the event, took the unusual step of dropping in to watch and, seemingly, casually interact with Woods for his nine-hole Wednesday practice round, an emotionally intelligent elder in crisis-intervention mode. Fellow players gathered around a more sociable Woods on the practice tee, kibitzing easily until a horrifying Woods skull or shank left them speechless and facially frozen. They also saw Woods, on the crowded practice putting green, drop three balls to try some short shots, only to pick them up after a third went speeding past its target.

Come Thursday’s first round, there were the low murmurs from Woods’ stunned gallery, and the awkward body language of playing partners Rickie Fowler and Billy Horschel, who, like Jordan Spieth and Patrick Reed at Phoenix, were forced at short range to witness an icon slowly stripped of his aura. On Woods’ first hole, when he bellied an otherwise simple chip, Nick Faldo waited a beat before intoning on Golf Channel, “That’s quite frightening.”

So what is really going on here?

Well, I have a theory, admittedly speculative and uncomfortable for many, who would rather—almost as much as Woods—not go there. But it’s based on the relevant history of a historic figure, the only thing that seems proportional in scale to what has ensued: The scandal that changed Woods’ life after Thanksgiving of 2009.

by Jaime Diaz, Golf Digest |  Read more:
Image: Getty

Destination Whatever: Touring the Cruise Industry of the Caribbean

With open, smiling mouths and wide, fixated eyes, a group of racially diverse “wholesome” families is featured against the new slogan of Royal Caribbean: “Our ships are designed to WOW.” Notable here is the emphasis on the ships’ design, rather than on any particular destination. As ships become more extreme in size, scale, and amenities, the travel experience is designed to make the floating vacation a familiar and comfortable one. The process of interiorizing hundreds of atmospheres into one floating mega-container has much to do with design, engineering, and management—but mostly economics.

And the effects of the current generation of cruise tourism in the Caribbean are only beginning to unfold. Adrenaline Beach, Barefoot Beach Club, Dragon’s Plaza. Entering Columbus Cove, Freedom of the Seas sails into Royal Caribbean’s Buccaneer’s Bay. The bay is flanked by recreational attractions including the world’s longest zip line, Dragon’s Breath, and the Dragon’s Tail Coaster atop Santa Maria Mountain. The only landmark is the 19th-century citadel sitting above the horizon—3,000 feet above sea level. The fortress, the Citadelle Laferrière, was built by Henri Christophe, a key leader of the first black slave rebellion, which led Haiti to independence from the French in 1804.

The boat, Royal Caribbean’s flagship, docks on the northwestern coast of Hispaniola Island, a territory called Labadee®—a registered trademark. Royal Caribbean leased the peninsula from the Haitian government on a 99-year contract. According to Royal Caribbean’s Port Explorer & Shopping Guide, the leased land is “strengthening the cooperative effort between the government of Haiti and RCCL® […] and has been solidified by extensive on-site development through the company’s investment of tens of millions of dollars.” Testifying to the economic humanitarianism of the deal, the guide mentions that “it is a clear vote of confidence in the people, nation, and future of the country as our guests continue to have the exclusive opportunity to enjoy a relaxing fun-filled day in the clear blue waters of Haiti’s northern coast. The sailors who joined Christopher Columbus and first came ashore here centuries ago obviously knew a good thing when they saw it.”

Since the 1980s, RCCL® has held exclusive rights to docking at the once small fishing village and coastal town of Labadie, named for the first French settler in the late 17th century. Anglicized, Labadee® “was specifically designed and built to provide guests with a variety of opportunities to have fun in the sun.” As the guide claims, Royal Caribbean “is honored and proud to be a pioneering partner with a people and country which has such a rich heritage and the tremendous potential to become one of the Caribbean’s premier tour destinations.”

These private destinations—compounds really—have become the new ports of call for big cruise ships. Today, nearly 10 private islands are owned by eight major cruise operators in the Caribbean. By buying or leasing islands, or anchoring at unregulated, deserted stretches of beach, cruise operators can reduce the number of days in official ports and divert the expenditures of travelers to locations under control of the industry. Piers, once perceived as extensions of land that connected ship to destination, have today become extensions of ships. Usually fenced or cordoned off, these extensions are fictional territory: they guide travelers away from local neighborhoods and people, toward areas owned and scripted by cruise companies. They are programmed to deliver isolated, worry-free experiences, supplementing those offered by the ship itself—pristine water, white sand, Caribbean beaches. Off-site excursions are limited to day-trips to tourist-friendly destinations—a colonial city or pre-Hispanic archaeological ruin.

Bigger Boats, Bigger Piers

As ships grow, they become destinations in themselves, ultimately devaluing the role of land-based destinations. Since the already impressive 961-foot, 2,200-passenger Queen Elizabeth of the late 1960s, cruising vessels have grown dramatically in size and scale, with passenger capacity nearly tripling to a whopping 6,300 (Royal Caribbean’s Allure of the Seas). The increase in size of cruise ships, in width as much as in height, enables considerable spatial and programmatic complexity—streets, entertainment spaces, dining rooms, bars, pools, stores, water parks—with nonstop events that keep passengers entertained, day and night. Ships are designed as small, floating cities. Like an urban theme park, the ships include a multiplicity of landscape decks, terraces, surfing pools, and running paths that consume all available space on the ship’s roof deck.

Although the layout of vessels is organized around a double-loaded corridor down the middle—the “main street,” which provides each room with exterior views and direct access to amenities—it is the central kitchen that forms ships’ cores and ensures their functionality. Precisely engineered and impeccably designed, ship kitchens provide food tailored to a wide range of dining experiences, from luxurious, romantic dinners to basic midday snacks. Upwards of 15,000 meals can be served in one day, delivered through 30 different menus. Onboard infrastructure is vital: desalination plants for drinking water, crushing and compacting systems for recyclables, dehydrators and incinerators for food waste, with leftover ash disposed of offshore.

by Martin Delgado, Zuzanna Koltowska, Félix Madrazo & Sofia Saavedra, Harvard Design |  Read more:
Image: Carlos Weeber

The Future of an Illusion

Has there ever been a medical specialty as beleaguered as psychiatry? Since the profession’s founding in 1844, the doctors of the soul have had to contend with suspicions that they do not know what mental illness is, what type their patients might have, or what they should do about it—in other words, that they are doctors who do not practice real medicine. Some of the worry comes from the psychiatrists themselves, such as Pliny Earle, who in 1886 complained that “in the present state of our knowledge, no classification of insanity can be erected upon a pathological basis.” In 1917, psychiatrist Thomas Salmon lamented that the classification of diseases was still “chaotic”—a “condition of affairs [that] discredits the science of psychiatry and reflects unfavorably upon our association,” and that left the profession unable to meet “the scientific demands of the present day.” In 1973, the American Psychiatric Association voted to declare that homosexuality was no longer a mental illness, a determination that, however just, couldn’t possibly be construed as scientific. And for the six years leading up to the 2013 release of the fifth edition of its diagnostic manual, the DSM-5, the APA debated loudly and in public such questions as whether Asperger’s disorder was a distinct mental illness and whether people still upset two weeks after the death of a loved one could be diagnosed with major depression. (The official conclusions, respectively: no and yes.)

To the diagnostic chaos was added the spectacle of treatments. Psychiatrists superintended horrifyingly squalid asylums; used insulin and electricity to send patients into comas and convulsions; inoculated them with tuberculin and malaria in the hope that fever would cook the mental illness out of them; jammed ice picks into their brains to sever their frontal lobes; placed them in orgone boxes to bathe in the orgasmic energy of the universe; psychoanalyzed them interminably; primal-screamed them and rebirthed them and nursed their inner children; and subjected them to medications of unknown mechanism and unanticipated side effects, most recently the antidepressant drugs that we love to hate and hate to love and that, either way, are a daily staple for 11 percent of adults in America.

It’s not just diagnostic uncertainty or therapeutic disasters that cast suspicion on the profession. It’s also the bred-in-the-bone American conviction that no one should tell us who we are. For that is what psychiatrists (and the rest of us in the mental-health professions) do, no matter whether we want to or not. To say you know what mental health and illness are is to say you know how life should go, and what we should do when it goes otherwise. You’d better know what to do when you’ve made a grievous error in those weighty matters, or at the very least, how to ask for forgiveness. And you’d better hope that, apologies offered, you can give the public a reason to believe that at long last you know what you are doing.

This is the unenviable task that Jeffrey Lieberman, past president of the APA, chairman of psychiatry at Columbia University’s medical school, chief of psychiatry at its hospital, and director of the New York State Psychiatric Institute, has taken on in his book Shrinks: The Untold Story of Psychiatry. “Psychiatry has earned its stigma,” he writes at the outset, and its practitioners must “own up to our long history of mistakes.” Otherwise it will remain “the black sheep of the medical family, scorned by physicians and patients alike.”

In Lieberman’s history, most of the profession’s travails can be traced to the mischief caused by one man: the Viennese neurologist who, on arriving for his first (and only) visit to America, said, “They don’t realize that we are bringing them the plague.” That, at any rate, is what, according to legend, Sigmund Freud said to Carl Jung as their ship pulled into New York harbor in 1909. Lieberman agrees wholeheartedly that Freud unleashed a plague. The pathogen was not, Lieberman says, the self-doubt and pessimism for which Freud is justly famous, but his autocratic approach to his patients and his insistence that his disciples remain in lockstep. Worst of all, says Lieberman, Freud “blurred the boundary between mental illness and mental health” by maintaining that conflict among the various agencies of the mind, set off by early childhood experience, was unavoidable.

In the early twentieth century, according to Lieberman, the members of the APA weren’t interested in Freud. American psychoanalysts, however, were interested in the APA. The analysts’ campaign for recognition eventually succeeded. Lieberman argues that this was largely because psychoanalysis offered psychiatrists “a way out of the asylum” and into cushy private practices ministering to the well-heeled “worried well.” Having convinced doctors (and patients) that we were all at least a little neurotic, Freud had opened the way to travesties like the pathologizing of homosexuality and endless and ineffective stays on the analytical couch.

by Gary Greenberg, Bookforum |  Read more:
Image: Sigmund Freud’s office; photo uncredited

Friday, February 6, 2015


[ed. Another short break while we try to stay ahead of the authorities. Enjoy the archives.]

Pyke Koch, The Harvest. 1953
via:

Why Doctors Die Differently

Years ago, Charlie, a highly respected orthopedist and a mentor of mine, found a lump in his stomach. It was diagnosed as pancreatic cancer by one of the best surgeons in the country, who had developed a procedure that could triple a patient's five-year-survival odds—from 5% to 15%—albeit with a poor quality of life.

Charlie, 68 years old, was uninterested. He went home the next day, closed his practice and never set foot in a hospital again. He focused on spending time with his family. Several months later, he died at home. He got no chemotherapy, radiation or surgical treatment. Medicare didn't spend much on him.

It's not something that we like to talk about, but doctors die, too. What's unusual about them is not how much treatment they get compared with most Americans, but how little. They know exactly what is going to happen, they know the choices, and they generally have access to any sort of medical care that they could want. But they tend to go serenely and gently.

Doctors don't want to die any more than anyone else does. But they usually have talked about the limits of modern medicine with their families. They want to make sure that, when the time comes, no heroic measures are taken. During their last moments, they know, for instance, that they don't want someone breaking their ribs by performing cardiopulmonary resuscitation (which is what happens when CPR is done right).

In a 2003 article, Joseph J. Gallo and others looked at what physicians want when it comes to end-of-life decisions. In a survey of 765 doctors, they found that 64% had created an advance directive—specifying what steps should and should not be taken to save their lives should they become incapacitated. That compares to only about 20% for the general public. (As one might expect, older doctors are more likely than younger doctors to have made "arrangements," as shown in a study by Paula Lester and others.)

Why such a large gap between the decisions of doctors and patients? The case of CPR is instructive. A study by Susan Diem and others of how CPR is portrayed on TV found that it was successful in 75% of the cases and that 67% of the TV patients went home. In reality, a 2010 study of more than 95,000 cases of CPR found that only 8% of patients survived for more than one month. Of these, only about 3% could lead a mostly normal life.

by Ken Murray, WSJ |  Read more:
Image: Arthur Giron

Creator or Buyer: Who Really Owns Art?

When we purchase an item, whether it’s a blender, a car, or a really cool toboggan for snowmageddon races, we own what we bought and can modify it to our heart’s content. Buying an artistic work, on the other hand, results in joint ownership, with some rights going to the buyer while others are retained by the work’s creator. Whether the purchase is an original oil painting or a corporate logo, ownership rights are not the same as owning a toboggan, even if it is handmade from ancient oak found in the forests of Valhalla.

As you can imagine, many lawsuits are fought over ownership rights for artistic works and other intellectual property, many of which would not have happened had the parties known the basic rules surrounding IP ownership. Although every case is unique and requires a thorough analysis (that’s why we have lawyers, after all), looking at a few hypothetical scenarios should help us map out some of the boundaries of ownership rights when purchasing visual art. Imagine the following situations:

  • A wealthy executive purchases an oil painting from a living artist to be the centerpiece of his private library. After hanging the work, he feels he may have made a mistake in purchasing the painting, but thinks that if he cuts it into three smaller pieces, it might look better in the room.
  • After some negative reaction to his idea, the executive instead decides it would be better just to sell it and consigns it to a reputable gallery for the sale.
  • Before the gallery takes possession of the oil painting, a major fashion magazine rents the executive’s home for a photo shoot. The photographer uses the private library as the main setting and the oil painting is shown in the background of several photos, which the magazine publishes in its next issue.
  • Impressed with the photographer’s work, the executive commissions her to shoot his home. A couple of years later, the executive puts the home on the market and gives the photos to his real estate agent to use in the listing. The agent’s brokerage posts the photos on its site and also uploads them to a Multiple Listing Service.
  • The brokerage is in the midst of a branding redesign including a new website. The company hires several freelancers to create the new designs, including some amazing drawings of streets in the area, which the company uses, along with photos of the executive’s home on its homepage.
  • The brokerage also makes large posters of the drawings that it sells on its website.
The purchaser in each of these scenarios may be infringing on the rights of the artist or creator. Let’s look at each scenario and see what the purchaser may have done wrong and whether there are any defenses to get them out of trouble.

by Steve Schlackman, Art Law Journal | Read more:
Image: uncredited

Wednesday, February 4, 2015

What's Another Word For "Misremembering"?

[ed. Conflating? Hmm no... I don't think that's the one.]

NBC Nightly News anchor Brian Williams admitted Wednesday he was not aboard a helicopter hit and forced down by RPG fire during the invasion of Iraq in 2003, a false claim that has been repeated by the network for years.

Williams repeated the claim Friday during NBC’s coverage of a public tribute at a New York Rangers hockey game for a retired soldier who had provided ground security for the grounded helicopters, a game to which Williams accompanied him. In an interview with Stars and Stripes, he said he had misremembered the events and was sorry.

The admission came after crew members on the 159th Aviation Regiment’s Chinook that was hit by two rockets and small arms fire told Stars and Stripes that the NBC anchor was nowhere near that aircraft or two other Chinooks flying in the formation that took fire. Williams arrived in the area about an hour later on another helicopter after the other three had made an emergency landing, the crew members said.

“I would not have chosen to make this mistake,” Williams said. “I don’t know what screwed up in my mind that caused me to conflate one aircraft with another.”

Williams told his Nightly News audience that the erroneous claim was part of a "bungled attempt" to thank soldiers who helped protect him in Iraq in 2003. “I made a mistake in recalling the events of 12 years ago,” Williams said. “I want to apologize.”

Late Wednesday, Williams’ Twitter account, with 212,000 followers, appeared to have been wiped clean.

Williams made the claim about the incident while presenting NBC coverage of the tribute to the retired command sergeant major at the Rangers game Friday. Fans gave the soldier a standing ovation.

“The story actually started with a terrible moment a dozen years back during the invasion of Iraq when the helicopter we were traveling in was forced down after being hit by an RPG,” Williams said on the broadcast. “Our traveling NBC News team was rescued, surrounded and kept alive by an armor mechanized platoon from the U.S. Army 3rd Infantry.”

Williams and his camera crew were actually aboard a Chinook in a formation that was about an hour behind the three helicopters that came under fire, according to crew member interviews.

That Chinook took no fire and landed later beside the damaged helicopter due to an impending sandstorm from the Iraqi desert, according to Sgt. 1st Class Joseph Miller, who was the flight engineer on the aircraft that carried the journalists.

“No, we never came under direct enemy fire to the aircraft,” he said Wednesday.

The helicopters, along with the NBC crew, remained on the ground at a forward operating base west of Baghdad for two or three days, where they were surrounded by an Army unit with Bradley fighting vehicles and Abrams M-1 tanks. (...)

“It was something personal for us that was kind of life-changing for me. I know how lucky I was to survive it,” said Lance Reynolds, who was the flight engineer. “It felt like a personal experience that someone else wanted to participate in and didn’t deserve to participate in.”

by Travis J. Tritten, Stars and Stripes | Read more:
Image: Stars and Stripes

Don’t Be Like That

Does black culture need to be reformed?

It was just after eight o’clock on a November night when Robert McCulloch, the prosecuting attorney for St. Louis County, announced that a grand jury would not be returning an indictment in the police killing of Michael Brown, who was eighteen, unarmed, and African-American. About an hour later and eight hundred miles away, President Obama delivered a short and sober speech designed to function as an anti-inflammatory. He praised police officers while urging them to “show care and restraint” when confronting protesters. He said that “communities of color” had “real issues” with law enforcement, but reminded disappointed Missourians that Brown’s mother and father had asked for peace. “Michael Brown’s parents have lost more than anyone,” he said. “We should be honoring their wishes.”

Even as he mentioned Brown’s parents, Obama was careful not to invoke Brown himself, who had become a polarizing figure. To the protesters who chanted, “Hands up! Don’t shoot!,” Brown was a symbol of the young African-American man as victim—the chant referred to the claim that Brown was surrendering, with his hands up, when he was killed. Critics of the protest movement were more likely to bring up the video, taken in the fifteen minutes before Brown’s death, that appeared to show him stealing cigarillos from a convenience store and then shoving and intimidating the worker who tried to stop him—the victim was also, it seemed, a perpetrator.

After the Times described Brown as “no angel,” the MSNBC host Melissa Harris-Perry accused the newspaper of “victim-blaming,” arguing that African-Americans, no matter how “angelic,” will never be safe from “those who see their very skin as a sin.” But, on the National Review Web site, Heather Mac Donald quoted an anonymous black corporate executive who told her, “Michael Brown may have been shot by the cop, but he was killed by parents and a community that produced such a thug.” And so the Michael Brown debate became a proxy for our ongoing argument about race: where some seek to expose what America is doing to black communities, others insist that the real problem is what black communities are doing to themselves.

Sociologists who study black America have a name for these camps: those who emphasize the role of institutional racism and economic circumstances are known as structuralists, while those who emphasize the importance of self-perpetuating norms and behaviors are known as culturalists. Mainstream politicians are culturalists by nature, because in America you seldom lose an election by talking up the virtues of hard work and good conduct. But in many sociology departments structuralism holds sway—no one who studies African-American communities wants to be accused, as the Times was, of “victim-blaming.” Orlando Patterson, a Jamaica-born sociologist at Harvard with an appetite for intellectual combat, wants to redeem the culturalist tradition, thereby redeeming sociology itself. In a manifesto published in December, in the Chronicle of Higher Education, he argued that “fearful” sociologists had abandoned “studies of the cultural dimensions of poverty, particularly black poverty,” and that the discipline had become “largely irrelevant.” Now Patterson and Ethan Fosse, a Harvard doctoral student in sociology, are publishing an ambitious new anthology called “The Cultural Matrix: Understanding Black Youth” (Harvard), which is meant to show that the culturalist tradition still has something to teach us.

The book arrives on the fiftieth anniversary of its most important predecessor: a slim government report written by an Assistant Secretary of Labor and first printed in an edition of a hundred. The author was Daniel Patrick Moynihan, and the title was “The Negro Family: The Case for National Action.” At first, the historian James T. Patterson has written, only one copy was allowed to circulate; the other ninety-nine were locked in a vault. Moynihan’s report cited sociologists and government surveys to underscore a message meant to startle: the Negro community was doing badly, and its condition was probably “getting worse, not better.” Moynihan, who was trained in sociology, judged that “most Negro youth are in danger of being caught up in the tangle of pathology that affects their world, and probably a majority are so entrapped.” He returned again and again to his main theme, “the deterioration of the Negro family,” which he considered “the fundamental source of the weakness of the Negro community”; he included a chart showing the rising proportion of nonwhite births in America that were “illegitimate.” (The report used the terms “Negro” and “nonwhite” interchangeably.) And, at the end, Moynihan called—briefly, and vaguely—for a national program to “strengthen the Negro family.”

The 1965 report was leaked to the press, inspiring a series of lurid articles, and later that year the Johnson Administration released the entire document, making it available for forty-five cents. Moynihan found some allies, including Martin Luther King, Jr. In a speech in October, King referred to an unnamed “recent study” showing that “the Negro family in the urban ghettos is crumbling and disintegrating.” But King also worried that some people might attribute this “social catastrophe” to “innate Negro weaknesses,” and that discussions of it could be “used to justify neglect and rationalize oppression.” Many sociologists were harsher. Andrew Billingsley argued that in assessing the problems caused by dysfunctional black families Moynihan had mistaken the symptom for the sickness. “The family is a creature of society,” he wrote. “And the greatest problems facing black families are problems which emanate from the white racist, militarist, materialistic society which places higher priority on putting white men on the moon than putting black men on their feet on this earth.” This debate had influence far beyond sociological journals: when Harris-Perry accused the Times of “victim-blaming,” she was using a term coined by the psychologist William Ryan, in a book-length rebuttal to the Moynihan report, “Blaming the Victim.”

Orlando Patterson thinks that, half a century later, it’s easier to appreciate all that Moynihan got right. “History has been kind to Moynihan,” he and Fosse write, which might be another way of saying that history has not been particularly kind to the people Moynihan wrote about—some of his dire predictions no longer seem so outlandish. Moynihan despaired that the illegitimacy rate for Negro babies was approaching twenty-five per cent. According to the Centers for Disease Control and Prevention, the equivalent rate in 2013 was 71.5 per cent. (The rate for non-Hispanic white babies was 29.3 per cent.) Even so, Patterson and the other contributors avoid pronouncing upon “ghetto culture” or “the culture of poverty,” or even “black culture.” Instead, the authors see shifting patterns of belief and behavior that may nevertheless combine to make certain families less stable, or certain young people less employable. The hope is that, by paying close attention to culture, sociologists will be better equipped to identify these patterns, and help change them.

by Kelefa Sanneh, New Yorker |  Read more:
Image: Tony Rodriguez

Bill Nunn as Radio Raheem in Do The Right Thing
via:

Tuesday, February 3, 2015

Why Beef Jerky Is So Popular

Beef jerky is big business, and Hershey wants in.

The international chocolate giant announced on Thursday that it has agreed to purchase upscale meat snack maker Krave for as much as $300 million. The deal is unprecedented for Hershey, because it marks the first time Hershey has purchased a company that doesn't sell candy, chocolate, or other sweets.

On the surface, it's fairly easy to see why Hershey, let alone any company interested in a half decent investment, would want to acquire Krave. The high-end snack company, which prides itself on its lineup of beef jerky offerings with no artificial ingredients, closed 2014 with $36 million in sales after only five years in business. The jerky's colorful packaging, which encloses a variety of flavors, will soon be found at Starbucks restaurants across the country. And it is anticipated to continue growing—and fast. Jon Sebastiani, the company's founder, said he expects Krave to more than double its business next year.

"Krave jerky is a great fit to our portfolio," Michele Buck, president of Hershey's North America, said in a press release.

But acquiring Krave offers Hershey more than merely the opportunity to share in the company's success. It also, and perhaps more importantly, ushers the chocolate giant into one of the savory snack world's most promising foodstuffs: jerky.

The market for jerky has ballooned into a nearly $1.5 billion industry in the United States. Sales are up by 13 percent since 2013, and by 46 percent since 2009, according to data from market research firm IRI. Jack Link's, the largest jerky maker in America, now sells more than $1 billion in meat snacks each year.

The demand for dried meat has risen to such heights that it now dwarfs that of other once comparable snacks. A recent report by market research firm Euromonitor found that jerky outsells seeds, party mixes, and pita chips—combined.

The rise of dried meat is in part the result of a general uptick in snacking among Americans. The U.S. snacks business, which includes not only jerky, but also chips, bars, nuts, and other fare, is now a $120 billion behemoth. Pepsi now relies on snacks—not soda—for growth. And it's easy to see why. A recent survey by Nielsen found that one in ten people in this country say they eat snacks instead of meals, a number which the research company expects will increase.

But jerky's popularity also owes a great deal to this country's obsession with protein. More than half of Americans say they want more of it in their diet, and they have proven that the talk isn't cheap. The protein shake business has become a behemoth. So too has the protein bar market, which was already worth more than $500 million in 2013. Sales of health and wellness bars, which often dangle high protein content, are growing more than twice as fast as the overall food industry.

Beef jerky, which is high in protein, low in calories, highly portable, and long-lasting, has benefited greatly from its ability to double as both a practical and healthy snack.

by Roberto A. Ferdman, Washington Post | Read more:
Image: Mike Blake/Reuters