Wednesday, August 27, 2014

What Kind of Father Am I?

[ed. Repost]

One evening—not long after my family moved to the old country farmhouse where my wife and I have lived for 45 years—our youngest son (my namesake, Jim, then three-year-old Jimmy) came into the woodshed, while I was there putting away some tools. “Look,” he said proudly, cradling in his arms the largest rat I had ever seen.

Instinctively, in what no doubt would be a genetic response of any parent, I tried to grab the rat from his arms before it bit him; but, as I reached toward it, the rat tightened its body, menacing me with its sharp teeth. At once, I stepped back: that, too, was an instinctive response, though rational thought immediately followed it. Was the rat rabid? Whether that was so or not, it was clear that the rat trusted Jimmy but not me, and yet it might bite both of us if I threatened it further.

“Where did you find it?” I asked my son.

“In the barn.”

“Which barn? The one with all the hay?”

“Yes.”

“It was just lying there, on the hay?”

“Yes, and he likes me.”

“I can see that it does.”

With the possible exception of the difference in our use of pronouns (which just now came to me without conscious intent; could it have risen from some submerged level of my memory?), that little dialogue isn’t an exact transcription—not only because it happened decades ago, but because while I was talking, my mind was elsewhere. I was looking at the garden tools I’d just returned to the wall behind Jimmy, thinking I might ask him to put the rat on the floor so that I could kill it with a whack of a shovel or some other implement. But my son trusted me, just as the rat apparently trusted him; and what kind of traumatic shock would I be visiting upon Jimmy if I smashed the skull of an animal he considered his friend?

The woodshed is in a wing of the house connected to the kitchen, where my wife, Jean, had been preparing dinner. She surprised me by coming quietly to my side; apparently she had overheard our conversation through the screen door and now was offering a solution to the dilemma. She said, “We need to find something to put your pet in, Jimmy.”

“A box,” I said. “Just keep holding it while I find one.” For I remembered at that moment a stout box I had seen while rummaging among all the agricultural items that had collected over the years in the carriage barn across the road—items that fell into disuse after the fields had been cleared, the house and barns constructed, and finally after tractors and cars had replaced horses. Amid the jumble of old harnesses, horse-drawn plow parts, scythes, and two-man saws was a small oblong box that might have contained dynamite fuses or explosives for removing stumps. It had been sawed and sanded from a plank about two inches thick. Like the house itself, it was made of wood far more durable than anything available since the virgin forests were harvested, and all of its edges were covered in metal. Though I felt guilty for leaving Jimmy and Jean with the rat, I was glad to have remembered the box I had admired for its craftsmanship, and I ran in search of it. For the longest time, I couldn’t find it and thought (as I often did later, whenever I found myself unable to resolve a crisis besetting one of our adolescent sons), What kind of father am I? I was close to panic before I finally found the box, more valuable to me at that moment than our recently purchased Greek-revival farmhouse—the kind of family home I’d long dreamed of owning.

A film of these events still runs through my mind, but I will summarize the rest of it here. Jimmy was initially the director of this movie, with Jean and me the actors obedient to his command: that is to say, he obstinately refused to put the rat into the box until a suitable bed was made for it—old rags wouldn’t do, for it had to be as soft as his favorite blanket. The rat gave him his authority, for it trusted Jean no more than it trusted me; it remained unperturbed in his embrace for a few minutes more, while Jean searched for and then cut several sections from a tattered blanket. Our son was satisfied with that bed, and the rat—whose trust in a three-year-old seemed infinite—seemed equally pleased, permitting Jimmy to place it on the soft strips. As soon as we put the lid on the box, I called the county health department, only to be told that the office had closed; I was to take in the rat first thing in the morning so that its brain could be dissected.

In response to Jean’s immediate question, “Did the rat bite you?” Jimmy said, “No, he kissed me.” Could any parent have believed an answer like that? My response was simply to put the box outside. Before giving our son a bath, we scrutinized every part of his body, finding no scratches anywhere on it. During the night the rat gnawed a hole through the wood, and by dawn it had disappeared.

Forty-odd years ago, rabies vaccination involved a lengthy series of shots, each of them painful, and occasionally the process itself was fatal. Neither the health department nor our pediatrician would tell us what to do. Once again we searched Jimmy’s body for the slightest scratch and again found nothing; so we decided to withhold the vaccination—though Jean and I slept poorly for several nights. Long after it had become apparent that our son had not contracted a fatal disease, I kept thinking—as I again do, in remembering the event—of the errors I had made, of what I should have done instead, of how helpless I had felt following my discovery that the rat had escaped.

While reading a recent biography of William James by Robert D. Richardson Jr., I found myself recalling those suspenseful and seemingly never-ending hours. As Richardson demonstrates, James was aware of the extent to which circumstance and random events (like the one that led my young son to a particular rat so long ago) can alter the course of history as well as the lives of individuals, making the future unpredictable. James, like my favorite writer, Chekhov, was trained as a medical doctor and became an author—though not of stories and plays (his younger brother Henry was the fiction writer) but of books and articles on philosophical, psychological, and spiritual matters. One of the founders of American pragmatism, James rejected European reliance on Platonic absolutes or on religious and philosophical doctrines that declared the historical necessity of certain future events. Despite his realization that much lies beyond our present and future control, James still believed in the independence of individual will, a view essential to the long-lasting but often precarious freedom underlying our democratic system.

by James McConkey, American Scholar |  Read more:
Image: via:

Why Is Bumbershoot Better This Year Than Previous Years?

[ed. Personally, I wouldn't go to a rock festival again if you paid me. But I understand everyone has a different burnout point. See also: A Rational Conversation: Do We Really Need A Rock Festival?]

Something feels different about Bumbershoot this year. In the weeks after One Reel announced the lineup for this summer's festival, artists, musicians, critics, and friends began saying something I hadn't heard in years: "Wow, this year's Bumbershoot looks amazing." (...)

Anecdotally, it feels like a better spread and a break from Bumbershoots past that seemed to spend a huge amount of money on superstars like Bob Dylan and leave the rest of the acts in relative neglect. I'm sure the folks at One Reel would take issue with any implication that they weren't working their asses off every year, but the public perception was that it felt less like an integrated music and culture festival and more like a Tacoma Dome gig with a few ragtag bands invited to busk in the parking lot.

The reason this year feels different, say the people at One Reel, is because it actually is. "Any given year is one person's best-ever year and another person's worst-ever year," says One Reel executive director Jon Stone. "Every year we are beat up and held up as champions at the same time, which is part of the fun." But he also says that things changed dramatically in the wake of the 2010 festival, which starred Bob Dylan, Mary J. Blige, Weezer, and Hole—and turned out to be a bust, forcing One Reel to lay off 8 of its 14 full-time, year-round festival employees. Soon after that, Teatro ZinZanni, which had started as a One Reel project, spun off and became its own entity. (...)

Why was 2010 such a crucible for the festival?

The first reason, Stone says, is that Bumbershoot found itself pouring "phenomenal resources" into headline acts. "That part is inversely proportional to the death of the record industry," he explains. "Artists used to make money on record sales and tour as a loss leader. Now artists make nothing on record sales... so fees for performances went up." In the early 2000s, he says, it cost $30,000 to put a main-stage name in Memorial Stadium and fill it up. But by the late 2000s, that number increased tenfold, costing One Reel $350,000 or more to do the same thing. "It was us not seeing the writing on the wall," he says.

The second reason was something more like hubris. In the 2000s, Stone says, Bumbershoot was getting national media attention and being compared to the big shots like Coachella and Bonnaroo. "We began to drink that Kool-Aid and thought, 'We've got to follow the leaders'" and book superstars. In retrospect, he says, that was "a huge mistake" for a few reasons. "What's been happening with the music industry in general—and festivals in particular—is a path towards unsustainability. They're not local, curated celebrations anymore. Global corporations run them." And when global corporations take over music festivals, he says, "innovation stops and the soulless and relentless milking of the consumer dollar starts."

by Brendan Kiley, The Stranger | Read more:
Image: Mark Kaufman

Tuesday, August 26, 2014

The One Who Knocks

[ed. I started watching Breaking Bad several weeks ago while house-sitting at my kids' place - six episodes the first night. It's that good. After re-starting my Netflix account just to continue the saga, I'm now up to episode 56. I wouldn't say Bryan Cranston is the second coming of Marlon Brando, but he does an admirable job as Walter White (meek chemistry teacher turned meth kingpin). And everyone else in the series is first rate, too. Spoiler alert: if you haven't seen the show, you might not want to read this review.]

For years, Cranston scrabbled after guest-star roles on crappy TV shows while making his living in commercials. He played a bland smoothie with bread-loaf hair who just happened to love Shield Deodorant Soap, Arrow Shirts, Coffee-mate, and Excedrin, a middle-of-the-bell-curve guy who, despite his initial skepticism, was really sold on the product: “Now you can relieve inflamed hemorrhoidal tissue with the oxygen action of Preparation H.” He says, “I had that everyman look—nonthreatening, non-distracting, no facial hair. I fit in.”

As he was often the last person cast on a show or film, his strategy was to play the opposite of what the ensemble already had. Drama is conflict, after all. When he auditioned for the father on “Malcolm in the Middle,” the Fox sitcom about a crew of unruly brothers, he knew that the boys’ mother was bombastic, fearless, and insightful, so he played the father as gentle, timid, and obtuse. “It was a genius way to make an underwritten part work,” Linwood Boomer, the show’s creator, says. “By the third episode, we realized we had to do a lot more writing for the guy.”

“Malcolm” aired from 2000 to 2006, and established Cranston as a television fixture, if not a star. Yet even after he landed the lead in “Breaking Bad,” in 2007, he framed his character, Walter White, as an opposite—in this case, the opposite of the man Walter would become. The show is about a fifty-year-old high-school chemistry teacher in Albuquerque who, after getting a diagnosis of terminal lung cancer, secretly works with a former student, the sweet yo-yo Jesse Pinkman (Aaron Paul), to make enough crystal meth to leave a nest egg for his family. Walt’s extremely pure product becomes wildly successful, but at great cost to everyone around him.

Vince Gilligan, the show’s creator and executive producer, had sold it to the AMC network as “a man who goes from Mr. Chips to Scarface,” and, in the pilot, Walt tells his students that chemistry is “the study of change.” But Cranston quietly shifted the arc from good-man-becomes-bad to invisible-man-becomes-vivid. In pre-production, Gilligan recalls, Cranston began to construct an ideal nebbish: “Bryan said, ‘I think I should have a mustache, and it should be light and thin and look like a dead caterpillar, and I should be pale, and a little doughier, a hundred and eighty-six pounds.’ ”

Cranston explains, “I wanted Walt to have the body type of my dad, who’s now eighty-nine, like Walt was a much older man. When I was studying my dad, taking on his posture and burdens—I didn’t tell him I was doing it—I noticed I was also taking on some of his characteristics, the ‘Aw, jeez,’ or an eye roll, or”—he gave a skeptical grimace—“when Jesse did something stupid.”

Gilligan, an amiable, fatalistic Virginian, says, “I had a very schematic understanding of Walt in the early going. I was thinking structurally: we’d have a good man beset from all sides by remorseless fate.” Not only does Walt have cancer, an empty savings account, and searing regrets about his career path but his son has cerebral palsy and his wife, Skyler, is unexpectedly pregnant. Gilligan gave a wry smile. “The truth is you have to be very schematic indeed to force someone into cooking crystal meth.”

Instead, Cranston played the role so that Walter’s lung-cancer diagnosis catalyzes a gaudy midlife crisis—so that a luna moth breaks from the drabbest of cocoons. Across the show’s five seasons, which depict a lively two years, Walt is increasingly inhabited by Heisenberg, his drug-dealing pseudonym and alter ego—a figure Cranston describes as “the emotionless, brave, risk-taking imaginary friend who looks out for Walt’s best interests.” Early in the first season, when Walt scurries out of his Pontiac Aztek to retrieve the drug dealer Krazy-8, who lies unconscious on a suburban corner in broad daylight, he’s terrified of being seen, and takes tiny nerdy steps, his shoulders twitching with self-consciousness. There is a touch of Hal, the father Cranston played on “Malcolm in the Middle,” about him still—he might almost waggle his hands in panic for comic effect. (The first season of the show was particularly funny, if darkly so, and Vince Gilligan asked his colleagues whether he should submit it to the Emmys as a drama or a comedy.)

After undergoing chemotherapy, Walt shaves his head and grows a Vandyke, alpha-male plumage that helps him play the bruiser. By the end of the second season, he rousts two would-be meth cooks from his territory with pure assurance: a wide stance, arms relaxed yet poised to strike. And when he reveals his hidden powers to his wife in the famous “I am the one who knocks!” speech, he levels his hand at her like a gun. “The more believable humanity of Walter White—the discovery that he’s not a good man but an everyman—is due to Bryan,” Gilligan said. “The writers realized, from his acting, that Walt isn’t cooking for his family; he’s cooking for himself.”

By the fifth season, having killed Krazy-8 and become responsible for at least a hundred and ninety-four other deaths, Walt has no anxiety left. His voice is low and commanding, his manner brash—he’s eager to be seen. He was cowed at first by his brother-in-law, Hank Schrader, a bluff D.E.A. agent who treats him with kindly contempt. But soon enough he’s snarling at Hank, “I’m done explaining myself,” and taunting him for misidentifying Heisenberg: “This genius of yours, maybe he’s still out there.” Then he eliminates his boss, a drug lord named Gus Fring (Giancarlo Esposito), by blowing his face off with a wheelchair bomb. As Walt takes on the role of the dominant dealer, Cranston has him unconsciously appropriate some of Esposito’s coiled stillness. “I wanted to plant a subliminal thing with the audience,” he says. “But it was Bryan who modelled Walt’s body language on Gus’s—Walt didn’t know what he was doing. All he knew is that he felt more confident with his shoulders back.”

In movies, unless you’re the star, you’re going to play an archetype. Studios, noticing the authority in Cranston’s persona, have often cast him as a colonel (“Saving Private Ryan,” “John Carter,” “Red Tails”). Ben Affleck, who hired him to be the C.I.A.’s version of a colonel in “Argo,” says, “Bryan is the boss you might actually like. He’s not a general and he’s not a sergeant—he’s a colonel.” Yet Cranston’s friend Jason Alexander, who starred as George Costanza on “Seinfeld,” says, “Bryan doesn’t play an idea particularly well, those military roles. That’s because his strongest card is complexity, where you can’t figure out what he represents until he gradually reveals himself.” A producer friend of Cranston’s observes that he doesn’t stand out in such films as “Total Recall,” where he chewed the scenery as a dictator, “because he wasn’t reined in. Actors want to act, but you need someone who will say, ‘Give me the take where he’s doing less.’ ”

A cable series, a format that showcases accretive subtlety, is where Cranston could truly shine. Luckily, cable’s golden age arrived just as he did. “Bryan had to grow into his weight as an actor,” John O’Hurley, a close friend of Cranston’s since the mid-eighties, when they were both married to the same woman on the soap opera “Loving,” says. “He became dangerous when he began letting his eyes go dead. It’s the sign of a man with nothing to lose.”

by Tad Friend, New Yorker |  Read more:
Image: Ian Wright

How Plagues Really Work

The latest epidemic to terrify the Western world is Ebola, a virus that has killed hundreds in Africa in 2014 alone. No wonder there was so much worry when two infected health care workers from the United States were transported home from Liberia for treatment – why bring this plague to the US, exposing the rest of the country as well? But the truth is that Ebola, murderous though it is, doesn’t have what it takes to produce a pandemic, a worldwide outbreak of infectious disease. It spreads only through intimate contact with infected body fluids; to avoid Ebola, just refrain from touching sweat, blood or the bodies of the sick or dead.

Yet no logic can quell our pandemic paranoia, which first infected the zeitgeist with the publication of Laurie Garrett’s The Coming Plague (1994) and Richard Preston’s Hot Zone (1995). These books suggested that human incursion into rainforests and jungles would stir deadly viruses in wait; perturb nature and she nails you in the end. By the late 1990s, we were deep into the biological weapons scare, pumping billions of dollars in worldwide government funding to fight evil, lab-made disease. As if this weren’t enough, the panic caused from 2004 to 2007 by reports of the H5N1 or bird flu virus etched the prospect of a cross-species Andromeda strain in the Western mind.

The fear seems confirmed by historical memory: after all, plagues have killed a lot of people, and deadly diseases litter history like black confetti. The Antonine Plague, attributed to measles or smallpox in the year 165 CE, killed the Roman Emperor Marcus Aurelius and millions of his subjects. The Justinian Plague, caused by the deadly bacterial pathogen Yersinia pestis, spread from North Africa across the Mediterranean Sea to Constantinople and other cities along the Mediterranean. By 542, infected rats and fleas had carried the infection as far north as Rennes in France and into the heart of Germany. Millions died.

Then there was the Black Death of 1348-50, also caused by Yersinia pestis, but this time spread by human fleas and from human lung to human lung, through the air. The plague spread along the Silk Road to what is now Afghanistan, India, Persia, Constantinople, and thence across the Mediterranean to Italy and the rest of Europe, killing tens of millions worldwide. Of all the past pandemics, the 1918 influenza (also known as the Spanish flu) is now considered the über-threat, the rod by which all other pandemics are measured. It killed 40 million people around the globe.

It was the great Australian virologist Frank Macfarlane Burnet who argued that the deadliest diseases were those newly introduced into the human species. It seemed to make sense: the parasite that kills its host is a dead parasite since, without the host, the germ has no way to survive and spread. According to this argument, new germs that erupt into our species will be potential triggers for pandemics, while germs that have a long history in a host species will have evolved to be relatively benign.

Many health experts take the notion further, contending that any coming plague will come from human intrusion into the natural world. One risk, they suggest, comes when hungry people in Africa and elsewhere forge deep into forests and jungles to hunt ‘bushmeat’ – rodents, rabbits, monkeys, apes – with exposure to dangerous pathogens the unhappy result. Those pathogens move silently among wild animals, but can also explode with terrifying ferocity among people when humans venture where they shouldn’t. According to the same line of thought, another proposed risk would result when birds spread a new pandemic strain to chickens in factory farms and, ultimately, to us.

But there’s something in these scenarios that’s not entirely logical. There is nothing new in the intimate contact between animals and people. Our hominid ancestors lived on wildlife before we ever evolved into Homo sapiens: that’s why anthropologists call them hunter-gatherers, a term that still applies to some modern peoples, including bushmeat hunters in West Africa. After domesticating animals, we lived close beside them, keeping cows, pigs and chickens in farmyards and even within households for thousands of years. Pandemics arise out of more than mere contact between human beings and animals: from an evolutionary point of view, there is a missing step between animal pathogen and human pandemic that’s been almost completely overlooked in these terrifying but entirely speculative ideas.

by Wendy Orent, Aeon |  Read more:
Image: Stefano Rellandini/Reuters

Monday, August 25, 2014


Photo: markk

Photo: markk

Mimicking Airlines, Hotels Get Fee-Happy

[ed. Companion piece to the post following this one. From cable charges, to airline fees, to road tolls, to credit/debit card penalties, to miscellaneous utility assessments and on and on and on... consumers are getting dinged like never before.] 

Forget bad weather, traffic jams and kids asking, "Are we there yet?" The real headache for many travelers is a quickly growing list of hotel surcharges, even for items they never use.

Guaranteeing two queen beds or one king bed will cost you, as will checking in early or checking out late. Don't need the in-room safe? You're likely still paying. And the overpriced can of soda may be the least of your issues with the hotel minibar.

Vacationers are finding it harder to anticipate the true cost of their stay, especially because many of these charges vary from hotel to hotel, even within the same chain.

Coming out of the recession, the travel industry grew fee-happy. Car rental companies charged extra for services such as electronic toll collection devices and navigation systems. And airlines gained notoriety for adding fees for checking luggage, picking seats in advance, skipping lines at security and boarding early. Hotel surcharges predate the recession, but recently properties have been catching up to the rest of the industry.

"The airlines have done a really nice job of making hotel fees and surcharges seem reasonable," says Bjorn Hanson, a professor at New York University's hospitality school.

This year, hotels will take in a record $2.25 billion in revenue from such add-ons, 6 percent more than in 2013 and nearly double that of a decade ago, according to a new study released Monday by Hanson. Nearly half of the increase can be attributed to new surcharges and hotels increasing the amounts of existing fees.

by Scott Mayerowitz, AP |  Read more:
Image: John Locher/AP

Did Congestion Charging Just Go Viral?

[ed. I'd never heard of congestion charging until today. Sounds like a pretty hard-sell.]

Congestion charging or pricing is the practice of setting up cordon tolls around the city on a large scale to charge entrants during peak hours. Ideally, this is done in an automatic fashion with cameras registering your license plate and directly billing you. This is different from low emissions zones, which are specific zones that limit the type of vehicles that can enter, and when.
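[ed. A rough sketch of the billing rule described above, purely for illustration — the peak windows and flat rate below are invented assumptions, not any real city's tariff.]

# Illustrative cordon-charging logic (assumed peak windows and flat rate).
from datetime import time

PEAK_WINDOWS = [(time(7, 0), time(9, 30)), (time(16, 0), time(19, 0))]  # assumption
FLAT_CHARGE = 3.50  # assumed flat charge per cordon crossing, local currency

def toll_due(entry_time):
    """Return the charge for a plate photographed crossing the cordon at entry_time."""
    for start, end in PEAK_WINDOWS:
        if start <= entry_time <= end:
            return FLAT_CHARGE
    return 0.0  # off-peak crossings are free

print(toll_due(time(8, 15)))  # peak-hour entry: 3.5
print(toll_due(time(13, 0)))  # midday entry: 0.0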

City-scale congestion charging is picking up steam as a policy tool to free cities from crippling traffic. Singapore led the way starting in 1975, and London, Milan, and Stockholm have since followed suit. In 2008, the former Mayor of New York City Michael Bloomberg led a valiant but eventually doomed effort to install congestion charging around Manhattan. However, despite New York's setback and otherwise sporadic progress, three news items make me wonder if congestion pricing is reaching a tipping point:

First, despite New York’s failed attempt, it looks as if a bottom-up plan could revive the city’s efforts. With crippling congestion and underfunded transit projects, New Yorkers are starting to rally to the cause. The key to success this time might be better consultation and more community engagement. So far so good.

Second, Stockholm’s at-first shaky congestion pricing plan is now considered an unobtrusive part of life. In fact, its popularity spurred Gothenburg to adopt it, and there are now proposals for all major Swedish cities to adopt the system [in Swedish].

Finally, we turn to the mother lode of traffic: China. Not only have Beijing and Shanghai studied the possibility of congestion charging for a while now, it appears that Beijing is going to institute it next year, using its many ring roads to its advantage.

by Tali Trigg, Scientific American | Read more:
Image: Stockholm Transport Styrelsen.

Sunday, August 24, 2014


Xiao Wen Ju fronts the Lane Crawford Spring/Summer 2014 Campaign
via:

Mutablend (on Flickr), No Communication No Love
via:

Every Insanely Mystifying Paradox in Physics: A Complete List


Today’s brain-melter: Every Insanely Mystifying Paradox in Physics. It’s all there, from the Greisen-Zatsepin-Kuzmin limit to quantum immortality to, of course, the tachyonic antitelephone.
A tachyonic antitelephone is a hypothetical device in theoretical physics that could be used to send signals into one’s own past. Albert Einstein in 1907 presented a thought experiment of how faster-than-light signals can lead to a paradox of causality, which was described by Einstein and Arnold Sommerfeld in 1910 as a means “to telegraph into the past”.
If you emerge with your brain intact, at the very least, you’ll have lost a couple of hours to the list.

by Cliff Pickover, Sprott Physics, Univ. of Wisconsin | Read more:
via: Kottke.org

Saturday, August 23, 2014

R.E.M.


[ed. Repost. This disappeared for a while but is now back up on YouTube. Great to see them so young and going for it.]

The Best Drones

Of the dozens of drones aimed at the aspiring aerial photographer/videographer, the $1,300 DJI Phantom 2 Vision+ is the one we recommend for most people, as it’s the only one that is easy to control while having great battery life and range, terrific safety features, and a smartphone app that lets you preview your on-drone camera for photography and piloting ease.

It was the obvious favorite going into this guide due to its numerous editorial accolades and positive user reviews, so I tried my best to find something better. But after over 25 hours of research, 10 hours of interviews with experts, and half a day of hands-on testing against its closest competition (on top of over 100 previous drone flights of my own), I had to agree with the crowd. Nothing comes close to the Phantom 2 Vision+. Its combination of ease-of-use and advanced features simply can’t be matched by anything currently available.

In addition to being easy to fly, the V+ comes equipped with a relatively high-quality camera that’s almost as good as the GoPro Hero 3+ (rare), a three-plane gimbal for image stabilization (rare), and a Wi-Fi extender that gives you the ability to see real-time stats and what you’re shooting from over 2,000 feet away on a smartphone you mount to your radio controller (also rare). It also has pre-programmed flight controls tailored to beginners and advanced pilots, a standout 2,000-foot range, a battery that lasts a stellar 25 minutes instead of the usual 12, the ability to fly autonomously (thanks to a recently announced Ground Station function), and the standard safety setting that prompts the drone to return to the launch pad if it loses connection with the radio transmitter.

In other words, it’s exactly what you’d want, expect, and need from a camera drone.

You can get most of these features on other drones if you have the technical know-how and are willing to figure things out, tools in hand, but unless you’re into tinkering for tinkering’s sake, it’s just not worth the time.

$1,300 sounds like a lot of money, but the Phantom 2 Vision+ (henceforth referred to as the “V+”) is a surprisingly great value if you run the numbers: in order to get similar capabilities from a cheaper drone using aftermarket parts, you’d have to spend over $1,500 and futz with the inside wiring of your drone. And you’d still wind up with lesser capabilities.

That said, $1,300 is a lot of money to spend on a thing that you could crash on its maiden voyage. If you’re unfamiliar with how to fly drones or just need to fine tune your skills (and who doesn’t), we highly suggest getting a cheapo trainer drone before putting your $1,300 investment aloft. For that, we recommend the highly-touted $90 Blade Nano QX. It’s essentially a palm-sized quadcopter without the camera and fancy features like GPS-assisted position hold.

If you’re already confused by the terms in this guide, we’ve got you covered with a glossary. We explain any technical terms we use, but other sites don’t; we definitely recommend keeping it handy if you’re planning on clicking through to our sources.

by Eric Hansen, Wirecutter |  Read more:
Images: DJI Phantom 2 Vision+ and Blade Nano QX

John Piper, Eye and Camera: Red, Blue and Yellow (1980)
via:

The Truth We Won’t Admit: Drinking Is Healthy

Bob Welch, former star Dodgers pitcher, died in June from a heart attack at age 57. In 1981, Welch published (with George Vecsey) Five O’Clock Comes Early: A Cy Young Award-Winner Recounts His Greatest Victory, in which he detailed how he became an alcoholic at age 16: “I would get a buzz on and I would stop being afraid of girls. I was shy, but with a couple of beers in me, it was all right.”

In his early 20s, he recognized his “disease” and quit drinking. But I wonder if, like most 20-something problem drinkers (as shown by all epidemiological research), he would otherwise have outgrown his excessive drinking and drunk moderately?

If he had, he might still be alive. At least, that’s what the odds say.

Had Welch smoked, his obituaries would have mentioned it by way of explaining how a world-class athlete might have died prematurely of heart disease. But no one would dare suggest that quitting drinking might be responsible for his heart attack.

In fact, the evidence that abstinence from alcohol is a cause of heart disease and early death is irrefutable—yet this is almost unmentionable in the United States. Even as health bodies like the CDC and Dietary Guidelines for Americans (prepared by Health and Human Services) now recognize the decisive benefits from moderate drinking, each such announcement is met by an onslaught of opposition and criticism, and is always at risk of being reversed.

Noting that even drinking at non-pathological levels above recommended moderate limits gives you a better chance of a longer life than abstaining draws louder protests still. Yet that’s exactly what the evidence tells us. (...)

Given the multitude of studies of the effects of alcohol on mortality (since heart disease is the leading killer of men and women, drinking reduces overall mortality significantly), meta-analyses combining the results of the best-designed such studies can be generated. In 2006, the Archives of Internal Medicine, an American Medical Association journal, published an analysis based on 34 well-designed prospective studies—that is, research which follows subjects for years, even decades. This meta-analysis, incorporating a million subjects, found that “1 to 2 drinks per day for women and 2 to 4 drinks per day for men are inversely associated with total mortality.”

So the more you drink—up to two drinks a day for women, and four for men—the less likely you are to die. You may have heard that before, and you may have heard it doubted. But the consensus of the science is overwhelming: It is true.

Although I dispute many of the caveats offered against the life-saving benefits of alcohol, I will endorse two. First, these outcome data do not apply to women with the “breast-cancer gene” mutations (BRCA 1 or 2) or a first-degree (mother, sister) relation who has had breast cancer, for whom alcohol consumption is far riskier. Second, drinking 10 drinks Friday and Saturday nights does not convey the benefits of two or three drinks daily, even though your weekly totals would be the same: Frequent, heavy binge drinking is unhealthy. But then you knew that already, didn’t you? If you don’t distinguish binge drinking from daily moderate drinking, that would be due to Americans’ addiction-phobia, which causes them to interpret any daily drinking as addictive.

The global summary of alcohol’s benefits raises a key question: How much do you have to drink regularly before you become as likely to die as an abstainer? We’ll see below.

by Stanton Peele, Pacific Standard |  Read more:
Image: Ben Hussman/Flickr

William Kendall, Wipe Out (1998)
via:

How Surf Mania Was Invented

John Severson's path towards becoming surfing’s first editor-in-chief began with a stroke of good fortune. Upon being drafted into the US Army in 1956, Severson was told that he would be serving out his active duty in Germany. However, after another draftee failed his Morse code exam, Severson’s presence was required elsewhere and he received new orders: “You’re going to Hawaii.”

To his surprise, Severson’s fellow troops were not charmed by the thought of spending two years in the middle of the Pacific. Almost fifty years later, he still recalls their complaints, “We’re going to Hawaii? There’s nothing to do there.” But, for Severson, things could not have been farther from the case. Like every other California surfer from the 50s, he had grown up riding a redwood board and dreaming of Hawaii’s gargantuan waves.

While working for the Army as an illustrator, Severson was encouraged to surf daily as a member of the US Army Surf Team and fell in with a generation of surfers who pioneered new techniques in big wave riding. After choosing to stay in Hawaii, he began selling his drawings and paintings on the beach, eventually being able to acquire the 16mm camera he used to make his notorious surf films.

Filled with DIY exuberance, Severson’s films of the early 60s were created as a way of celebrating the energy of surfing. Citing Leni Riefenstahl’s Olympia as an influence, Severson oriented his surf footage around the formal splendor of the body in motion. Captured in high contrast black-and-white, the remaining stills of these films depict bodies contorted and flexed against the enormous force of an ink-black ocean. Composed like scenes from another dimension, Severson’s films communicated the ineluctable verve of what was then a niche pastime, sending an invitation to those who had never surfed.

After witnessing the riot-like environment of excitement at his screenings, Severson decided to produce a booklet called The Surfer that he sold during the premieres of his 1960 film Surf Fever. It featured black-and-white photos, writing, and cartoons, as well as surf maps and instructional articles for new surfers. After the booklet sold out five thousand copies, Severson decided to dedicate his attention to creating the magazine now known as SURFER.

As its title suggested, SURFER was a publication that aimed towards expressing the culture of the person who surfed, rather than the sport itself. With hordes of newcomers being brought to surfing by the Beach Boys and Hollywood films such as Gidget, SURFER had a mission to set the record straight. Its editorial program defined surfing as a way of looking out onto the world, an all-encompassing lifestyle that had its own social responsibilities.

Since it was the first magazine of its kind, SURFER gave Severson the freedom to fully craft what has become a massive genre. As surf writer Sam George would later say, “Before John Severson, there was no ‘surf media,’ no ‘surf industry,’ no ‘surf culture’ – not at least in the way we understand it today.”

by 032c |  Read more:
Image: Greg Noll at Pipeline, 1964. John Severson

What is the Great American Novel?

In Tracy Letts’s play Superior Donuts (2010), Arthur, a bakery owner, is presented with a bundle of notebooks by a new employee, Franco, who explains that they contain “the Great American Novel, my man. Authored by yours truly”. Franco attributes Arthur’s scepticism about this claim to racism: “You think I can’t write the Great American Novel ’cause I’m a black man”. Lawrence Buell’s study of the concept of the “Great American Novel” (or “GAN” as Buell, using Henry James’s acronym, calls it) explains Arthur’s reaction. Before the mid-twentieth century, only one critic believed that a GAN could be written by someone who was not white, and Buell’s survey of literature from Washington Irving to Jonathan Franzen suggests there has been little change since in perceptions of literary greatness.

One of the central claims of The Dream of the Great American Novel is that novels are uniquely well suited to the task of representing what is quintessentially American because they are “carriers and definers of evolving ‘national imaginaries’”. This long and detailed study considers the works of fiction that have, at various times, been deemed contenders for the crown, and attempts to explain why. Buell is also interested in those books that were lauded in their own moment but went out of fashion, and others whose merits are more evident with the benefit of hindsight. Along with exploring the nation’s most distinguished fiction, Buell considers why America might “dream” of locating the single novel that best expresses Americanness. He acknowledges the paradox that, although the concept of the Great American Novel seems to articulate “national swagger”, the most praised GANs are “anything but patriotic”. Rather, they convey “national self-criticism”, typically on the grounds of social inequality.

Introduced in print by John W. De Forest in January 1868, the phrase “Great American Novel” had already been used by P. T. Barnum to mock publishers for puffing their latest books, Buell writes, confirming that the GAN is at least as much a marketing device as a reliable measure of literary merit. The first novel to be named “the Greatest Book of the Age” was Uncle Tom’s Cabin (1852) by Harriet Beecher Stowe, which supposedly provoked the “great war” that ended slavery in the Southern states. For this reason, Buell deems it the preeminent American example of activist art: it “changed the world” and so its status endures despite criticism of its depiction of black people. It also shows that it is possible for a GAN to be written by a woman, although critical consensus suggests that hardly any have been. The heyday of serious debate over the Great American Novel ran from the 1860s to the 1920s, when the promise of the American Dream was equally prominent. After The Great Gatsby (1925) killed the Dream, along with its hero, interest in pinpointing GANs waxed and waned in popularity, perhaps because an increasingly heterogeneous nation found it hard to believe that a single novel – even a very long one – could represent America in all its variety.

Since the function of the GAN is to represent Americanness, Buell proposes that its aims are best fulfilled by a body of work rather than a single novel. The key works he identifies are famous and familiar. Rather than discussing these books in order of publication, Buell arranges them according to four themes, or “scripts”. This decision refreshes critical debate by showing how novels are “in conversation” with each other, and leads to some intriguing comparisons, such as William Faulkner’s Absalom, Absalom! with Margaret Mitchell’s Gone with the Wind, both published in 1936.

by Sarah Graham, TLS |  Read more:
Image: Moby Dick