Tuesday, January 28, 2014

Seven Questions for Bob Dylan

Bob Dylan is either the most public private man in the world or the most private public one. He has a reputation for being silent and reclusive; he is neither. He has been giving interviews—albeit contentious ones—for as long as he's been making music, and he's been making music for more than fifty years. He's seventy-two years old. He's written one volume of an autobiography and is under contract to write two more. He's hosted his own radio show. He exhibits his paintings and his sculpture in galleries and museums around the world. Ten years ago, he cowrote and starred in a movie, Masked and Anonymous, that was about his own masked anonymity. He is reportedly working on another studio recording, his thirty-sixth, and year after year and night after night he still gets on stage to sing songs unequaled in both their candor and circumspection. Though famous as a man who won't talk, Dylan is and always has been a man who won't shut up.

And yet he has not given in; he has preserved his mystery as assiduously as he has curated his myth, and even after a lifetime of compulsive disclosure he stands apart not just from his audience but also from those who know and love him. He is his own inner circle, a spotlit Salinger who has remained singular and inviolate while at the same time remaining in plain sight.

It's quite a trick. Dylan's public career began at the dawn of the age of total disclosure and has continued into the dawn of the age of total surveillance; he has ended up protecting his privacy at a time when privacy itself is up for grabs. But his claim to privacy is compelling precisely because it's no less enigmatic and paradoxical than any other claim he's made over the years. Yes, it's important to him—"of the utmost importance, of paramount importance," says his friend Ronee Blakley, the Nashville star who sang with Dylan on his Rolling Thunder tour. And yes, the importance of his privacy is the one lesson he has deigned to teach, to the extent that his friends Robbie Robertson and T Bone Burnett have absorbed it into their own lives. "They both have learned from him," says Jonathan Taplin, who was the Band's road manager and is now a professor at the University of Southern California. "They've learned how to keep private, and they lead very private lives. That's the school of Bob Dylan—the smart guys who work with him learn from him. Robbie's very private. And T Bone is so private, he changes his e-mail address every three or four weeks."

How does Dylan do it? How does he impress upon those around him the need to protect his privacy? He doesn't. They just do. That's what makes his privacy Dylanesque. It's not simply a matter of Dylan being private; it's a matter of Dylan's privacy being private—of his manager saying, when you call, "Oh, you're the guy writing about Bob Dylan's privacy. How can I not help you?" (...)

"I've always been appalled by people who come up to celebrities while they're eating," says Lynn Goldsmith, a photographer who has taken pictures of Dylan, Springsteen, and just about every other god of the rock era. "But with Dylan, it's at an entirely different level. With everybody else, it's 'We love you, we love your work.' With Dylan, it's 'How does it feel to be God?' It's 'I named my firstborn after you.' In some ways, the life he lives is not the life he's chosen. In some ways, the life he leads has been forced upon him because of the way the public looks upon him to be."

That's the narrative, anyway—Dylan as eternal victim, Dylan as the measure of our sins. There is another narrative, however, and it's that Dylan is not just the first and greatest intentional rock 'n' roll poet. He's also the first great rock 'n' roll asshole. The poet expanded the notion of what it was possible for a song to express; the asshole shrunk the notion of what it was possible for the audience to express in response to a song. The poet expanded what it meant to be human; the asshole noted every human failing, keeping a ledger of debts never to be forgotten or forgiven. As surely as he rewrote the songbook, Dylan rewrote the relationship between performer and audience; his signature is what separates him from all his presumed peers in the rock business and all those who have followed his example. "I never was a performer who wanted to be one of them, part of the crowd," he said, and in that sentence surely lies one of his most enduring achievements: the transformation of the crowd into an all-consuming but utterly unknowing them.

"We played with McCartney at Bonnaroo, and the thing about McCartney is that he wants to be loved so much," Jeff Tweedy says. "He has so much energy, he gives and gives and gives, he plays three hours, and he plays every song you want to hear. Dylan has zero fucks to give about that. And it's truly inspiring.

by Tom Junod, Esquire |  Read more:
Image: Fame Pictures; historic Dylan photos: AP

Monday, January 27, 2014


Brenda Cablayan, Houses on the Hill
via:

Obliquity

If you want to go in one direction, the best route may involve going in the other. Paradoxical as it sounds, goals are more likely to be achieved when pursued indirectly. So the most profitable companies are not the most profit-oriented, and the happiest people are not those who make happiness their main aim. The name of this idea? Obliquity.

The American continent separates the Atlantic Ocean in the east from the Pacific Ocean in the west. But the shortest crossing of America follows the route of the Panama Canal, and you arrive at Balboa Port on the Pacific Coast some 30 miles to the east of the Atlantic entrance at Colon.

A map of the isthmus shows how the best route west follows a south-easterly direction. The builders of the Panama Canal had comprehensive maps, and understood the paradoxical character of the best route. But only rarely in life do we have such detailed knowledge. We are lucky even to have a rough outline of the terrain.

Before the canal, anyone looking for the shortest traverse from the Atlantic to the Pacific would naturally have gazed westward. The south-east route was found by Vasco Nunez de Balboa, a Spanish conquistador who was looking for gold, not oceans.

George W. Bush speaks mangled English rather than mangled French because James Wolfe captured Quebec in 1759 and made the British crown the dominant influence in North America. Eschewing obvious lines of attack, Wolfe’s men scaled the precipitous Heights of Abraham and took the city from the unprepared defenders. There are many such episodes in military history. The Germans defeated the Maginot Line by going round it, while Japanese invaders bicycled through the Malayan jungle to capture Singapore, whose guns faced out to sea. Oblique approaches are most effective in difficult terrain, or where outcomes depend on interactions with other people. Obliquity is the idea that goals are often best achieved when pursued indirectly.

Obliquity is characteristic of systems that are complex, imperfectly understood, and change their nature as we engage with them. (...)

The distinction between intent and outcome is central to obliquity. Wealth, family relationships, and employment all contribute to happiness, but these activities are not best conducted with happiness as their goal. The pursuit of happiness is a strange phrase in the US Declaration of Independence because happiness is not best achieved when pursued. A satisfying life depends above all on building good personal relationships with other people – but we entirely miss the point if we seek to develop these relationships with our personal happiness as a primary goal.

Humans have well-developed capacities to detect purely instrumental behaviour. The actions of the man who buys us a drink in the hope that we will buy his mutual funds are formally the same as those of the friend who buys us a drink because he likes our company, but it is usually not too difficult to spot the difference. And the difference matters to us. “Honesty is the best policy, but he who is governed by that maxim is not an honest man,” wrote Archbishop Whately two centuries ago. If we deal with someone for whom honesty is the best policy, we can never be sure that this is not the occasion on which he will conclude that honesty is no longer the best policy. Such experiences have been frequent in financial markets in the last decade. We do better to rely on people who are honest by character rather than honest by choice.

by John Kay, Financial Times via Naked Capitalism |  Read more:
Image: via:

Happiness and Its Discontents

A quick survey of our culture—particularly our self-help culture—confirms Freud's observation. One could even say that, in our era, the idea that we should lead happy, balanced lives carries the force of an obligation: We are supposed to push aside our anxieties in order to enjoy our lives, attain peace of mind, and maximize our productivity. The cult of "positive thinking" even assures us that we can bring good things into our lives just by thinking about them. (...)

Needless to say, our fixation on the ideal of happiness diverts our attention from collective social ills, such as socioeconomic disparities. As Barbara Ehrenreich has shown, when we believe that our happiness is a matter of thinking the right kinds of (positive) thoughts, we become blind to the ways in which some of our unhappiness might be generated by collective forces, such as racism or sexism. Worst of all, we become callous to the lot of others, assuming that if they aren't doing well, if they aren't perfectly happy, it's not because they're poor, oppressed, or unemployed but because they're not trying hard enough.

If all of that isn't enough to make you suspicious of the cultural injunction to be happy, consider this basic psychoanalytic insight: Human beings may not be designed for happy, balanced lives. The irony of happiness is that it's precisely when we manage to feel happy that we are also most keenly aware that the feeling might not last. Insofar as each passing moment of happiness brings us closer to its imminent collapse, happiness is merely a way of anticipating unhappiness; it's a deviously roundabout means of producing anxiety.

Take the notion that happiness entails a healthy lifestyle. Our society is hugely enthusiastic about the idea that we can keep illness at bay through a meticulous management of our bodies. The avoidance of risk factors such as smoking, drinking, and sexual promiscuity, along with a balanced diet and regular exercise, is supposed to guarantee our longevity. To a degree, that is obviously true. But the insistence on healthy habits is also a way to moralize illness, to cast judgment on those who fail to adhere to the right regimen. Ultimately, as the queer theorist Tim Dean has illustrated, we are dealing with a regulation of pleasure—a process of medicalization that tells us which kinds of pleasures are acceptable and which are not.

I suspect that beneath our society's desperate attempts to minimize risk, and to prescribe happiness as an all-purpose antidote to our woes, there resides a wretched impotence in the face of the intrinsically insecure nature of human existence. As a society, we have arguably lost the capacity to cope with this insecurity; we don't know how to welcome it into the current of our lives. We keep trying to brush it under the rug because we have lost track of the various ways in which our lives are not meant to be completely healthy and well adjusted.

Why, exactly, is a healthy and well-adjusted life superior to one that is filled with ardor and personal vision but that is also, at times, a little unhealthy and maladjusted? Might some of us not prefer lives that are heaving with an intensity of feeling and action but that do not last quite as long as lives that are organized more sensibly? Why should the good life equal a harmonious life? Might not the good life be one that includes just the right amount of anxiety? Indeed, isn't a degree of tension a precondition of our ability to recognize tranquillity when we are lucky enough to encounter it? And why should our lives be cautious rather than a little dangerous? Might not the best lives be ones in which we sometimes allow ourselves to become a little imprudent or even a tad unhinged?

by Mari Ruti, Chronicle of Higher Education |  Read more:
Image: Geoffrey Moss

Sunday, January 26, 2014

Game Change


By the time the contenders for Super Bowl XLVIII were set, two weekends ago, a hero and a villain had been chosen, too. The Denver Broncos’ quarterback, the aging, lovable Peyton Manning, had outplayed the Patriots to win the A.F.C. title. Meanwhile, in the N.F.C. championship game, Richard Sherman, a cornerback for the Seattle Seahawks, became the designated bad guy. With thirty-one seconds left to play, Colin Kaepernick, the San Francisco 49ers quarterback, had the ball on Seattle’s eighteen-yard line—the 49ers were losing by six points and needed a touchdown. He spotted Michael Crabtree, a wide receiver, and sent him the pass. Sherman twisted up in the air until he seemed almost in synch with the ball’s spiralling, then tipped the ball into the hands of another defender for an interception, and won the game.

Sherman was swarmed by his teammates but broke away to chase after Crabtree. He stretched out a hand and said, “Hell of a game, hell of a game,” to which Crabtree responded by shoving him in the face mask. Moments later, Sherman was surrounded by reporters and cameramen; by then, he had acquired an N.F.C. champions’ cap, which left his eyes in shadow, and his long dreadlocks hung loose. When Erin Andrews, of Fox Sports, asked him about the final play, he more or less exploded. “I’m the best corner in the game!” he proclaimed. “When you try me with a sorry receiver like Crabtree, that’s the result you gonna get! Don’t you ever talk about me!”

“Who was talking about you?” Andrews asked.

“Crabtree! Don’t you open your mouth about the best, or I’m gonna shut it for you real quick! L.O.B.!”

L.O.B.: that’s Legion of Boom, the nickname of the Seattle defense. The video of the “epic rant,” as it was called, went viral. Andrews told GQ that the response was so overwhelming that her Twitter account froze. She added, “Then we saw it was taking on a racial turn.” Some people expressed alarm that an angry black man was shouting at a blond-haired woman. (Andrews immediately shut down that line of complaint.) Many people expressed a hope that Manning would put Sherman in his place. The names that he was called were numerous, offensive, and explicitly racial, but one that stood out—it was used more than six hundred times on television, according to Deadspin—was “thug.”

by Amy Davidson, New Yorker |  Read more:
Image: via:

Almost Everything in "Dr. Strangelove" Was True

[ed. One of my all-time, top-ten movie favorites. Maybe top-five.]

This month marks the fiftieth anniversary of Stanley Kubrick’s black comedy about nuclear weapons, “Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb.” Released on January 29, 1964, the film caused a good deal of controversy. Its plot suggested that a mentally deranged American general could order a nuclear attack on the Soviet Union, without consulting the President. One reviewer described the film as “dangerous … an evil thing about an evil thing.” Another compared it to Soviet propaganda. Although “Strangelove” was clearly a farce, with the comedian Peter Sellers playing three roles, it was criticized for being implausible. An expert at the Institute for Strategic Studies called the events in the film “impossible on a dozen counts.” A former Deputy Secretary of Defense dismissed the idea that someone could authorize the use of a nuclear weapon without the President’s approval: “Nothing, in fact, could be further from the truth.” (See a compendium of clips from the film.) When “Fail-Safe”—a Hollywood thriller with a similar plot, directed by Sidney Lumet—opened, later that year, it was criticized in much the same way. “The incidents in ‘Fail-Safe’ are deliberate lies!” General Curtis LeMay, the Air Force chief of staff, said. “Nothing like that could happen.”

The first casualty of every war is the truth—and the Cold War was no exception to that dictum. Half a century after Kubrick’s mad general, Jack D. Ripper, launched a nuclear strike on the Soviets to defend the purity of “our precious bodily fluids” from Communist subversion, we now know that American officers did indeed have the ability to start a Third World War on their own. And despite the introduction of rigorous safeguards in the years since then, the risk of an accidental or unauthorized nuclear detonation hasn’t been completely eliminated.

The command and control of nuclear weapons has long been plagued by an “always/never” dilemma. The administrative and technological systems that are necessary to insure that nuclear weapons are always available for use in wartime may be quite different from those necessary to guarantee that such weapons can never be used, without proper authorization, in peacetime. During the nineteen-fifties and sixties, the “always” in American war planning was given far greater precedence than the “never.” Through two terms in office, beginning in 1953, President Dwight D. Eisenhower struggled with this dilemma. He wanted to retain Presidential control of nuclear weapons while defending America and its allies from attack. But, in a crisis, those two goals might prove contradictory, raising all sorts of difficult questions. What if Soviet bombers were en route to the United States but the President somehow couldn’t be reached? What if Soviet tanks were rolling into West Germany but a communications breakdown prevented NATO officers from contacting the White House? What if the President were killed during a surprise attack on Washington, D.C., along with the rest of the nation’s civilian leadership? Who would order a nuclear retaliation then?

With great reluctance, Eisenhower agreed to let American officers use their nuclear weapons, in an emergency, if there were no time or no means to contact the President. Air Force pilots were allowed to fire their nuclear anti-aircraft rockets to shoot down Soviet bombers heading toward the United States. And about half a dozen high-level American commanders were allowed to use far more powerful nuclear weapons, without contacting the White House first, when their forces were under attack and “the urgency of time and circumstances clearly does not permit a specific decision by the President, or other person empowered to act in his stead.” Eisenhower worried that providing that sort of authorization in advance could make it possible for someone to do “something foolish down the chain of command” and start an all-out nuclear war. But the alternative—allowing an attack on the United States to go unanswered or NATO forces to be overrun—seemed a lot worse. Aware that his decision might create public unease about who really controlled America’s nuclear arsenal, Eisenhower insisted that his delegation of Presidential authority be kept secret. At a meeting with the Joint Chiefs of Staff, he confessed to being “very fearful of having written papers on this matter.”

President John F. Kennedy was surprised to learn, just a few weeks after taking office, about this secret delegation of power. “A subordinate commander faced with a substantial military action,” Kennedy was told in a top-secret memo, “could start the thermonuclear holocaust on his own initiative if he could not reach you.” Kennedy and his national-security advisers were shocked not only by the wide latitude given to American officers but also by the loose custody of the roughly three thousand American nuclear weapons stored in Europe. Few of the weapons had locks on them. Anyone who got hold of them could detonate them. And there was little to prevent NATO officers from Turkey, Holland, Italy, Great Britain, and Germany from using them without the approval of the United States.

by Eric Schlosser, New Yorker |  Read more:
Image: Columbia Pictures

Where'd All The Cocaine Go?

Toward the end of last year, the DEA published its 2013 National Drug Threat Assessment Summary, a 28-page report chronicling drug consumption trends across the United States. These include the continued rise in abuse of prescription drugs (second only to marijuana in popularity), the increase in the production of heroin in Mexico and its availability in the U.S., and the emergence of synthetic designer drugs.

Much of the report is unremarkable—until you arrive at the section on cocaine. “According to [National Seizure System] data,” it reads, “approximately 16,908 kilograms of cocaine were seized at the southwest Border in 2011. During 2012, only 7,143 kilograms of cocaine were seized, a decrease of 58 percent.”

That sharp decline echoes an ongoing trend: 40 percent fewer people in the United States used cocaine in 2012 than they did in 2006; only 19 percent of Chicago arrestees had cocaine in their system two years ago compared to 50 percent in 2000; and fewer high school seniors say they’ve used cocaine in the last 12 months than at any time since the mid-70s. In fact, the report indicates cocaine was sporadically unavailable in Chicago, Houston, Baltimore, and St. Louis in the spring of 2012. So where’d the blow go? (...)

To speak at greater length on the subject, I reached out to Mark Kleiman, UCLA professor of public policy and the nation’s leading authority on drug policy. Earlier this year he gained chronic celebrity status when Washington tapped him to be the state’s “pot czar.” On a recent Sunday morning, Professor Kleiman and I discussed the disappearance of cocaine and whether it would ever come back.

Why would a drug like cocaine ever disappear?
Drug use tends to follow epidemic cycles. When a drug appears, or reappears, it tends to do so at the top of the social spectrum—it’s associated with glamour. People are using it for the first time, and they’re having a good time. Very few people are in trouble with it because use is relatively new and it has terrific word of mouth. People say, “Oh my God, this is wonderful, you have to try this!” So you literally get an exponential growth in use. Every new user is a potential source of additional new users. You get a very rapid rise in the number of users.

As David Musto pointed out in The American Disease, over time, two things happen. Once everybody susceptible to the suggestion of “Let’s try this” has tried it, there’s a declining pool of new users. And, some of the original users have been at it long enough to develop a bad habit. So now there are fewer users to tell all their friends that "this is wonderful," and more problem users who either tell their friends, or demonstrate by their behavior, that this is not wonderful.

How did this cycle play out with cocaine?
In the case of cocaine, there was a rapid price decrease as drug dealers crowded into the market to take advantage of the bonanza. The price of cocaine dropped by 80 percent, which brought a new user group into the population. The development of the crack market made crack available to anyone with $5 for a rock; in the powder cocaine market, the price of admission was $100 for a gram. So, the social status of the drug fell along with the user group. Now, using cocaine puts you in a class not with hedge-fund managers, but with $5 crack whores. Surprisingly, people prefer to be in the “hedge-fund-manager” category, which doesn’t necessarily reflect sound moral judgment but is a social fact.

All of those things created a peak in use. The number of people starting cocaine use peaked in about 1985, which was before the Len Bias affair. Then we got a set of panic-driven policies aimed at suppressing the crack epidemic. We got mandatory sentencing, aggressive law enforcement, and a ramping-up of the War on Drugs. (...)

Is that lifecycle built into every drug?
Yes, the question is when a drug moves from being purely epidemic to being endemic. And that’s happened with cocaine. Remember the first cocaine epidemic—the long, slow one that starts with Sigmund Freud. That one played itself out by the late 1920s. After that, cocaine use went close to zero. That didn’t happen this time—there is still cocaine initiation going on. I do not see any time soon when cocaine is not part of the American scene. It looks to me as if cocaine, the opiates, and cannabis, like alcohol, now have a steady user base—not just an occasional flare-up. But a drug that’s as destructive as cocaine is when heavily used—especially as crack—isn’t going to become endemic at a high level. The drug we have that’s endemic at a high level is alcohol, and unfortunately that’s not going away. And it looks to me like cannabis is going to join alcohol.

by Michael Zelenko, Vice | Read more:
Image: US Coast Guard

Saturday, January 25, 2014


Photo: markk, Rabbit Island

Twitter's Achilles' Heel

At some point, Twitter and the rest of social media became less about wanting to share the news and more about wanting to be the news.

Take Justin Bieber, for example.

As reports of the once-angelic and deeply troubled Canadian pop star’s arrest began to make their way around the web, reactions streamed onto Twitter, ranging from jokes to tongue clucks.

But by far, the most common refrain was something like this: “Why is this news??”

The simplest answer is that it wasn’t — at least not the most important news happening on that particular day. But Twitter isn’t really about the most important thing anymore — it stopped being about relevancy a long time ago. Twitter seems to have reached a turning point, a phase in which its contributors have stopped trying to make the service as useful as possible for the crowd, and are instead trying to distinguish themselves from one another. It’s less about drifting down the stream, absorbing what you can while you float, and more about trying to make the flashiest raft to float on, gathering fans and accolades as you go.

How did this happen?

A theory: The psychology of crowd dynamics may work differently on Twitter than it does on other social networks and systems. As a longtime user of the service with a sizable audience, I think the number of followers you have is often irrelevant. What does matter, however, is how many people notice you, either through retweets, favorites or the holy grail, a retweet by someone extremely well known, like a celebrity. That validation that your contribution is important, interesting or worthy is enough social proof to encourage repetition. Many times, that results in one-upmanship, straining to be the loudest or the most retweeted and referred to as the person who captured the splashiest event of the day in the pithiest way. (...)

It feels as if we’re all trying to be a cheeky guest on a late-night show, a reality show contestant or a toddler with a tiara on Twitter — delivering the performance of a lifetime, via a hot, rapid-fire string of commentary, GIFs or responses that help us stand out from the crowd. We’re sold on the idea that if we’re good enough, it could be our ticket to success, landing us a fleeting spot in a round-up on BuzzFeed or The Huffington Post, or at best, a writing gig. But more often than not, it translates to standing on a collective soapbox, elbowing each other for room, in the hopes of being credited with delivering the cleverest one-liner or reaction. Much of that ends in hilarity. Perhaps an equal amount ends in exhaustion.

by Jenna Wortham, NY Times | Read more:
Image: Joe Raedle/Getty Images

Out in the Great Alone


[ed. Missed this when it first came out, but with the Iditarod less than a couple months away it's a terrific read.]

“You’re not a pilot in Alaska,” Jay said, fixing me with a blue-eyed and somehow vaguely piratical stare, “until you’ve crashed an airplane. You go up in one of these stinkin’ tin cans in the Arctic? Sooner or later you’re gonna lose a motor, meet the wrong gust of wind, you name it. And OH BY THE WAY” (leaning in closer, stare magnifying in significance) “that doesn’t have to be the last word.” (...)

The plan was for me to spend a few nights in the apartment connected to the hangar — live with the planes, get the feel of them. I’d read that some Iditarod mushers slept with their dogs, to make themselves one with the pack. I needed flying lessons because the little Piper Super Cubs that would carry us to Nome were two-seaters, one in front, one behind. Jay wanted me prepared in case he had a fatal brain aneurysm (his words), or a heart attack (his words 10 seconds later), or keeled over of massive unspecified organ failure (“Hey, I’m gettin’ up there — but don’t worry!”) at 2,200 feet.

Choosing an airplane — that was the first step. Jay had four, and as the first ACTS client to arrive, I got first pick.

They were so small. Airplanes aren’t supposed to be so small. How can I tell you what it was like, standing there under the trillion-mile blue of the Alaska sky, ringed in by white mountains, resolving to take to the air in one of these winged lozenges? Each cockpit was exactly the size of a coffin. A desk fan could have blown the things off course. A desk fan on medium. Possibly without being plugged in.

“God love ’em,” Jay said. “Cubs are slower’n heck, they’ll get beat all to hell by the wind, and there’s not much under the hood. But bush pilots adore ’em, because you can mod ’em to death. And OH BY THE WAY … put ’em on skis and come winter, the suckers’ll land you anywhere.”

Two of the Cubs were painted bright yellow. I took an immediate liking to the one with longer windows in the back. Better visibility, I told myself, nodding. Jay said it had the smallest engine of any of the Cubs in our squadron. Less momentum when I go shearing into the treeline, I told myself, nodding.

The name painted in black on her yellow door read: NUGGET. She had a single propeller, which sat inquisitively on the end of her nose, like whiskers. Jay told me — I heard him as if from a great distance — that she’d had to be rebuilt not long ago, after being destroyed on a previous trip north. Was I hearing things, or did he say destroyed by polar bears?

I patted Nugget’s side. Her fuselage was made of stretched fabric. It flexed like a beach ball, disconcertingly.

Into the cockpit. Flight helmet strapped, restraints active. Mic check. Then Jay’s voice in my headset: “Are you ready!” It wasn’t exactly a question. (...)

We’d done some practice turns and picked out a lake; now all I had to do was get the plane on it. Jay explained to me about landing on snow, how the scatter of light tends to mask the true height of the ground. You can go kamikaze into the ice, thinking the earth is still 30 feet below you. To gauge your real altitude re: the whiteout, you have to use “references” — sticks poking through the snow, a line of trees on the bank. These supply you with vital cues, like “might want to ease down a touch” or “gracious, I’m about to fireball.”

I won’t bore you with the details of how to steer a Super Cub — where the stick was (imagine the porniest position possible; now go 6 inches pornier than that), how to bank, what the rudder pedals felt like. Suffice it to say that in theory, it was ridiculously simple. In practice …

“You have the aircraft.” Jay’s voice in my ear. “Just bring us down in a nice straight line.”

I felt the weight in my right hand as Jay released the stick. The lake was straight ahead, maybe three miles off, a white thumbnail in an evergreen-spammed distance. The plane was under my control.

Nugget — I’m not sure how to put this — began to sashay.

“Just a niiice straight line,” Jay reminded me. “And OH BY THE WAY … your pilot’s dead.” He slumped over in his seat.

Little lesson I picked up someplace: Once your pilot gives up the ghost, it is not so easy to see where you are headed from the backseat of a Super Cub. I mean at the “what direction is the plane even pointing right now” level. You will find that your deceased pilot, looming up against the windshield, blocks almost your entire forward view. To mitigate this, the savvy backseater will bank the wings one way while stepping on the opposite rudder pedal, causing the plane to twist 30 degrees or so to one side while continuing to travel in a straight line, like a runner sliding into base. That way, said enterprising backseater can see forward through the plane’s presumably non-corpse-occluded side window.

Yeah. Well. A thing about me as a pilot is that I do not, ever, want to see forward out of the side window. Especially not while plummeting toward a frozen lake. It’s like, bro, why create the hurricane. I figured that, as an alternative technique, I would just basically try to guess where we were going.

“How’s your speed?” my pilot’s lifeless form inquired.

The ground seemed to be making an actual screaming noise as it rushed up toward us. Hmm — maybe a little fast. I cut the throttle. Nugget kind of heaved and started falling at a different angle; more “straight down,” as the aeronautics manuals say. We were out over the lake. I had a sense of measureless whiteness lethally spread out below me. Either the landscape was baffled or I was. There were trees on the bank, but we were dropping too fast; I couldn’t relate them to anything. My references had gone sideways. At the last moment I pulled back on the stick.

There was a chiropractic skrrrk of skis entering snow. There was, simultaneously, a feeling of force transmitting itself upward into the plane. Nugget bounced, like a skipped stone, off the ice. We were tossed up and forward, maybe 15 feet into the air …

… and came down again, bounced again, came down again, and, unbelievably, slid to a stop.

“Guess what,” the reanimated form of my pilot said, popping up. “You just landed an airplane.” (...)

Anchorage, Alaska’s one real city. Fairbanks is a town, Juneau is an admin building with ideas. Anchorage is Tulsa, only poured into a little hollow in a celestially beautiful mountain range on the outer rim of the world.

When you’re there, it truly feels like you’re at the end of something. Like a last outpost. You’re in a coffee shop, you ordered cappuccino, you can see white mountains from the window, and on the other side of the mountains is wilderness that hasn’t changed since 1492.

That’s an exaggeration, but not as much of one as you might think.

by Brian Phillips, Grantland |  Read more:
Images: Brian Phillips and Jeff Schultz/AlaskaStock

Friday, January 24, 2014


[ed. Chinese New Year Jan. 31, 2014 - Year of the Horse]
via:

Diamonds Are Forever

Diamonds are supposed to be a girl's best friend. Now, they might also be her mother, father or grandmother.

Swiss company Algordanza takes cremated human remains and — under high heat and pressure that mimic conditions deep within the Earth — compresses them into diamonds.

Rinaldo Willy, the company's founder and CEO, says he came up with the idea a decade ago. Since then, his customer base has expanded to 24 countries.

Each year, the remains of between 800 and 900 people enter the facility. About three months later, they exit as diamonds, to be kept in a box or turned into jewelry.

Most of the stones come out blue, Willy says, because the human body contains trace amounts of boron, an element that may be involved in bone formation. Occasionally, though, a diamond pops out white, yellow or close to black – Willy's not sure why. Regardless, he says, "every diamond from each person is slightly different. It's always a unique diamond."

Most of the orders Algordanza receives come from relatives of the recently deceased, though some people make arrangements for themselves to become diamonds once they've died. Willy says about 25 percent of his customers are from Japan.

At between $5,000 and $22,000, the process costs as much as some funerals. The process and machinery involved are about the same as in a lab that makes synthetic diamonds from other carbon materials.

by Rae Ellen Bichell, NPR | Read more:
Image: Rinaldo Willy/Algordanza

What Jobs Will the Robots Take?

In the 19th century, new manufacturing technology replaced what was then skilled labor. Somebody writing about the future of innovation then might have said skilled labor is doomed. In the second half of the 20th century, however, software technology took the place of median-salaried office work, which economists like David Autor have called the "hollowing out" of the middle-skilled workforce.

The first wave showed that machines are better at assembling things. The second showed that machines are better at organizing things. Now data analytics and self-driving cars suggest they might be better at pattern-recognition and driving. So what are we better at?

If you go back to the two graphs in this piece to locate the safest industries and jobs, they're dominated by managers, health-care workers, and a super-category that encompasses education, media, and community service. One conclusion to draw from this is that humans are, and will always be, superior at working with, and caring for, other humans. In this light, automation doesn't make the world worse. Far from it: It creates new opportunities for human ingenuity.

But robots are already creeping into diagnostics and surgeries. Schools are already experimenting with software that replaces teaching hours. The fact that some industries have been safe from automation for the last three decades doesn't guarantee that they'll be safe for the next one.

by Derek Thompson, The Atlantic | Read more:
Image: Reuters

The Pleasure and Pain of Speed


How long is now? According to Google, not much less than 250 milliseconds. In 2008, the company presented a research report that examined ideal “latency” times for search results. It concluded “that a response time over 1 second may interrupt a user’s flow of thought.” The ideal latency for a search engine, said Google, was right at the quarter-second mark.

Which seems safe enough, because psychologists have long estimated that it takes us humans at least a quarter of a second to do much of anything. William James, wondering more than a century ago what is “the minimum amount of duration which we can distinctly feel,” had it pegged around 50 milliseconds. James cited the seminal work of Austrian physiologist Sigmund Exner, who observed that people shown sets of flashing sparks stopped being able to recognize them as distinct entities around 0.044 seconds. This “now” time increases as you go up the ladder of complexity.

To do more than barely register an image as a stimulus, to actually see something for what it is, the neuroscientist Christof Koch notes in The Quest for Consciousness, requires an average of a quarter of a second (when we are told what to look for, recognition drops to 150 milliseconds). Google’s target response time is just on Koch’s cusp of perceivable consciousness. From there, went the implication of Google’s report, lay a sloping temporal despond: More slow, less happy.

A quarter of a second, then, is a biological bright line limiting the speed at which we can experience life. And the life that we are creating for ourselves, with the help of technology, is rushing towards that line. The German sociologist Hartmut Rosa catalogues the increases in speed in his recent book, Social Acceleration: A New Theory of Modernity. In absolute terms, the speed of human movement from the pre-modern period to now has increased by a factor of 100. The speed of communications (a revolution, Rosa points out, that came on the heels of transport) rose by a factor of 10 million in the 20th century. Data transmission has soared by a factor of around 10 billion.

As life has sped up, we humans have not, at least in our core functioning: Your reaction to stimuli is no faster than your great-grandfather’s. What is changing is the amount of things the world can bring to us, in our perpetual now. But is our ever-quickening life an uncontrolled juggernaut, driven by a self-reinforcing cycle of commerce and innovation, and forcing us to cope with a new social and psychological condition? Or is it, instead, a reflection of our intrinsic desire for speed, a transformation of the external world into the rapid-fire stream of events that is closest to the way our consciousness perceives reality to begin with? (...)

Referring to the theorist Walter Benjamin, Rosa argues that the greater the number of “lived events per unit of time,” the less likely it is these are to transform into “experiences.” Benjamin argued that we tried to capture these moments with physical souvenirs, including photographs, which could later be accessed in an attempt to reinvoke memories. Of course, this process has accelerated, and the physical souvenir is now as quaint as the physical photograph. In Instagram, we have even developed a kind of souvenir of the present: An endless photography of moments suggests that we do not trust that they will actually become moments, as if we were photographing not to know that the event happened, but that it is happening.

by Tom Vanderbilt, Nautilus |  Read more:
Image: Chad Hagen