Friday, September 5, 2014

All Roads Lead to Willie Nelson

Today, Nelson is wearing a black hoodie, sunglasses and dirty New Balance sneakers, his semibraided hair tumbling out of a black baseball cap that says ZEKE'S SOCIAL CLUB. He steers his Chevy through the property with sharp, jagged turns, occasionally lighting up a burned-out joint in a cup holder. At one point, he stops the truck and singles out a stable: "I have a sick horse in there – we tried to isolate him from the herd a little bit," he says. "This is just old, rough country. A lot of room to drive around, a lot of privacy. I like Texas." (...)

He fires up his coffee maker, then reaches into a 1950s-style Hopalong Cassidy lunchbox packed with loose green pot and pulls out a tightly wrapped, torpedo-shaped joint. He takes a slow hit, holding it in as he looks at a mounted cow’s skull near the fireplace. Next, he produces a vaporizer pen. “Do you ever smoke these?” he asks. “It’s just pot – no smoke, no heat. You can smoke ’em on the plane!”

Nelson has been arrested at least four times on marijuana offenses. In Waco, Texas, in 1994, police found him asleep in his Mercedes on the side of the road, a joint on him, after a late poker game. In Louisiana in 2006, en route to former Texas Gov. Ann Richards’ funeral, Nelson’s bus was pulled over and police seized 1.5 pounds of weed and two ounces of hallucinogenic mushrooms. Four years later, he was driving back from Thanksgiving in California when the border patrol arrested him in Sierra Blanca, Texas. (“He feels great – he said he lost six ounces!” joked his harmonica player Mickey Raphael at the time.) “They mostly want autographs now,” Nelson says of the law. “They don’t really bother me anymore for the weed, because you can bust me now and I’ll pay my fine or go to jail, get out and burn one on the way home. They know they’re not stopping me.

"Weed is good for you," he says. "Jesus said one time that it's not what you put in your mouth, it's what comes out of your mouth. I saw the other day that [medical] weed is legal in Israel – there's an old-folks home there, and all these old men were walking around with bongs and shit. Fuck! They got it figured out before we did!"

Abruptly, he changes the subject. "Wanna ride around a bit?"

Nelson turned 81 in April. He can be forgetful – in concert, he sometimes needs to look over at Raphael, a veteran of his band for more than 30 years, to see if they’ve played “Georgia on My Mind” or some other song yet (“But I think that’s the dope more than anything,” says Raphael). His hearing is shot, and he no longer signs as many autographs as he used to. But he still practices tae kwon do and sleeps on the Honeysuckle Rose, his 40-foot-long biodiesel-fueled tour bus, while the rest of the band check into hotels. At one point on the ranch, when he stops to show off his favorite paint horse, Billy Boy, he easily hoists himself up to the second-highest fence rung, balancing about four feet off the ground. (...)

Unlike fellow giants like Hank Williams, Merle Haggard or Dolly Parton, who have plenty of obvious imitators, no one sounds like Nelson. He’s an uncanny vocal phraser: “The three masters of rubato in our age are Frank Sinatra, Ray Charles and Willie Nelson,” said the late producer Jerry Wexler. “The art of gliding over the meter and extending it until you think they’re going to miss the next actual musical demarcation – but they always arrive there, at bar one. It’s some kind of musical miracle.”

In a time when America is more divided than ever, Nelson could be the one thing that everybody agrees on. "The Hells Angels love him, and so do grandmothers," says Raphael. But in private, he can seem introverted and given to long silences. He will often describe his life in brief, purely factual terms, saying things like, "Oh, why does a guy write? I don't know. You get an idea, and you sit down, and you write it." Over the course of 30 interviews with his friends, family and band members, a lot of the same words come up – generous, charismatic, loyal and, as Keith Richards has said, "a bit of a mystery." "He's really good at throwing out a one-liner that will get you off of what you're talking about," says Shooter Jennings, who has known Nelson since he was a kid tagging along on the Highwaymen tours with his father, Waylon. "You're like, 'Fuck, Willie, answer the question!' There's a lot of exterior there. That's why you'll never quite fully get that picture."

"You never get to know him like you should, but you know there's more there than what you're seeing," says Loretta Lynn. "I know there's more there because of how he writes. He can't fool me!"

"He's a hard man to know," Johnny Cash wrote in 1997. "He keeps his inner thoughts for himself and his songs. He just doesn't talk much at all, in fact. When he does, what he says is usually very perceptive and precise. . . . He has a beautiful sense of irony and a true appreciation for the absurd. I really like him."

by Patrick Doyle, Rolling Stone |  Read more:
Image: LeAnn Mueller

The Dying Russians

Sometime in 1993, after several trips to Russia, I noticed something bizarre and disturbing: people kept dying. I was used to losing friends to AIDS in the United States, but this was different. People in Russia were dying suddenly and violently, and their own friends and colleagues did not find these deaths shocking. Upon arriving in Moscow I called a friend with whom I had become close over the course of a year. “Vadim is no more,” said his father, who picked up the phone. “He drowned.” I showed up for a meeting with a newspaper reporter to have the receptionist say, “But he is dead, don’t you know?” I didn’t. I’d seen the man a week earlier; he was thirty and apparently healthy. The receptionist seemed to think I was being dense. “A helicopter accident,” she finally said, in a tone that seemed to indicate I had no business being surprised.

The deaths kept piling up. People—men and women—were falling, or perhaps jumping, off trains and out of windows; asphyxiating in country houses with faulty wood stoves or in apartments with jammed front-door locks; getting hit by cars that sped through quiet courtyards or plowed down groups of people on a sidewalk; drowning as a result of diving drunk into a lake or ignoring sea-storm warnings or for no apparent reason; poisoning themselves with too much alcohol, counterfeit alcohol, alcohol substitutes, or drugs; and, finally, dropping dead at absurdly early ages from heart attacks and strokes.

Back in the United States after a trip to Russia, I cried on a friend’s shoulder. I was finding all this death not simply painful but impossible to process. “It’s not like there is a war on,” I said.

“But there is,” said my friend, a somewhat older and much wiser reporter than I. “This is what civil war actually looks like. It’s not when everybody starts running around with guns. It’s when everybody starts dying.”

My friend’s framing stood me in good stead for years. I realized the magazine stories I was writing then were the stories of destruction, casualties, survival, restoration, and the longing for peace. But useful as that way of thinking might be for a journalist, it cannot be employed by social scientists, who are still struggling to answer the question, Why are Russians dying in numbers, and at ages, and of causes never seen in any other country that is not, by any standard definition, at war?

In the seventeen years between 1992 and 2009, the Russian population declined by almost seven million people, or nearly 5 percent—a rate of loss unheard of in Europe since World War II. Moreover, much of this appears to be caused by rising mortality. By the mid-1990s, the average St. Petersburg man lived for seven fewer years than he did at the end of the Communist period; in Moscow, the dip was even greater, with death coming nearly eight years sooner.

by Masha Gessen, NYR |  Read more:
Image: Gueorgui Pinkhassov/Magnum Photos

Athleisure

The latest buzzword in fashion is “athleisure,” one of those made-up terms that are so ridiculously nonsensical as to be perfectly descriptive. That is, designers and retailers are obsessed with clothes that fit a somewhat broad category of being appropriate for either athletic or leisure pursuits, or both. We’re talking about anything from designer leggings of the Lululemon variety to cashmere sweats to layering pieces to absurdly fancy (and expensive) gym clothes. (...)

Looking at the number of companies that have since announced they are getting into the game, with clothes that are described as “après sport” or “gym-to-the-office,” it’s fairly clear that athleisure is becoming bigger than a trend. This has also been evidenced by the number of people who seem to think it appropriate to wear leggings or yoga pants practically anywhere, but I digress. There is clearly an overwhelming desire for leisure and sport clothes that are well designed and stylish, given the amount of interest this month in the introduction of Net-a-Sporter, a new channel from the online retailer Net-a-Porter that is dedicated to “sportswear that is as chic as everything else in your closet.” This includes both basic Nike tanks for $30 and luxury items like a Karl Lagerfeld sweatshirt for $235, or cashmere and linen track pants from The Elder Statesman for $600.

by Eric Wilson, InStyle |  Read more:
Image: Courtesy of Lou & Grey; Courtesy of Without Walls

Thursday, September 4, 2014


Saul Leiter, "Shopping"
via:

Marvin Gaye

Life Outside the Lab: The Ones Who Got Away

When Soroosh Shambayati left his organic-chemistry lab, he didn’t leave chemical synthesis behind. As a chemistry PhD turned investment banker, he started working in the derivatives market in the 1990s. The transactions involved arranging a complex series of trades in a precise order, which reminded him of synthesizing an organic compound, reaction by reaction.

As a graduate student, Shambayati had excelled at synthesis, just as he did at everything he turned his hand to. He was “other-worldly brilliant”, says his former adviser Stuart Schreiber. He juggled three distinct projects during his PhD, one in organic synthesis, one in theoretical physical chemistry and a third in biochemistry and immunology. He was also calm, thoughtful and well read: his bookshelf spans science philosophy, evolutionary biology and physics. Schreiber, a biochemist at the Broad Institute in Cambridge, Massachusetts, knew that if Shambayati wanted to become an academic scientist, he was sure to succeed. “It was very clear to me that he was going to become a star,” he says. But Shambayati chose the financial world — and excelled there instead: he is now chief executive at Guggenheim Investment Advisors (Suisse) in Geneva, Switzerland, a firm that manages billions of dollars for wealthy families and foundations.

Shambayati is among the hundreds of thousands of scientists who train in academia but then leave to follow a different career. According to the latest survey of doctorate recipients conducted by the US National Science Foundation, nearly one-fifth of employed people with science and engineering PhDs were no longer working in science in 2010. This is partly due to a lack of room at the top. In the United States, the number of PhDs entering the workforce has skyrocketed but the number of stable academic jobs has not. In 1973, nearly 90% of US PhDs working in academia held full-time faculty positions, compared with about 75% in 2010.

A common perception is that the weaker science students are forced out of a competitive field, leaving the brightest stars to secure the desirable academic positions. But as Shambayati's story shows — and as most mentors know — this is not the full picture: sometimes the scientists who move on are the ones with the most promise. Their motivations are diverse: some want more money, or more time with family; others are lured by opportunities elsewhere. To get a better sense of why talented scientists are leaving academia and how their training influences their lives, Nature contacted group leaders recognized for mentoring and asked: “Who was the one who got away?”

by Ewen Callaway, Nature |  Read more:
Image: Señor Salme

The New Luddites: Why Former Digital Prophets Are Turning Against Tech

Very few of us can be sure that our jobs will not, in the near future, be done by machines. We know about cars built by robots, cashpoints replacing bank tellers, ticket dispensers replacing train staff, self-service checkouts replacing supermarket staff, telephone operators replaced by “call trees”, and so on. But this is small stuff compared with what might happen next.

Nursing may be done by robots, delivery men replaced by drones, GPs replaced by artificially “intelligent” diagnosers and health-sensing skin patches, back-room grunt work in law offices done by clerical automatons and remote teaching conducted by computers. In fact, it is quite hard to think of a job that cannot be partly or fully automated. And technology is a classless wrecking ball – the old blue-collar jobs have been disappearing for years; now they are being followed by white-collar ones.

Ah, you may say, but human beings will always be better. This misses the point. It does not matter if the new machines never achieve full human-like consciousness, or even real intelligence; they can almost certainly achieve just enough to do your job – not as well as you, perhaps, but much, much more cheaply. To modernise John Ruskin, “There is hardly anything in the world that some robot cannot make a little worse and sell a little cheaper, and the people who consider price only are this robot’s lawful prey.” (...)

“Luddite” has, in the past few decades, been such a routine term of abuse for anybody questioning the march of the machines (I get it all the time) that most people assume that, like “fool”, “idiot” or “prat”, it can only ever be abusive. But, in truth, Luddism has always been proudly embraced by the few and, thanks to the present climate of machine mania and stagnating incomes, it is beginning to make a new kind of sense. From the angry Parisian taxi drivers who vandalised a car belonging to an Uber driver to a Luddite-sympathetic column by the Nobel laureate Paul Krugman in the New York Times, Luddism in practice and in theory is back on the streets. (...)

In 1992, Neil Postman, in his book Technopoly, rehabilitated the Luddites in response to the threat from computers: “The term ‘Luddite’ has come to mean an almost childish and certainly naive opposition to technology. But the historical Luddites were neither childish nor naive. They were people trying desperately to preserve whatever rights, privileges, laws and customs had given them justice in the older world-view.”

Underpinning such thoughts was the fear that there was a malign convergence – perhaps even a conspiracy – at work. In 1961, even President Eisenhower warned of the anti-democratic power of the “military-industrial complex”. In 1967 Lewis Mumford spoke presciently of the possibility of a “mega-machine” that would result from “the convergence of science, technics and political power”. Pynchon picked up the theme: “If our world survives, the next great challenge to watch out for will come – you heard it here first – when the curves of research and development in artificial intelligence, molecular biology and robotics all converge. Oboy.”

The possibility is with us still in Silicon Valley’s earnest faith in the Singularity – the moment, possibly to come in 2045, when we build our last machine, a super-intelligent computer that will solve all our problems and enslave or kill or save us. Such things are true only to the extent to which they are believed – and, in the Valley, this is believed, widely. (...)

Obviously, if neo-Luddism is conceived of in psychotic or apocalyptic terms, it is of no use to anybody and could prove very dangerous. But if it is conceived of as a critical engagement with technology, it could be useful and essential. So far, this critical engagement has been limited for two reasons. First, there is the belief – it is actually a superstition – in progress as an inevitable and benign outcome of free-market economics. Second, there is the extraordinary power of the technology companies to hypnotise us with their gadgets. Since 1997 the first belief has found justification in a management theory that bizarrely, upon closer examination, turns out to be the mirror image of Luddism. That was the year in which Clayton Christensen published The Innovator’s Dilemma, judged by the Economist to be one of the most important business books ever written. Christensen launched the craze for “disruption”. Many other books followed and many management courses were infected. Jill Lepore reported in the New Yorker in June that “this fall, the University of Southern California is opening a new program: ‘The degree is in disruption,’ the university announced.” And back at Forbes it is announced with glee that we have gone beyond disruptive innovation into a new phase of “devastating innovation”. (...)

Meanwhile, in the New York Times, Paul Krugman wrote a very neo-Luddite column that questioned the consoling belief that education would somehow solve the problem of the destruction of jobs by technology. “Today, however, a much darker picture of the effects of technology on labour is emerging. In this picture, highly educated workers are as likely as less educated workers to find themselves displaced and devalued, and pushing for more education may create as many problems as it solves.”

In other words – against all the education boosters from Tony Blair onwards – you can’t learn yourself into the future, because it is already owned by others, primarily the technocracy. But it is expert dissidents from within the technocracy who are more useful for moderate neo-Luddites. In 2000, Bill Joy, a co-founder of Sun Microsystems and a huge figure in computing history, broke ranks with an article for Wired entitled “Why the future doesn’t need us”. He saw that many of the dreams of Silicon Valley would either lead to, or deliberately include, termination of the human species. They still do – believers in the Singularity look forward to it as a moment when we will transcend our biological condition.

“Given the incredible power of these new technologies,” Joy wrote, “shouldn’t we be asking how we can best coexist with them? And if our own extinction is a likely, or even possible, outcome of our technological development, shouldn’t we proceed with great caution?”

by Bryan Appleyard, New Statesman |  Read more:
Image: Ikon Images

Wait Six Years to Buy Your Next Car

You’ll be able to buy a car that can drive itself under most conditions, with an option for override by a human driver, in 2020, according to the median estimate in a survey of 217 attendees of the 2014 Automated Vehicles Symposium. By 2030, the group estimated, you’ll be able to buy a car that is so fully automated it won’t even have the option for a human driver.

Though 2020 is just six years away, there remains a lot of debate over how the industry is going to get there. Most auto manufacturers are incrementalists, adding automated features such as adaptive cruise control, self-parking, and traffic-jam assist, two or three at a time. Google and some others in Silicon Valley, however, are more interested in producing highly or even fully automated cars as soon as possible.

The Society of Automotive Engineers and the National Highway Traffic Safety Administration have slightly different definitions for the levels of automated cars. But both basically agree that a “partially automated” car can take over some driving functions, such as speed and steering, but can’t actually drive itself; a “highly automated” car can drive itself under most conditions but has a human override; and a “fully automated” car can drive itself without a human override.

Don Norman, a human-factors engineer from UC San Diego, seemingly endorsed Google’s strategy in a keynote speech Wednesday that argued that even highly automated vehicles might be too dangerous for people to use. “I strongly favor full automation,” he said, but feared that highly automated vehicles might find themselves unable to handle some condition and give the human driver an inadequate amount of time to safely take over.

Airplanes have been highly automated for years, Norman pointed out, but if a plane that is 30,000 feet up suddenly decides it can’t handle current conditions and demands that a human take over, the human pilot has several minutes before the plane might crash. An auto driver in the same position might have only a fraction of a second.

As I indicated here July 16, another source of debate is whether cars should come with vehicle-to-vehicle (V2V) communications. This would allow, among other things, cars to operate in “platoons” of several to many cars each. Since the cars would be in constant contact with each other, they could close the gaps between them and greatly increase highway capacities.

by Randal O'Toole, Kurzweil |  Read more:
Image: Harbrick

Wednesday, September 3, 2014


Andreas Paradise, Lafkos, Greece, 2013
via:

Why Nerdy White Guys Who Love the Blues Are Obsessed With a Wisconsin Chair Factory

In the 2001 movie “Ghost World,” 18-year-old Enid picks up the arm on her turntable, drops the needle in the groove, and plays a song yet another time. She can’t get over the emotional power of bluesman Skip James’ 1931 recording of “Devil Got My Woman.” If you know anything about 78 records, it only makes sense that a nerdy 40-something 78 collector named Seymour would have introduced her to this tune. As played by Steve Buscemi, Seymour is an awkward, introverted sadsack based on the film’s director, Terry Zwigoff, who—along with his comic-artist pal, Robert Crumb—is an avid collector of 78s, a medium whose most haunting and rarest tracks are the blues songs recorded in the 1920s and ’30s.

Nearly a decade later, music critic and reporter Amanda Petrusich had the same intoxicating experience Enid (Thora Birch) did, listening to the very same song, although she got to hear “Devil Got My Woman” played on its original 78, courtesy of a real-life collector, who owns this prohibitively expensive shellac record pressed by Paramount. Only three or four copies are known to exist.

The gramophone, a type of phonograph that played 10-inch shellac discs at 78 rpm, was developed in the late 19th century. But it wasn’t until the 1910s and ’20s that the technology became more affordable and less cumbersome so that an average family could have one at home. The records, which could only play 2 to 3 minutes of sound per side, had their heyday in the ’20s and ’30s. They lost their cachet in the ’40s, when radio became the most popular format for music lovers. Then in the ’50s and ’60s, 78 records were phased out in favor of long-playing vinyl records.

Paramount blues records, in particular, seem to get under the skin of modern 78 collectors. From 1922 to 1932, the label, founded by a furniture company in suburban Wisconsin, discovered some of the most legendary blues icons of the 20th century—Charley Patton, Son House, Blind Blake, Ma Rainey, and Blind Lemon Jefferson—thanks to African American producer J. Mayo Williams, who recruited talent scouts to find these impoverished artists in the South, and then paid the artists a pittance to record for Paramount. These “race records,” meant exclusively for black audiences, were made in limited runs from a cheap, low-quality mixture of shellac that gives them a ghostly, crackling sound. Their rarity, the strange sounds they make, and the brilliance of these artists (who mostly remained obscure at the time) have led to a full-blown fervor in the 78 world. Even rock star Jack White, who founded Third Man Records, is obsessed with Paramount. Last year, White teamed up with Revenant Records’ Dean Blackwood to release a box set of vinyl albums featuring 800 known Paramount tracks. (Yours for a paltry $400.)

Petrusich, who spent years immersing herself in the world of 78 collectors as a reporter, got so obsessed with Paramount Records that she went diving into the murky waters of the Milwaukee River to look for discarded shellac. Now she’s released a book about her experience of getting swept up in this mania, Do Not Sell at Any Price: The Wild, Obsessive Hunt for the World’s Rarest 78 rpm Records. We talked to Petrusich about the characters she met, the important preservationist work they’re doing, and how white men ended up writing the narrative of a music genre created by impoverished African Americans. (...)

Collectors Weekly: Can you tell me a little bit about the history of Paramount Records?

Petrusich: Paramount is this incredible label that was born from a company called the Wisconsin Chair Company, which was making chairs, obviously. The company had started building phonograph cabinets to contain turntables, which they also were licensing. And they developed, like many furniture companies, an arm that was a record label so that they could make records to sell with the cabinets. This was before a time in which record stores existed. People bought their records at the furniture store, because they were things you needed to make your furniture work.

So the Wisconsin Chair Company, based in the Grafton-Port Washington area of Wisconsin, started the Paramount label. And they accidentally ended up recording whom I believe to be some of the most incredible performers in American musical history. Paramount started a “race record” series in the late 1920s after a few other labels had success doing that model, by which African American artists recorded music for African American audiences. Through a complex series of talent scouts, they would bring artists mostly from the Southeast up to Wisconsin to record, which in and of itself was just insane and miraculous. These are Mississippi bluesmen, being brought to this white rural town in Wisconsin, and you can’t imagine how foreign it must have been to them to see that landscape. Sometimes the performers would record for Paramount in Chicago, but later in Paramount’s history, the company built a studio right in Grafton, and it was a notoriously bad studio. It had shoddy, handmade equipment, and then the records that Paramount was pressing were really cheap. It was a very bad mixture of shellac, and Paramount records are infamous for having a lot of surface noise.

But as I said, they captured some of the best performers in American history, folks like Skip James, Charley Patton, Blind Lemon Jefferson, and Geeshie Wiley—all these really incredible singers. At the time, Paramount didn’t know what it was doing. It hasn’t been until now that people are like, “Oh my God, this label rewrote American history.” I don’t think Paramount was remotely cognizant of the significance of the work that was being recorded in their studio.

by Lisa Hix, Collectors Weekly |  Read more:
Image: Robert Crumb

Creativity Creep

Every culture elects some central virtues, and creativity is one of ours. In fact, right now, we’re living through a creativity boom. Few qualities are more sought after, few skills more envied. Everyone wants to be more creative—how else, we think, can we become fully realized people?

Creativity is now a literary genre unto itself: every year, more and more creativity books promise to teach creativity to the uncreative. A tower of them has risen on my desk—Ed Catmull and Amy Wallace’s “Creativity, Inc.”; Philippe Petit’s “Creativity: The Perfect Crime”—each aiming to “unleash,” “unblock,” or “start the flow” of creativity at home, in the arts, or at work. Work-based creativity, especially, is a growth area. In “Creativity on Demand,” one of the business-minded books, the creativity guru Michael Gelb reports on a 2010 survey conducted by I.B.M.’s Institute for Business Value, which asked fifteen hundred chief executives what they valued in their employees. “Although ‘execution’ and ‘engagement’ continue to be highly valued,” Gelb reports, “the CEOs had a new number-one priority: creativity,” which is now seen as “the key to successful leadership in an increasingly complex world.” Meanwhile, at the other end of the spectrum, Julia Cameron’s best-selling “The Artist’s Way” proposes creativity as a path to personal, even spiritual fulfillment: “The heart of creativity is an experience of the mystical union,” Cameron writes. “The heart of the mystical union is an experience of creativity.” It’s a measure of creativity’s appeal that we look to it to solve such a wide range of problems. Creativity has become, for many of us, the missing piece in a life that seems routinized, claustrophobic, and frivolous.

How did we come to care so much about creativity? The language surrounding it, of unleashing, unlocking, awakening, developing, flowing, and so on, makes it sound like an organic and primordial part of ourselves which we must set free—something with which it’s natural to be preoccupied. But it wasn’t always so; people didn’t always care so much about, or even think in terms of, creativity. In the ancient world, good ideas were thought to come from the gods, or, at any rate, from outside of the self. During the Enlightenment, rationality was the guiding principle, and philosophers sought out procedures for thinking, such as the scientific method, that might result in new knowledge. People back then talked about “imagination,” but their idea of it was less exalted than ours. They saw imagination as a kind of mental scratch pad: a system for calling facts and images to the mind’s eye and for comparing and making connections between them. They didn’t think of the imagination as “creative.” In fact, they saw it as a poor substitute for reality; Hobbes called it “decaying sense.”

It was Romanticism, in the late eighteenth and early nineteenth centuries, which took the imagination and elevated it, giving us the “creative imagination.” (That’s the title of a classic intellectual history of this period, by the literary scholar James Engell.) People like Samuel Taylor Coleridge argued that we don’t just store things in our imaginations; we transform them. Coleridge made a useful distinction, largely lost today, between two kinds of imagining. All of us, he thought, have a workaday imagination, which we use to recall memories, make plans, and solve problems; he called this practical imagination “fancy.” But we also have a nobler kind of imagination, which operates, as Engell puts it, like “a human reflex of God’s creative energy.” The first kind of imagination understands the world; the second kind cares about it and brings it to life. In “The Prelude,” Wordsworth describes this kind of imagination as “an auxiliary light” that changes everything it illuminates:

An auxiliary light
Came from my mind which on the setting sun
Bestowed new splendor, the melodious birds,
The gentle breezes, fountains that ran on,
Murmuring so sweetly in themselves, obeyed
A like dominion; and the midnight storm
Grew darker in the presence of my eye.

This watchful, inner kind of creativity is not about making things but about experiencing life in a creative way; it’s a way of asserting your own presence amidst the much larger world of nature, and of finding significance in that wider world. By contrast, our current sense of creativity is almost entirely bound up with the making of stuff. If you have a creative imagination but don’t make anything, we regard that as a problem—we say that you’re “blocked.”

How did creativity transform from a way of being to a way of doing? The answer, essentially, is that it became a scientific subject, rather than a philosophical one.

by Joshua Rothman, New Yorker |  Read more:
Image: Boyoun Kim

Tuesday, September 2, 2014

Feist


[ed. How many people know that Feist and Peaches used to be roommates? I know, blows your mind. Here’s some vintage Feist, just because I love her... 1234 (and the Sesame Street version) and So Sorry. You can check out Peaches kickin’ it with Iggy Pop in the next post.]

The Taming of the Stooge


"I would characterize it sort of like a powerful interest group within a political party at this point. It used to be the entire political party."
—Iggy Pop explains his current relationship with his penis.

h/t The Awl 

[ed. Thanks to whoever pulled this up from the archives today, I'd forgotten I posted it (like so many other things). I need to get back there once in a while (and you do, too). ps. Peaches cracks me up: Fuck the pain away - here and here.]

Monday, September 1, 2014


Henri-Georges Clouzot, La Prisonnière (1968)
via:

Hoda Afshar, Westoxicated #7, 2013
via:

Operation 'Washtub'

Fearing a Russian invasion and occupation of Alaska, the U.S. government in the early Cold War years recruited and trained fishermen, bush pilots, trappers and other private citizens across Alaska for a covert network to feed wartime intelligence to the military, newly declassified Air Force and FBI documents show.

Invasion of Alaska? Yes. It seemed like a real possibility in 1950.

"The military believes that it would be an airborne invasion involving bombing and the dropping of paratroopers," one FBI memo said. The most likely targets were thought to be Nome, Fairbanks, Anchorage and Seward.

So FBI director J. Edgar Hoover teamed up on a highly classified project, code-named "Washtub," with the newly created Air Force Office of Special Investigations, headed by Hoover protege and former FBI official Joseph F. Carroll.

The secret plan was to have citizen-agents in key locations in Alaska ready to hide from the invaders of what was then only a U.S. territory. The citizen-agents would find their way to survival caches of food, cold-weather gear, message-coding material and radios. In hiding they would transmit word of enemy movements.

This was not civil defense of the sort that became common later in the Cold War as Americans built their own bomb shelters. This was an extraordinary enlistment of civilians as intelligence operatives on U.S. soil. (...)

"Washtub" was known inside the government by several other codenames, including Corpuscle, Stigmatic and Catboat, according to an official Air Force history of the OSI, which called it one of OSI's "most extensive and long-running Cold War projects." The FBI had its own code word for the project: STAGE.

"Washtub" had two phases.

The first and more urgent was the stay-behind agent program. The second was a parallel effort to create a standby pool of civilian operatives in Alaska trained to clandestinely arrange for the evacuation of downed military air crews in danger of being captured by Soviet forces. This "evasion and escape" plan was coordinated with the CIA.

Among those listed as a stay-behind agent was Dyton Abb Gilliland of Cooper Landing, a community on the Kenai Peninsula south of Anchorage. A well-known bush pilot, Gilliland died in a plane crash on Montague Island in Prince William Sound in May 1955 at age 45. FBI records say he spent 12 days in Washington D.C., in June-July 1951 undergoing a range of specialized training, including in the use of parachutes.

The agents also got extensive training in coding and decoding messages, but this apparently did not always go well. Learning these techniques was "an almost impossible task for backwoodsmen to master in 15 hours of training," one document said. Details in the document were blacked out.

Many agent names in the OSI and FBI documents also were removed before being declassified.

None of the indigenous population was included. The program founders believed that agents from the "Eskimo, Indian and Aleut groups in the Territory should be avoided in view of their propensities to drink to excess and their fundamental indifference to constituted governments and political philosophies. It is pointed out that their prime concern is with survival and their allegiance would easily shift to any power in control."

Recruiters pitched patriotism and were to offer retainer fees of up to $3,000 a year (nearly $30,000 in 2014 dollars). That sum was to be doubled "after an invasion has commenced," according to one planning document. The records do not say how much was actually paid during the course of the program.

by Robert Burns, AP |  Read more:
Image: J. Edgar Hoover, AP

Foucault and Social Media: Life in a Virtual Panopticon


You start the day bleary-eyed and anxious. You stayed up late last night working on a post for your blog, gathering facts and memes from around the web and weaving them into an incisive whole. Has it produced a spike in the stats? You sign in on your iPhone as you brew the coffee. But it’s too early to slip into the professional headspace – you decide that you don’t want to know. Someone has messaged you on Facebook, so you check that instead. Japanese manga mashup! Killer breaks off the coast of Lombok. Lady Gaga is a man and we have photoshopped evidence to prove it! A friend will appreciate that one, so you share it with her directly. Perhaps not something that you’d want to share widely. Two new contact requests on LinkedIn. Your profile needs updating. Should you include details about the design work you completed for the local event the week before? You are not sure. You are building your profile as a graphic artist and looking for quality clients. Perhaps this is a part of your person that you will let incubate for a while longer.

You jump on HootSuite and start sharing targeted content: Facebook for friends, tweets for professional contacts. The day has barely started and already you are split into half a dozen pieces.

How did we ever get by without social media? In under a decade, free online services like Facebook, Twitter, and LinkedIn have utterly transformed how we work, play, and communicate. For hundreds of millions of people, sharing content across a range of social media services is a familiar part of life. Yet little is known about how social media is impacting us on a psychological level. A wealth of commentators are exploring how social media is refiguring forms of economic activity, reshaping our institutions, and transforming our social and organizational practices. We are still learning about how social media impacts on our sense of personal identity.

The French philosopher Michel Foucault (1926-1984) has a set of insights that can help clarify how social media affects us on a psychological level. Foucault died before the advent of the internet, yet his studies of social conditioning and identity formation in relation to power are applicable to life online. Seen from a Foucaultian perspective, social media is more than a vehicle for exchanging information. Social media is a vehicle for identity-formation. Social media involves ‘subjectivation’.

A Foucaultian perspective on social media targets the mechanism that makes it tick: sharing. Sharing is basic to social media. Sharing content is not just a neutral exchange of information, however. Mostly, when we share content on social media services, we do it transparently, visibly, that is, in the presence of a crowd. The act of sharing is a performance, to an extent – it is a performative act, an act that does something in the world, as J.L. Austin would say. This is important. The performative aspect of sharing shapes the logic and experience of the act itself.

There is a self-reflexive structure to sharing content on Facebook or Twitter. Just as actors on stage know that they are being watched by the audience and tailor their behaviour to find the best effect, effective use of social media implies selecting and framing content with a view to pleasing and/or impressing a certain crowd. We may not intend to do this but it is essential to doing it well. Unless we are sharing anonymously (and the radical end of internet culture, Anonymous, favours anonymity), all the content we share is tagged with an existential marker:

‘I sent this – it is part of my work. You shall know me by my works’.

Foucault understood how being made constantly visible impacts on us psychologically. Foucault was fascinated by Jeremy Bentham’s model of the ideal prison, the Panopticon, which has been incorporated in the architecture of prisons, schools, hospitals, workplaces, and urban spaces since Bentham designed it in the eighteenth century. In Bentham’s design, the Panopticon comprises a ring of cells surrounding a central guard tower. The prisoners in the cells are perpetually exposed to the gaze of the guards in the tower, yet since they cannot themselves see into the tower, they are never certain whether or not they are being watched.

Bentham’s Panopticon, Foucault argues, functions to make prisoners take responsibility for regulating their behaviour. Assuming that they care about the implications of bad behaviour, prisoners will act in the manner prescribed by the institution at all times on the chance that they are being watched. In time, as the sense of being watched gets under their skin, prisoners come to regulate their behaviour as if they were in a Panopticon all times, even after they have been released from the institution.

This, Foucault claims, is ‘the major effect of the Panopticon: to induce in the inmate a state of conscious and permanent visibility that assures the automatic functioning of power’ (Foucault, Discipline and Punish, 201).

‘Conscious and permanent visibility’… Apparently this is what Mark Zuckerberg thinks social media is all about. By making our actions and shares visible to a crowd, social media exposes us to a kind of virtual Panopticon. This is not just because our activities are monitored and recorded by the social media service for the purposes of producing market analysis or generating targeted advertising. For the most part, we can and do ignore this kind of data harvesting. The surveillance that directly affects us and impacts on our behaviour comes from the people with whom we share.

There are no guards and no prisoners in Facebook’s virtual Panopticon. We are both guards and prisoners, watching and implicitly judging one another as we share content.

In sharing online, we are playing to a crowd.

by Tim Rayner, Philosophy For Change |  Read more:
Image: Michel Foucault, uncredited