Wednesday, February 25, 2015

Kurt Vonnegut on the Shapes of Stories


[ed. Kurt Vonnegut: A Man Without a Country]

“The truth is, we know so little about life, we don’t really know what the good news is and what the bad news is.”

"Now let me give you a marketing tip. The people who can afford to buy books and magazines and go to the movies don’t like to hear about people who are poor or sick, so start your story up here [indicates top of the G-I axis]. You will see this story over and over again. People love it, and it is not copyrighted. The story is ‘Man in Hole,’ but the story needn’t be about a man or a hole. It’s: somebody gets into trouble, gets out of it again [draws line A]. It is not accidental that the line ends up higher than where it began. This is encouraging to readers. (...)

Now there’s a Franz Kafka story [begins line D toward bottom of G-I axis]. A young man is rather unattractive and not very personable. He has disagreeable relatives and has had a lot of jobs with no chance of promotion. He doesn’t get paid enough to take his girl dancing or to go to the beer hall to have a beer with a friend. One morning he wakes up, it’s time to go to work again, and he has turned into a cockroach [draws line downward and then infinity symbol]. It’s a pessimistic story. (...)

The question is, does this system I’ve devised help us in the evaluation of literature? Perhaps a real masterpiece cannot be crucified on a cross of this design. How about Hamlet? It’s a pretty good piece of work I’d say. Is anybody going to argue that it isn’t? I don’t have to draw a new line, because Hamlet’s situation is the same as Cinderella’s, except that the sexes are reversed.

His father has just died. He’s despondent. And right away his mother went and married his uncle, who’s a bastard. So Hamlet is going along on the same level as Cinderella when his friend Horatio comes up to him and says, ‘Hamlet, listen, there’s this thing up in the parapet, I think maybe you’d better talk to it. It’s your dad.’ So Hamlet goes up and talks to this, you know, fairly substantial apparition there. And this thing says, ‘I’m your father, I was murdered, you gotta avenge me, it was your uncle did it, here’s how.’

Well, was this good news or bad news? To this day we don’t know if that ghost was really Hamlet’s father. If you have messed around with Ouija boards, you know there are malicious spirits floating around, liable to tell you anything, and you shouldn’t believe them. Madame Blavatsky, who knew more about the spirit world than anybody else, said you are a fool to take any apparition seriously, because they are often malicious and they are frequently the souls of people who were murdered, were suicides, or were terribly cheated in life in one way or another, and they are out for revenge.

So we don’t know whether this thing was really Hamlet’s father or if it was good news or bad news. And neither does Hamlet. But he says okay, I got a way to check this out. I’ll hire actors to act out the way the ghost said my father was murdered by my uncle, and I’ll put on this show and see what my uncle makes of it. So he puts on this show. And it’s not like Perry Mason. His uncle doesn’t go crazy and say, ‘I-I-you got me, you got me, I did it, I did it.’ It flops. Neither good news nor bad news. After this flop Hamlet ends up talking with his mother when the drapes move, so he thinks his uncle is back there and he says, ‘All right, I am so sick of being so damn indecisive,’ and he sticks his rapier through the drapery. Well, who falls out? This windbag, Polonius. This Rush Limbaugh. And Shakespeare regards him as a fool and quite disposable.

You know, dumb parents think that the advice that Polonius gave to his kids when they were going away was what parents should always tell their kids, and it’s the dumbest possible advice, and Shakespeare even thought it was hilarious.

‘Neither a borrower nor a lender be.’ But what else is life but endless lending and borrowing, give and take?

‘This above all, to thine own self be true.’ Be an egomaniac!

Neither good news nor bad news. Hamlet didn’t get arrested. He’s a prince. He can kill anybody he wants. So he goes along, and finally he gets in a duel, and he’s killed. Well, did he go to heaven or did he go to hell? Quite a difference. Cinderella or Kafka’s cockroach? I don’t think Shakespeare believed in a heaven or hell any more than I do. And so we don’t know whether it’s good news or bad news.

I have just demonstrated to you that Shakespeare was as poor a storyteller as any Arapaho.

But there’s a reason we recognize Hamlet as a masterpiece: it’s that Shakespeare told us the truth, and people so rarely tell us the truth in this rise and fall here [indicates blackboard]. The truth is, we know so little about life, we don’t really know what the good news is and what the bad news is.

And if I die — God forbid — I would like to go to heaven to ask somebody in charge up there, ‘Hey, what was the good news and what was the bad news?’"

by Maria Popova, Brain Pickings |  Read more:
Image: Kurt Vonnegut

How to Avoid Rape in Prison


The Marshall Project put together this short film where former inmates explain how to avoid being sexually assaulted while incarcerated.

Cities Don’t ♥ Us

[ed. See also: A Last Ditch Effort to Preserve the Heart of the Central District, and Fixing Pioneer Square.]

Each day in New York an army of street-sweeping trucks fans across the boroughs, purportedly inhaling the litter and waste that parks itself in curbside crevices along residential blocks. (Commercial districts are typically cleaned overnight.) If you’ve ever seen one of these massive contraptions you’ve probably wondered how much they truly clean—rather than just disperse the dirt and debris to another location for the next day’s job—and whether they do more environmental harm than good. And if you happen to be a car-owning New Yorker, the sound of a street sweeper even one or two blocks away can easily trigger a chain of panicked questions starting with “What time is it?” followed by “What day is it?” before landing on “What side of the street am I parked on?”

Alternate-side parking is a part of life in New York City. Both for New Yorkers and the city they live in, which relies on parking violation revenue to provide city services. Last year alone the city raked in $70 million from 1.2 million alternate-side parking violations at $55 a pop. Is it any wonder the Dept. of Sanitation fought a recent proposal that would allow car owners to re-park as soon as the street sweeper finished its work—rather than waste countless hours idling just to honor the official parking rules?

New York’s parking wars embody the modern city’s twisted relationship with its dwellers. Officials know street sweeping is largely ineffective and environmentally harmful. They know the fine bears no relation to the underlying offense (spare me the “social cost” argument) and targets working people living in the low-income outer-borough neighborhoods where parking is tight and cars are essential since mass transit is less available. They know that even if everyone earnestly tries to follow the law, there aren’t nearly enough spots for everyone during alternate-side parking times. They know the average urbanite has zero sympathy (disdain is more like it) for drivers even though the billions the city rakes in each year from bridge and tunnel tolls subsidize their train and bus commutes.

The suggestion that alternate-side parking fines exist for any reason other than revenue is vulgar and pretentious. Yet no official, elected or otherwise, will ever come out and admit alternate-side parking rules have been engineered to extract what amounts to a backdoor tax. Doing so would undermine the movement heralding the smart city as humanity’s redeemer. Cities, we are told again and again by the sustainability expert, are our destiny. (...)

Yet this is the crux of urbanism’s shell game. I don’t believe white urbanites are an inherently favored species. Stock images of attractive white couples may adorn the latest luxury condo, but not because urbanism has a special place in its heart for them. It’s economics, pure and simple. This, of course, contradicts the prevailing propaganda pumping out of government public relations offices across the country. The modern city cares about our health and wellness. It wants to be livable, sustainable, and walkable—vibrant. It wants to provide us with amenities and opportunities to experience culture, food, and community. According to the urbanist, the city wishes us to believe it can be both affordable and upscale. That it is invested in our children’s education, our safety, our careers. It is all things to all people. It has a heart.

And in fact, the city will sometimes tease us. The train you desperately need will arrive on time. There will be parking on the block, an open table at a new restaurant. Your favorite artist will be playing in the park, for free. In that moment, you will believe that things could not be any better than they are. You will feel the soothing satisfaction of having made the right choice in life. You will forget the infinite frustrations and heartaches you endure. You will rationalize your overpriced micro-dwelling as a social good. You will believe the life the city offers has been created to suit your unique and discriminating needs and tastes. And you will be wrong.

Here’s what really happens. First, a city hires a think tank to come up with a revitalization plan (pdf). That plan typically entails attracting young people with skills and education and retaining slightly older people with money and small children. Case in point: Washington, DC, in the early 2000s. As I’ve written elsewhere (pdf), in 2001 Brookings Institution economist Alice Rivlin published a report entitled “Envisioning a Future Washington” in which she mapped a revitalization plan that became a blueprint for gentrification. Urban planning and design firms are then hired to figure out how to make a city more desirable to these people. They conduct surveys, mine the data, and issue reports that award these people a flattering label like “creative class” and pronounce what they are looking for and how cities can attract/retain them. What we see happening in cities across America is the result: an unmitigated backlash against the era of sprawl and its accomplices—strip malls, subdivisions, and big-box chains—nothing more, nothing less.

Indeed, the true genius of urbanism is that the marketing campaigns promoting it have seized upon a search for meaning that traditional institutions can no longer satisfy, promising, if only implicitly, to fill the gap. Just look at the shimmering, stylized artist renditions accompanying every new upscale urban development. Rays of light from the heavens above shower the newly paved sidewalks, reflecting boundlessly off the glass buildings and brightening the lives of the multi-hued populace carrying fresh fruits and vegetables in their canvas tote bags.

Urbanism has become the secular religion of choice practiced with the enthusiasm of a Pentecostal tent revival, and the amenitized high-rise the new house of worship. It, after all, promises to fulfill or at least facilitate all of one’s needs while on Earth—with everything from rooftop community gathering space to sunlit Saturday morning yoga classes in the atrium.

This isn’t a new idea. In his celebrated and remarkably enduring 1949 essay, “Here is New York” (pdf), E.B. White addressed the spiritual life that a city offers:
Many people who have no real independence of spirit depend on the city’s tremendous variety and sources of excitement for spiritual sustenance and maintenance of morale … I think that although many persons are here from some excess of spirit (which caused them to break away from their small town), some, too, are here from a deficiency of spirit, who find in New York a protection, or an easy substitution.
White’s essay isolates the beauty of New York: It is a love letter. By all means, I invite you to be taken with it; I am. His city offers the range of rewards—sights and sounds and things to do. I marvel at the way White’s city operates, the way it manages to instill order and achieve artistry. In White’s capable hands, cities are humanity’s premier expression of civilization.

Urbanism, as well, has deftly aligned itself with human progress. It trumpets terms like “smart growth,” “sustainability,” “resilience,” and “scalability” to demonstrate both its concern with the quality of our lives and its progressive street cred. It champions urban “green space” as the solution for everything from obesity to asthma. But green spaces aren’t even parks. Often people can only use them during prescribed times and in particular ways—concerts, film screenings, seasonal outdoor markets. Moreover, they’re usually owned by a developer who likely built them as a concession for a sweet deal on the land. Yet this is what we celebrate? A paltry scrap of flora? Which just begs a question Thomas Frank posed in his Baffler essay skewering the “vibrancy” movement so many cities have staked their futures on:
… [W]hy is it any better to pander to the “creative class” than it is to pander to the traditional business class? Yes, one strategy uses “incentives” and tax cuts to get companies to move from one state to another, while the other advises us to emphasize music festivals and art galleries when we make our appeal to that exalted cohort. But neither approach imagines a future arising from something other than government abasing itself before the wealthy.
To be fair, in as much as cities can be said to have a consciousness, they fully comprehend their vulnerability. Urban planners know perfectly well that if the delicate balance between safety and prosperity is lost, then disinvestment and abandonment can strike. But they have also learned that people can be manipulated to identify with the city and thereby tolerate just about anything it dishes.

by Dax-Devlon Ross, TMN | Read more:
Image: Steven Guerrisi

Tuesday, February 24, 2015

Whistlin' Dixie

Driving south from the North, we tried to spot exactly where the real South begins. We looked for the South in hand-scrawled signs on the roadside advertising ‘Boil Peanut’, in one-room corrugated tin Baptist churches that are little more than holy sheds, in the crumbling plantation homes with their rose gardens and secrets. In the real South, we thought, ships ought to turn to riverboats, cold Puritanism to swampy hellfire, coarse industrialists with a passion for hotels and steel to the genteel ease of the cotton planter.

Most of what we believe about the South, wrote W.J. Cash in the 1930s, exists in our imagination. But, he wrote, we shouldn’t take this to mean that the South is therefore unreal. The real South, wrote Cash in The Mind of the South, exists in unreality. It is the tendency toward unreality, toward romanticism, toward escape, that defines the mind of the South.

The unreality that shaped the South took many forms. In the South, wrote Cash (himself a Southern man), is “a mood in which the mind yields almost perforce to drift and in which the imagination holds unchecked sway, a mood in which nothing any more seems improbable save the puny inadequateness of fact, nothing incredible save the bareness of truth.” Most people still believe, wrote Cash — but no more than Southerners themselves — in a South built by European aristocrats who erected castles from scrub. This imaginary South, wrote Cash, was “a sort of stagepiece out of the eighteenth century,” where gentlemen planters and exquisite ladies in farthingales spoke softly on the steps of their stately mansions. But well-adjusted men of position and power, he wrote, “do not embark on frail ships for a dismal frontier… The laborer, faced with starvation; the debtor, anxious to get out of jail; the apprentice, eager for a fling at adventure; the small landowner and shopkeeper, faced with bankruptcy and hopeful of a fortune in tobacco; the neurotic, haunted by failure and despair” — only these would go.

The dominant trait of the mind of the South, wrote Cash, was an intense individualism — an individualism the likes of which the world hadn’t seen since Renaissance days. In the backcountry, the Southern man’s ambitions were unbounded. For each who stood on his own little property, his individual will was imperial law. In the South, wrote Cash, wealth and rank were not so important as they were in older societies. “Great personal courage, unusual physical powers, the ability to drink a quart of whiskey or to lose one’s whole capital on the turn of a card without the quiver of a muscle — these are at least as important as possessions, and infinitely more important than heraldic crests.”

The average white Southern man (for this man was Cash’s main focus) was a romantic, but it was a romance bordering on bedlam. Any ordinary man tends to be a hedonist and a romantic, but take that man away from Old World traditions, wrote Cash, and stick him in the frontier wilds. Take away the skepticism and realism necessary for ambition and he falls back on imagination. His world becomes rooted in the fantastic, the unbelievable, and his emotions lie close to the surface. Life on the Southern frontier was harsh but free — it could make a man’s ego feel large.

The Southern landscape, too, had an unreal quality, “itself,” wrote Cash, “a sort of cosmic conspiracy against reality in favor of romance.” In this country of “extravagant color, of proliferating foliage and bloom, of flooding yellow sunlight, and, above all, perhaps, of haze,” the “pale blue fogs [that] hang above the valleys in the morning,” the outlines of reality blur. The atmosphere smokes, “rendering every object vague and problematical.” A soft languor creeps through the blood and into the brain, wrote Cash, and the mood of the South becomes like a drunken reverie, where facts drift far away. “But I must tell you also that the sequel to this mood,” wrote Cash, “is invariably a thunderstorm. For days — for weeks, it may be — the land lies thus in reverie and then …”

The romanticism of the South, wrote W.J. Cash, was one that tended toward violence. It was a violence the Southern man often turned toward himself as much as those around him. The reverie turns to sadness and the sadness to a sense of foreboding and the foreboding to despair. Nerves start to wilt under the terrifying sun, questions arise that have no answers, and “even the soundest grow a bit neurotic.” When the rains break, as they will, and the South becomes a land of fury, the descent into unreality takes hold. Pleasure becomes sin, and all are stripped naked before the terror of truth.

by Stefany Anne Golberg, The Smart Set |  Read more:
Image: uncredited

The Drug Technology Boom

Cannabis is joining the coca leaf in the ranks of drugs improved by technology.

Concentrates are made from the entire plant, while the smokable flower is only a part. To sell flowers, plants must be cut, trimmed, and dried. The process takes weeks and a lot of manpower. For refined product, the entire plant can be processed without having to wait. Solvents turn live plants into BHO (butane hash oil) on the spot. Those who produce concentrates have a sellable product within 24 hours, whereas it would take weeks to properly prepare buds.

Dabs are refined marijuana with highly concentrated doses of THC, which explains why, when I tried it, I could feel my thoughts. About 50 percent of the marijuana flower is vegetative plant matter, which gets weeded out during clarification. What’s left are the cannabinoids and terpenes (the good stuff with the medical applications and flavor).

Marijuana is oil- and fat-soluble, which is why edibles (like brownies) work. When the flowers are simmered in butter, the butter takes on the intoxicating components of the plant, which can then be strained out; boiling it in water would just get it hot and wet. Once ingested (however they’re ingested) and absorbed into the bloodstream, fat-soluble drugs collect in fatty tissue like the brain.

Whether a substance is fat- or water-soluble depends on polarity. Fat-soluble materials are nonpolar, meaning they lack electrical charge. Nonpolar molecules are held together by covalent bonds, in which electrons are shared evenly between connected atoms. Because cannabis is fat-soluble, concentrates are made with oil-based solvents (often butane) to suspend the cannabinoids. This separates the essential oils from the plant matter.

For those of us who weren’t in AP Chemistry, a solvent is a material that dissolves another, chemically different material (the solute). There are water-based solvent-extraction techniques, but they are less popular.

“Blasting” (extracting concentrated THC from the plant) is complicated, dangerous, and easy to find on YouTube.

The solvent is introduced to the solute (dank, loud, nugs, and other stupid names), resulting in a yellow liquid, as long as ventilation has been good enough to prevent suffocation by fire.

The product of the last chemical reaction becomes the reactant in the next one. The liquid is heated to evaporate the remaining harsh chemicals until it resembles delicious crème brûlée. The material is then moved into a vacuum chamber where the pump bubbles and fluffs it some more. (The strict need for a vacuum chamber is a point of some Reddit dispute, but it remains the favored process.)

When the fluff looks like gooey home insulation, it returns to the heat source to take its final form. Once isolated, the beneficial compounds can take a variety of structures. A few variables determine the consistency, but temperature is the big one.

“Shatter,” as it’s referred to in this state, looks like the meth seen in Breaking Bad, but yellow. It’s solid yet fragile and breaks apart easily. Shatter is subjected to an additional process to extract the lipids, fats, waxes, and terpenes. It’s the purest of the refined products. Good shatter can reach over 80 percent THC.

“Wax” looks like a mother extracted it from a 14-year-old boy’s ear. It still has the terpenes, which makes it more flavorful but less potent than shatter—usually 70 to 80 percent. Of the three most popular concentrates, it’s definitely the one I’d most like to get under a microscope.

“Honey oil” looks a bit like shatter and feels a lot like maple syrup. It’s the least refined and most flavorful of the three. (...)

Since concentrates are better vaped than smoked, their growing popularity is changing the apparatuses used to consume them. The first vaporizer I ever used was a direct-inhale box. If you tried to get someone stoned with that today, they’d laugh at you. They were analog, complicated, and hard to use. I never once felt that I got stoned, and oh lord did I try. (...)

The two technologies are evolving alongside one another. Since that pioneering stoner first invented the apple pipe, potheads have been seeking to optimize consumption.

“Some people lack the skills to become accomplished stoners,” Greene laughed. “If you hold the lighter the wrong way, you burn your thumb; everyone can push a button.”

I don’t know if I agree. Ritual is part of addiction. Don’t get me wrong: I think pot is great and everyone should get stoned, but I am, without a doubt, addicted. My dad hasn’t smoked since the ’80s, but he still prides himself on rolling great joints (and put to the test, he will). Similarly, I get a rush out of tearing open a new bag of Agent Orange (or Green Crack or another unfortunately named strain), ripping apart the bud, and packing the bowl. The texture of flowers crumbling between my fingers is an intoxicating part of the experience.

With my vape, I’m beginning to appreciate a new set of rituals. I play with the temperature settings, pack the oven with a particular gold pencil, and meticulously scrape the lid clean. These aren’t the same rituals that make me feel like I’m in my childhood bathroom blowing smoke into the ass end of a fan while listening to Belle and Sebastian’s Tigermilk, but there’s magic in the new as well.

by Cece Lederer, The Kernel | Read more:
Images: Imgur, Andres Rodriguez/Flickr (CC BY 2.0), Vjiced/Wikimedia Commons (CC BY-SA 3.0)

Television


[ed. Full album here.]

The Glacier Priest


[ed. Amazing how everything looks much the same. The Wikipedia entry doesn't mention it, but I assume the Hubbard Glacier was named after the Rev. Bernard Hubbard, remembered by Newsweek after he died as: "the Glacier Priest, a tireless Jesuit who led 32 expeditions to Alaska and once listed the requisites of an explorer as 'a strong back, a strong stomach, a dumb head, and a guardian angel.'"]

Winter Forever


The New York Times has something to say about the season:
Long stretches of painfully frigid weather, brief respites, then more snow, ice and freezing rain. Freeze, thaw, repeat — the cycle expands in the mind to unbearable infinities, the unyielding sensation of being trapped. But check the calendar: This week means we are officially in late February, which means March. March means daffodils, which means this all must eventually end.
It’s a nice thought, but I am here to tell you that, like all hope, this optimism is deeply misplaced. Sure, the calendar will change. The birds will sing again. The scent of fresh urine will alight on the nose as you wander the streets. Perhaps it will even once more grow so warm that we shall shed the heavy layers of clothing with which we bundle ourselves before each undertaking outdoors. But mark my words: What you’ve seen this winter and the winter before can never be erased. There is no return from such raw horror. However hard the sun shines down on you in future days you will always carry this winter around in your cold, barren heart. The light that once danced and played behind your eyes has been permanently dimmed and replaced by a mean, dull glare that stares shivers into anything it surveys. If the human race lasts another hundred years, each member of the species will carry within its shattered soul a darkness so intense that all the trees of the fields shall bend their branches away in fear from its frigid malevolence. Winter will never end, nor will you ever rid yourself of it. It is in you, and of you. It is you. There is no turning back, ever. Your stock of sorrows will freeze and, having frozen, crack into a thousand tiny icicles that stab sadness into any spare bit of joy that threatens to melt your bitter, broken spirit. When you walk you carry with you the icy frost of death which, with the wind chill factor, feels like negative five degrees icy frost of death. The only warmth left for you will come when you are lowered for the last time into the ground’s final embrace.

by Alex Balk, The Awl | Read more:
Image: via:

Monday, February 23, 2015

Donna Summer



[ed. Hey, didn't Lady Gaga kill it at the Oscars last night? Sound of Music covers. Huh. Who would have thought?]

How Pop Made a Revolution

Yeah! Yeah! Yeah!: The Story of Pop Music from Bill Haley to Beyoncé, Bob Stanley, W.W. Norton, 624 pages

I wish I could say that my love of pop music began when my middle school music teacher showed me a documentary called “The Compleat Beatles.” That would be the socially acceptable, hipster-sanctioned origin story. But truthfully, the affair began a couple years earlier in 1986, when I conspired with some friends to flood our local top 40 station with requests for the song “Rock Me Amadeus.” Dismayed that Falco’s masterwork had slipped in the charts, we resolved to do whatever we could to reverse its fate. This was either true love or something equally intense—a force that could drive a 12-year-old boy to cold-call a radio station and then sit next to the stereo for hours with finger poised over the tape-record button, enduring songs by Mister Mister and Starship, just waiting for that descending synth motif to issue forth from the speakers.

I’m not terribly surprised that “Rock Me Amadeus” receives no mention in Bob Stanley’s new book. While the song embodies the very essence of pop—it is quirky, flamboyant, goofily ambitious, yet so very of its moment—it was ultimately a failed experiment, a novelty hit. (Though, to be fair, it was no less kitschy than The Timelords’ “Doctorin’ The Tardis,” which does receive mention.) I listen to it now and wonder what the hell my 12-year-old self was thinking. But that’s love, right? It rarely makes sense after it has passed. Stanley clearly knows something about the fever dream of the besotted pop fan, and much of his book is written from that headspace.

What a joy it is to find a music writer who didn’t get the rock-critic memo—the one that says you’re supposed to worship at the altar of punk rock, praise Radiohead, and hate the Eagles. Stanley has plenty of nice things to say about the Eagles, the Bee Gees, Hall and Oates, and Abba. Conversely, he has nothing but contempt for The Clash, those self-anointed exemplars of punk rock. “The Clash set out parameters,” he writes, “and then squirmed like politicians when they were caught busting their own manifesto.” (Stanley prefers the more self-aware Sex Pistols.) Radiohead fare even worse; he describes these critical darlings as “dad rock.” Vocalist Thom Yorke sings “as if he was in the fetal position.”

Of Bob Dylan, a figure as close to a saint as we get in the annals of rock lit, Stanley writes: “along with the Stones he sealed the concept of snotty behavior as a lifestyle, snarled at the conventional with his pack of giggling lickspittle dogs, and extended Brando’s ‘What have you got?’ one-liner into a lifelong party of terse putdowns.” For those of us who grew up reading far too many issues of Rolling Stone for our own good, this is bracing tonic indeed.

What gives Stanley the edge over so many other music journalists is the fact that he is a songwriter himself, and a fairly successful one at that: his band Saint Etienne had a string of UK Top 20 hits in the 1990s. It is easier for musicians than for non-musician critics, I believe, to see beyond genre boundaries and appreciate tunefulness wherever it may reside. Stanley, who I’m pretty sure would rather be known as a “musician who writes” than a “writer who plays music,” takes a more expansive view of the term “pop” than a lot of other writers might do. In his view, pop simply means “popular.” It is not, as is typically imagined, a specific sound—say that of a Britney Spears or Katy Perry. Under Stanley’s definition, Nirvana qualifies as pop. So do Pink Floyd, Black Sabbath, and Glen Campbell.

At 624 pages, Yeah! Yeah! Yeah! is a doorstop of a book, but Stanley’s enthusiasm for the material keeps the narrative moving briskly. He can get inside a song and describe its magic to outsiders like no one else I have ever come across. Consider the following highlights: Of the “clattering, drum-heavy” mix of Bill Haley’s “Rock Around the Clock,” he writes, “It sounded like jump blues, only with someone dismantling scaffolding in the studio.” On Abba: “No one musician stands out on any of their hits because they don’t sound like anyone played an instrument on them; they all sound like a music box carved from ice.” On the Police: “Singer Sting had a high, mewling voice that, appropriately, sounded a little like the whine of a police siren.” And, as is probably apparent already, Stanley is very effective with the terse putdown. My favorite concerns The Cure—a band that has spawned an entire cottage industry of mopey imitators: “It was all somehow powdery and a little slight,” he writes. “The Cure were more about stubbing your toe than taking your life.” Ouch.

Stanley’s two preoccupations throughout the book are innovation and craft, in that order. He gives a lot of space to sonic pioneers like Joe Meek, Phil Spector, and later the architects of early hip-hop and house music, detailing how each wave of experimentation inevitably made its way into the heart of the mainstream sound, eventually becoming calcified until the next upheaval came along to shake things up.

by Robert Dean Lurie, The American Conservative | Read more:
Image: The Beatles / Wikimedia Commons

War Porn

In the age of the all-volunteer military and an endless stream of war zone losses and ties, it can be hard to keep Homeland enthusiasm up for perpetual war. After all, you don't get a 9/11 every year to refresh those images of the barbarians at the airport departure gates. In the meantime, Americans are clearly finding it difficult to remain emotionally roiled up about our confusing wars in Syria and Iraq, the sputtering one in Afghanistan, and various raids, drone attacks, and minor conflicts elsewhere.

Fortunately, we have just the ticket, one that has been punched again and again for close to a century: Hollywood war movies (to which the Pentagon is always eager to lend a helping hand). American Sniper, which started out with the celebratory tagline “the most lethal sniper in U.S. history” and now has the tagline “the most successful war movie of all time,” is just the latest in a long line of films that have kept Americans on their war game. Think of them as war porn, meant to leave us perpetually hyped up. Now, grab some popcorn and settle back to enjoy the show.

There’s Only One War Movie

Wandering around YouTube recently, I stumbled across some good old government-issue propaganda. It was a video clearly meant to stir American emotions and prepare us for a long struggle against a determined, brutal, and barbaric enemy whose way of life is a challenge to the most basic American values. Here's some of what I learned: our enemy is engaged in a crusade against the West; wants to establish a world government and make all of us bow down before it; fights fanatically, beheads prisoners, and is willing to sacrifice the lives of its followers in inhuman suicide attacks. Though its weapons are modern, its thinking and beliefs are 2,000 years out of date and inscrutable to us.

Of course, you knew there was a trick coming, right? This little U.S. government-produced film wasn’t about the militants of the Islamic State. Made by the U.S. Navy in 1943, its subject was “Our Enemy the Japanese.” Substitute “radical Islam” for “emperor worship,” though, and it still makes a certain propagandistic sense. While the basics may be largely the same (us versus them, good versus evil), modern times do demand something slicker than the video equivalent of an old newsreel. The age of the Internet, with its short attention spans and heightened expectations of cheap thrills, calls for a higher class of war porn, but as with that 1943 film, it remains remarkable how familiar what’s being produced remains.

Like propaganda films and sexual pornography, Hollywood movies about America at war have changed remarkably little over the years. Here's the basic formula, from John Wayne in the World War II-era Sands of Iwo Jima to today's American Sniper:

*American soldiers are good, the enemy bad. Nearly every war movie is going to have a scene in which Americans label the enemy as “savages,” “barbarians,” or “bloodthirsty fanatics,” typically following a “sneak attack” or a suicide bombing. Our country’s goal is to liberate; the enemy's, to conquer. Such a framework prepares us to accept things that wouldn’t otherwise pass muster. Racism naturally gets a bye; as they once were “Japs” (not Japanese), they are now “hajjis” and “ragheads” (not Muslims or Iraqis). It’s beyond question that the ends justify just about any means we might use, from the nuclear obliteration of two cities of almost no military significance to the grimmest sort of torture. In this way, the war film long ago became a moral free-fire zone for its American characters.

*American soldiers believe in God and Country, in “something bigger than themselves,” in something “worth dying for,” but without ever becoming blindly attached to it. The enemy, on the other hand, is blindly devoted to a religion, political faith, or dictator, and it goes without saying (though it’s said) that his God -- whether an emperor, Communism, or Allah -- is evil. As one critic put it back in 2007 with just a tad of hyperbole, “In every movie Hollywood makes, every time an Arab utters the word Allah… something blows up.”

*War films spend no significant time on why those savages might be so intent on going after us. The purpose of American killing, however, is nearly always clearly defined. It's to “save American lives,” those over there and those who won’t die because we don't have to fight them over here. Saving such lives explains American war: in Kathryn Bigelow’s The Hurt Locker, for example, the main character defuses roadside bombs to make Iraq safer for other American soldiers. In the recent World War II-themed Fury, Brad Pitt similarly mows down ranks of Germans to save his comrades. Even torture is justified, as in Zero Dark Thirty, in the cause of saving our lives from their nightmarish schemes. In American Sniper, shooter Chris Kyle focuses on the many American lives he’s saved by shooting Iraqis; his PTSD is, in fact, caused by his having “failed” to have saved even more. Hey, when an American kills in war, he's the one who suffers the most, not that mutilated kid or his grieving mother -- I got nightmares, man! I still see their faces!

*Our soldiers are human beings with emotionally engaging backstories, sweet gals waiting at home, and promising lives ahead of them that might be cut tragically short by an enemy from the gates of hell. The bad guys lack such backstories. They are anonymous fanatics with neither a past worth mentioning nor a future worth imagining. This is usually pretty blunt stuff. Kyle’s nemesis in American Sniper, for instance, wears all black. Thanks to that, you know he’s an insta-villain without the need for further information. And speaking of lack of a backstory, he improbably appears in the film both in the Sunni city of Fallujah and in Sadr City, a Shia neighborhood in Baghdad, apparently so super-bad that his desire to kill Americans overcomes even Iraq's mad sectarianism.

by Peter Van Buren, TomDispatch |  Read more:
Image: via:

Sunday, February 22, 2015


Underwater Road (Wild West), Brad Hamers on Flickr
via:

We're All Gonna Die

[ed. As my generation ages I'm beginning to see more articles like this. It's a good sign. The concept of what constitutes a good death needs to be radically redefined. See also: As I Lay Dying.]

I’m writing this after hearing an apparently innocuous and encouraging snippet of news – that a new lung cancer treatment is capable of giving sufferers a possible “extra 200 days” of life. Another morning, another “battle against cancer” fought, and in this case won – sort of.

Yet I find myself rather in sympathy with the one in five Dutch doctors who, it was reported this week, would consider helping someone die even if they had no physical problems but were “tired of living”. Because these doctors have the maturity to face the fact that life has a natural end.

The wearying truth is, there are just so many “battles”, and they appear to be multiplying all the time. A new drug to treat strokes. A breakthrough in the “war” against heart disease. A promising initiative on Alzheimer’s. We are fed, daily, the hopeful news: fatal disease is slowly on the retreat. But there’s always one more, and sooner or later we all lose.

Which brings me back to the news item that got me thinking in the first place. An extra 200 days for lung cancer sufferers. I found myself wondering – what kind of days? Of course, all days may seem worth living when you are faced with your imminent demise. But sometimes the endless quest to extend our days has the smack of futility about it.

For it seems to me that in the constant narratives of “triumphs” over this disease or that illness, we are not engaged so much in a struggle against disease, but death itself. We are only partially rational beings – and at the non-rational level, we believe medicine will save us from our fates.

Of course we all “know” that we are going to die – but that order of knowledge, for most of us, is of the same kind that tells us we are all made of stardust, or that at the core of the atoms in our bodies and brains there is only a void. In other words, our imagination can’t grasp it. It’s just a rumour, in this case a nasty one.

And good job too, you may say. There is an argument that strong denial mechanisms are essential in order to survive our existential plight. The endless jogging and fitness regimes, the constant struggle to find out what “superfood” it is this week that will reduce the chance of this or that threat to our health; even the dangerous sports that convince us that we can outmanoeuvre mortality. Maybe the maintenance of such delusions is the secret of a happy life.

Yet for many the thought won’t quite go away. Thus, we are never quite at peace, because we are always working so hard to keep our eyes from staring at the sun. We immerse ourselves in trivial distractions – shopping, loud music, flashing lights. As the existential psychologist Rollo May observed: “Anxiety about nothing tries to become anxiety about something.” That is to say, anxiety about nothing-ness.

I watch the runners on Hampstead Heath every day puffing and panting – suffering – in order to put off the big event, and while I admire them, I wonder if it isn’t all in vain. As a recent study on cancer at Johns Hopkins University revealed, lifestyle is somewhat overrated as a panacea for extending life. Researchers found that more than two-thirds of cancers are driven by random mistakes in cell division that are completely outside our control. And beyond that, there are genetic predispositions, also outside our control.

Furthermore, only this month it was discovered that 50% of people will get cancer – as opposed to one in three, the previous estimate. So perhaps, rather than being at constant battle stations, we should get used to the idea, especially as a former editor of the BMJ, Richard Smith, said it was probably the best way to go: “Nature taking its course.” All that straining and sweating, all those nasty Lycra outfits, all those dreary stalks of broccoli – they may be there not to help us to prolong our lives so much as safeguard our illusions.

by Tim Lott, The Guardian |  Read more:
Image: via

Cold, Dark, and Happy: Alaska Is the New Leader in Well-Being

[ed. As a former Alaskan (35 yrs.) I can see why this survey might be valid. First, there's the awesome beauty of the state, which anyone can experience, even if they live in Anchorage. How much of it they experience depends on the time, money and effort they're willing to expend to get "out there", but it's a young population in general and everyone is pretty active (to keep from going nuts). Second, 'individuality' is celebrated (if not enshrined in the state's constitution) so no one feels like anyone is looking over their shoulder, or that they have to conform to any overriding community values or standards. So there's a lot of room for people to express themselves (and, in the process, find association with other like-minded micro-communities). Finally, Alaskans just like being thought of as "Alaskan". It implies a sort of hardiness and aptitude that you can't find anywhere else*. And yes, there are bears in the backyard and moose on the doorstep, but those things just add to the enjoyment and mystique of being in a place so completely different than anywhere else. The mindset is: the harder it gets, the more invigorated people feel (to get through whatever it is they have to get through). That's really the Alaskan ethos.]

"Alaskans are the best in the nation in terms of exercise," explained Dan Witters, a research director at the polling agency Gallup, in making the case that Alaska is the nation's new bastion of well-being. "Which just goes to show you that you don't need year-round good weather to demonstrate good exercise habits."

Even if I remain unconvinced on that front, the fact that people manage to exercise more in Alaska than people in any other state—somehow—is just one of the many metrics that landed the state the number-one spot in a massive study of health and well-being across America, released this week.

Alaskans also reported the lowest stress levels of any population in the country over the past year, and the state had the lowest rate of diabetes. Maybe most surprisingly, despite the cold and darkness, Alaskans also had the second lowest rate of depression diagnoses in the country.

Witters, who oversaw the 2014 Gallup-Healthways study of 176,702 Americans, seemed to find genuine excitement in the ascension of Alaska—more than once calling it "really neat" and suggesting that it is a model that other states would do well to emulate. Indeed, the state's victory is a realization of longer-term trends, Witters explained, that he has been measuring and observing in Alaska for a while now.

The state has actually been in the top 10 multiple times since the first annual well-being rankings in 2008—Hawaii and Colorado are the only states to have made the top 10 every year—though Alaska has never before been number one. Other rural, colder states seem to score highly in well-being, too: South Dakota, Wyoming, Montana, Nebraska, and Utah all made the top ten.

These rankings have made little news in past years, in part because they are based entirely on self-reported surveys, which scientists are quick to dismiss. (Maybe Alaskans don't actually exercise more; they're just part of a statewide culture of lying about exercising. Maybe they don't have diagnoses of depression because doctors aren't recognizing symptoms, or people don't feel comfortable talking about it in a telesurvey. Et cetera.) But seven years and 2.1 million surveys in, the longitudinal trends seem too substantial to dismiss outright. And if people are lying, Witters concedes, at least they are most likely lying in the same ways regularly. (...)

And in community involvement, Alaska leads the nation, too. There, for example, the survey asks people whether they've received recognition in the last year for helping to improve their community. "That's a tough nut to crack nationally," Witters said. But among Alaskans, 28 percent say they have—which is actually the best rate in the country. They are also, despite (because of?) the bear population, fifth in the country in terms of feeling safe and secure.

"Another really good one that I love about Alaska, within the purpose element, is learning something new and interesting every day," Witters explained, "which is an important psychological need." That metric is a reason that college towns tend to score highly on the well-being index. And there, too, Alaska is number one in the nation, with 72 percent of residents feeling daily intellectual stimulation.

The state is held in stark contrast to the opposite end of the spectrum, the cases of Kentucky and West Virginia. If nothing else, the two states attest to the validity of the ranking system in that there is consistency in its results: The pair has managed to hold down spots 49 and 50 for six consecutive years.

"Kentucky and West Virginia are really in bad shape," said Witters. There diagnoses of depression are perennially among the highest in the nation, as are stress levels and high blood pressure. Nearly a third of West Virginians smoke tobacco, compared to 19 percent of people nationwide.

Behind those disheartening numbers is another particularly important metric: having someone in your life who encourages you to be healthy. There West Virginia also ranks last in the country. "That is a really good leverage point that they could take advantage of, that cultural change of encouraging accountability to one another," said Witters, when I asked them how West Virginia could learn from Alaska. "It's about having someone who has fundamental expectations of you, in how you live your life."

by James Hamblin, The Atlantic |  Read more:
Image: Gallup/Healthways

* "In theory it's not impossible to live like that, and of course there are people who do. But nature is actually kind of unnatural, in a way. And relaxation can actually be threatening. It takes experience and preparation to really live with those contradictions."
                                                                     ~ Haruki Murakami, Kafka on the Shore


"The trouble with the rat race is that even if you win, you're still a rat" - Lily Tomlin.
via: