Thursday, February 26, 2015

Regulators Approve Tougher Rules for Internet Providers

[ed. Well, until the next administration anyway. See also: Brief History of the Internet.]

Internet activists declared victory over the nation's big cable companies Thursday, after the Federal Communications Commission voted to impose the toughest rules yet on broadband service to prevent companies like Comcast, Verizon and AT&T from creating paid fast lanes and slowing or blocking web traffic.

The 3-2 vote ushered in a new era of government oversight for an industry that has seen relatively little of it. It represents the biggest regulatory shake-up for telecommunications providers in almost two decades.

The new rules require that any company providing a broadband connection to your home or phone must act in the "public interest" and refrain from using "unjust or unreasonable" business practices. The goal is to prevent providers from striking deals with content providers like Google, Netflix or Twitter to move their data faster.

"Today is a red-letter day for Internet freedom," said FCC Chairman Tom Wheeler, whose remarks at Thursday's meeting frequently prompted applause by Internet activists in the audience.

President Barack Obama, who had come out in favor of net neutrality in the fall, portrayed the decision as a victory for democracy in the digital age. In an online letter, he thanked the millions who wrote to the FCC and spoke out on social media in support of the change.

"Today's FCC decision will protect innovation and create a level playing field for the next generation of entrepreneurs - and it wouldn't have happened without Americans like you," he wrote.

Verizon saw it differently, using the Twitter hashtag #ThrowbackThursday to draw attention to the FCC's reliance on 1934 legislation to regulate the Internet. Likewise, AT&T suggested the FCC had damaged its reputation as an independent federal regulator by embracing such a liberal policy.

"Does anyone really think Washington needs yet another partisan fight? Particularly a fight around the Internet, one of the greatest engines of economic growth, investment and innovation in history?" said Jim Cicconi, AT&T's senior executive vice president for external and legislative affairs.

Net neutrality is the idea that all Internet traffic should be treated equally, so that websites or videos load at about the same speed. That means you won't be more inclined to watch a particular show on Amazon Prime instead of on Netflix because Amazon has struck a deal with your service provider to load its data faster.

For years, providers mostly agreed not to pick winners and losers among Web traffic because they didn't want to encourage regulators to step in and because they said consumers demanded it. But that started to change around 2005, when YouTube came online and Netflix became increasingly popular. On-demand video began hogging bandwidth, and evidence surfaced that some providers were manipulating traffic without telling consumers.

In 2010, the FCC enacted open Internet rules, but the agency's legal approach was eventually struck down in the courts. Wheeler intended Thursday's vote to erase any legal ambiguity by no longer classifying the Internet as an "information service" but as a "telecommunications service" subject to Title II of the 1934 Communications Act.

That would dramatically expand regulators' power over the industry and hold broadband providers to the higher standard of operating in the public interest.

by Anne Flaherty, AP |  Read more:
Image: uncredited via:

Blogger Porn Ban – Google's Arbitrary Prudishness is Attacking the Integrity of the Web

[ed. This post brought to you by Blogger. See also: Silicon Valley's War on Sex Continues.]

Google has steadily been cutting down on adult-oriented material hosted on Blogger, its blogging platform, over the last few years. Previously, bloggers could freely post “images or videos that contain nudity or sexual activity,” albeit behind a warning screen that Blogger implemented in 2013.

At the time, Blogger said “censoring this content is contrary to a service that bases itself on freedom of expression”, so bloggers rightly assumed that they would be free to continue to post adult content.

But in a huge U-turn, Google has changed its position and decided that as of 23 March, there will be no explicit material allowed on Blogger unless it offers “public benefit, for example in artistic, educational, documentary, or scientific contexts” – all of which will be determined by Google. Quite how they will do that has not been made clear.

Anything else that does not fall into this category will be restricted to private-only viewing, where only people who have been invited by the blog’s creator will be able to see it; it won’t appear in search results.

This is like having a public library where all the shelves are empty and all the books imperceptible to readers, and authors are required to stand there in person, handing out copies of their work to those hoping to read it. What Google is doing, in reality, is making these blogs invisible. It effectively kills them off.

Some people might read this and think: “Well, Google just doesn’t want to host porn for free any more, that’s why it’s bringing in these restrictions, what’s wrong with that?” To some extent, they’d have a point, because other blog platforms are available and if a user’s sole intent is to make money, then they’re a business and should pay for hosting, not expect to get it for free.

But this new policy has more far-reaching and long-term implications than just censorship and a loss of profit for those posting explicit content, and here’s an example of why: it breaks the internet.

My own personal blog (no explicit images, but graphic descriptions of sex) has had more than 8m readers over 11 years of being hosted on Blogger. If I was forced to make it private and invitation-only, there is no conceivable way that I could contact every single one of those readers and send them a password link to access it.

When I joined Blogger in 2004, I did more than just sign up to publish a sex blog; I joined a community of people: other erotic writers, non-erotic writers, sex educators, feminist porn-makers, memoirists, political activists, journalists, photographers, news-sharers, comedians, artists, comic creators and more. A disparate bunch of people joined together by one thing in common: we all posted stuff on the internet and then shared it.

This network – indeed the Internet itself – is made up of links. You find a link, click through, and expect to arrive at a page containing some form of content, whether that be text, images, video, or audio files. From its inception, blogging has been about people sharing links; indeed, one of the UK’s first well-known blogs back in 1999 was the link-sharing LinkMachineGo.

Forcing blogs – any blogs, regardless of their content – to become private means the link to that blog will no longer work: people clicking through without a password would arrive on a non-existent page. Thousands of other bloggers and websites may have shared that blog’s link over the years, and as a result of this policy change, that link would effectively be dead. In essence, a long-standing, interactive, supportive community will be killed off overnight.

by Zoe Margolis, The Guardian |  Read more:
Image: Alamy

Wednesday, February 25, 2015


Jennifer Cantwell, Letter home, 2011
via:

Leigh Smith, selective memory, 2012
via:

Kurt Vonnegut on the Shapes of Stories


[ed. Kurt Vonnegut: A Man Without a Country]

“The truth is, we know so little about life, we don’t really know what the good news is and what the bad news is.”

"Now let me give you a marketing tip. The people who can afford to buy books and magazines and go to the movies don’t like to hear about people who are poor or sick, so start your story up here [indicates top of the G-I axis]. You will see this story over and over again. People love it, and it is not copyrighted. The story is ‘Man in Hole,’ but the story needn’t be about a man or a hole. It’s: somebody gets into trouble, gets out of it again [draws line A]. It is not accidental that the line ends up higher than where it began. This is encouraging to readers. (...)

Now there’s a Franz Kafka story [begins line D toward bottom of G-I axis]. A young man is rather unattractive and not very personable. He has disagreeable relatives and has had a lot of jobs with no chance of promotion. He doesn’t get paid enough to take his girl dancing or to go to the beer hall to have a beer with a friend. One morning he wakes up, it’s time to go to work again, and he has turned into a cockroach [draws line downward and then infinity symbol]. It’s a pessimistic story. (...)

The question is, does this system I’ve devised help us in the evaluation of literature? Perhaps a real masterpiece cannot be crucified on a cross of this design. How about Hamlet? It’s a pretty good piece of work I’d say. Is anybody going to argue that it isn’t? I don’t have to draw a new line, because Hamlet’s situation is the same as Cinderella’s, except that the sexes are reversed.

His father has just died. He’s despondent. And right away his mother went and married his uncle, who’s a bastard. So Hamlet is going along on the same level as Cinderella when his friend Horatio comes up to him and says, ‘Hamlet, listen, there’s this thing up in the parapet, I think maybe you’d better talk to it. It’s your dad.’ So Hamlet goes up and talks to this, you know, fairly substantial apparition there. And this thing says, ‘I’m your father, I was murdered, you gotta avenge me, it was your uncle did it, here’s how.’

Well, was this good news or bad news? To this day we don’t know if that ghost was really Hamlet’s father. If you have messed around with Ouija boards, you know there are malicious spirits floating around, liable to tell you anything, and you shouldn’t believe them. Madame Blavatsky, who knew more about the spirit world than anybody else, said you are a fool to take any apparition seriously, because they are often malicious and they are frequently the souls of people who were murdered, were suicides, or were terribly cheated in life in one way or another, and they are out for revenge.

So we don’t know whether this thing was really Hamlet’s father or if it was good news or bad news. And neither does Hamlet. But he says okay, I got a way to check this out. I’ll hire actors to act out the way the ghost said my father was murdered by my uncle, and I’ll put on this show and see what my uncle makes of it. So he puts on this show. And it’s not like Perry Mason. His uncle doesn’t go crazy and say, ‘I-I-you got me, you got me, I did it, I did it.’ It flops. Neither good news nor bad news. After this flop Hamlet ends up talking with his mother when the drapes move, so he thinks his uncle is back there and he says, ‘All right, I am so sick of being so damn indecisive,’ and he sticks his rapier through the drapery. Well, who falls out? This windbag, Polonius. This Rush Limbaugh. And Shakespeare regards him as a fool and quite disposable.

You know, dumb parents think that the advice that Polonius gave to his kids when they were going away was what parents should always tell their kids, and it’s the dumbest possible advice, and Shakespeare even thought it was hilarious.

‘Neither a borrower nor a lender be.’ But what else is life but endless lending and borrowing, give and take?

‘This above all, to thine own self be true.’ Be an egomaniac!

Neither good news nor bad news. Hamlet didn’t get arrested. He’s a prince. He can kill anybody he wants. So he goes along, and finally he gets in a duel, and he’s killed. Well, did he go to heaven or did he go to hell? Quite a difference. Cinderella or Kafka’s cockroach? I don’t think Shakespeare believed in a heaven or hell any more than I do. And so we don’t know whether it’s good news or bad news.

I have just demonstrated to you that Shakespeare was as poor a storyteller as any Arapaho.

But there’s a reason we recognize Hamlet as a masterpiece: it’s that Shakespeare told us the truth, and people so rarely tell us the truth in this rise and fall here [indicates blackboard]. The truth is, we know so little about life, we don’t really know what the good news is and what the bad news is.

And if I die — God forbid — I would like to go to heaven to ask somebody in charge up there, ‘Hey, what was the good news and what was the bad news?’"

by Maria Popova, Brain Pickings |  Read more:
Image: Kurt Vonnegut

How to Avoid Rape in Prison


The Marshall Project put together this short film in which former inmates explain how to avoid being sexually assaulted while incarcerated.

Cities Don’t ♥ Us

[ed. See also: A Last Ditch Effort to Preserve the Heart of the Central District, and Fixing Pioneer Square.]

Each day in New York an army of street-sweeping trucks fans across the boroughs, purportedly inhaling the litter and waste that parks itself in curbside crevices along residential blocks. (Commercial districts are typically cleaned overnight.) If you’ve ever seen one of these massive contraptions, you’ve probably wondered how much they truly clean—rather than just disperse the dirt and debris to another location for the next day’s job—and whether they do more environmental harm than good. And if you happen to be a car-owning New Yorker, the sound of a street sweeper even one or two blocks away can easily trigger a chain of panicked questions starting with “What time is it?” followed by “What day is it?” before landing on “What side of the street am I parked on?”

Alternate-side parking is a part of life in New York City, both for New Yorkers and for the city they live in, which relies on parking-violation revenue to provide city services. Last year alone the city raked in $70 million from 1.2 million alternate-side parking violations at $55 a pop. Is it any wonder the Dept. of Sanitation fought a recent proposal that would allow car owners to re-park as soon as the street sweeper finished its work—rather than waste countless hours idling just to honor the official parking rules?
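[ed. Those figures roughly check out: 1.2 million tickets at $55 apiece works out to about $66 million, in the ballpark of the $70 million cited – presumably the total includes some higher-priced violations.]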

New York’s parking wars embody the modern city’s twisted relationship with its dwellers. Officials know street sweeping is largely ineffective and environmentally harmful. They know the fine bears no relation to the underlying offense (spare me the “social cost” argument) and targets working people living in the low-income outer-borough neighborhoods where parking is tight and cars are essential since mass transit is less available. They know that even if everyone earnestly tries to follow the law, there aren’t nearly enough spots for everyone during alternate-side parking times. They know the average urbanite has zero sympathy (disdain is more like it) for drivers even though the billions the city rakes in each year from bridge and tunnel tolls subsidize their train and bus commutes.

The suggestion that alternate-side parking fines exist for any reason other than revenue is vulgar and pretentious. Yet no official, elected or otherwise, will ever come out and admit alternate-side parking rules have been engineered to extract what amounts to a backdoor tax. Doing so would undermine the movement heralding the smart city as humanity’s redeemer. Cities, we are told again and again by the sustainability expert, are our destiny. (...)

Yet this is the crux of urbanism’s shell game. I don’t believe white urbanites are an inherently favored species. Stock images of attractive white couples may adorn the latest luxury condo, but not because urbanism has a special place in its heart for them. It’s economics, pure and simple. This, of course, contradicts the prevailing propaganda pumping out of government public relations offices across the country. The modern city cares about our health and wellness. It wants to be livable, sustainable, and walkable—vibrant. It wants to provide us with amenities and opportunities to experience culture, food, and community. According to the urbanist, the city wishes us to believe it can be both affordable and upscale. That it is invested in our children’s education, our safety, our careers. It is all things to all people. It has a heart.

And in fact, the city will sometimes tease us. The train you desperately need will arrive on time. There will be parking on the block, an open table at a new restaurant. Your favorite artist will be playing in the park, for free. In that moment, you will believe that things could not be any better than they are. You will feel the soothing satisfaction of having made the right choice in life. You will forget the infinite frustrations and heartaches you endure. You will rationalize your overpriced micro-dwelling as a social good. You will believe the life the city offers has been created to suit your unique and discriminating needs and tastes. And you will be wrong.

Here’s what really happens. First, a city hires a think tank to come up with a revitalization plan (pdf). That plan typically entails attracting young people with skills and education and retaining slightly older people with money and small children. Case in point: Washington, DC, in the early 2000s. As I’ve written elsewhere (pdf), in 2001 Brookings Institution economist Alice Rivlin published a report entitled “Envisioning a Future Washington” in which she mapped out a revitalization plan that became a blueprint for gentrification. Urban planning and design firms are then hired to figure out how to make a city more desirable to these people. They conduct surveys, mine the data, and issue reports that award these people a flattering label like “creative class” and pronounce what they are looking for and how cities can attract/retain them. What we see happening in cities across America is the result: an unmitigated backlash against the era of sprawl and its accomplices—strip malls, subdivisions, and big-box chains—nothing more, nothing less.

Indeed, the true genius of urbanism is that the marketing campaigns promoting it have seized upon a search for meaning that traditional institutions can no longer satisfy, promising, if only implicitly, to fill the gap. Just look at the shimmering, stylized artist renditions accompanying every new upscale urban development. Rays of light from the heavens above shower the newly paved sidewalks, reflecting boundlessly off the glass buildings and brightening the lives of the multi-hued populace carrying fresh fruits and vegetables in their canvas tote bags.

Urbanism has become the secular religion of choice practiced with the enthusiasm of a Pentecostal tent revival, and the amenitized high-rise the new house of worship. It, after all, promises to fulfill or at least facilitate all of one’s needs while on Earth—with everything from rooftop community gathering space to sunlit Saturday morning yoga classes in the atrium.

This isn’t a new idea. In his celebrated and remarkably enduring 1949 essay, “Here is New York” (pdf), E.B. White addressed the spiritual life that a city offers:
Many people who have no real independence of spirit depend on the city’s tremendous variety and sources of excitement for spiritual sustenance and maintenance of morale … I think that although many persons are here for some excess of spirit (which caused them to break away from their small town), some, too, are here from a deficiency of spirit, who find in New York a protection, or an easy substitution.
White’s essay isolates the beauty of New York: It is a love letter. By all means, I invite you to be taken with it; I am. His city offers the range of rewards—sights and sounds and things to do. I marvel at the way White’s city operates, the way it manages to instill order and achieve artistry. In White’s capable hands, cities are humanity’s premier expression of civilization.

Urbanism, as well, has deftly aligned itself with human progress. It trumpets terms like “smart growth,” “sustainability,” “resilience,” and “scalability” to demonstrate both its concern with the quality of our lives and its progressive street cred. It champions urban “green space” as the solution for everything from obesity to asthma. But green spaces aren’t even parks. Often people can only use them during prescribed times and in particular ways—concerts, film screenings, seasonal outdoor markets. Moreover, they’re usually owned by a developer who likely built them as a concession for a sweet deal on the land. Yet this is what we celebrate? A paltry scrap of flora? Which just begs a question Thomas Frank posed in his Baffler essay skewering the “vibrancy” movement so many cities have staked their futures on:
… [W]hy is it any better to pander to the “creative class” than it is to pander to the traditional business class? Yes, one strategy uses “incentives” and tax cuts to get companies to move from one state to another, while the other advises us to emphasize music festivals and art galleries when we make our appeal to that exalted cohort. But neither approach imagines a future arising from something other than government abasing itself before the wealthy.
To be fair, inasmuch as cities can be said to have a consciousness, they fully comprehend their vulnerability. Urban planners know perfectly well that if the delicate balance between safety and prosperity is lost, then disinvestment and abandonment can strike. But they have also learned that people can be manipulated to identify with the city and thereby tolerate just about anything it dishes out.

by Dax-Devlon Ross, TMN | Read more:
Image: Steven Guerrisi

Tuesday, February 24, 2015

Whistlin' Dixie

Driving south from the North, we tried to spot exactly where the real South begins. We looked for the South in hand-scrawled signs on the roadside advertising ‘Boil Peanut’, in one-room corrugated tin Baptist churches that are little more than holy sheds, in the crumbling plantation homes with their rose gardens and secrets. In the real South, we thought, ships ought to turn to riverboats, cold Puritanism to swampy hellfire, coarse industrialists with a passion for hotels and steel to the genteel ease of the cotton planter.

Most of what we believe about the South, wrote W.J. Cash in the 1930s, exists in our imagination. But, he wrote, we shouldn’t take this to mean that the South is therefore unreal. The real South, wrote Cash in The Mind of the South, exists in unreality. It is the tendency toward unreality, toward romanticism, toward escape, that defines the mind of the South.

The unreality that shaped the South took many forms. In the South, wrote Cash (himself a Southern man), is “a mood in which the mind yields almost perforce to drift and in which the imagination holds unchecked sway, a mood in which nothing any more seems improbable save the puny inadequateness of fact, nothing incredible save the bareness of truth.” Most people still believe, wrote Cash — none more than Southerners themselves — in a South built by European aristocrats who erected castles from scrub. This imaginary South, wrote Cash, was “a sort of stagepiece out of the eighteenth century,” where gentlemen planters and exquisite ladies in farthingales spoke softly on the steps of their stately mansions. But well-adjusted men of position and power, he wrote, “do not embark on frail ships for a dismal frontier… The laborer, faced with starvation; the debtor, anxious to get out of jail; the apprentice, eager for a fling at adventure; the small landowner and shopkeeper, faced with bankruptcy and hopeful of a fortune in tobacco; the neurotic, haunted by failure and despair” — only these would go.

The dominant trait of the mind of the South, wrote Cash, was an intense individualism — an individualism the likes of which the world hadn’t seen since Renaissance days. In the backcountry, the Southern man’s ambitions were unbounded. For each who stood on his own little property, his individual will was imperial law. In the South, wrote Cash, wealth and rank were not so important as they were in older societies. “Great personal courage, unusual physical powers, the ability to drink a quart of whiskey or to lose one’s whole capital on the turn of a card without the quiver of a muscle — these are at least as important as possessions, and infinitely more important than heraldic crests.”

The average white Southern man (for this man was Cash’s main focus) was a romantic, but it was a romance bordering on bedlam. Any ordinary man tends to be a hedonist and a romantic, but take that man away from Old World traditions, wrote Cash, and stick him in the frontier wilds. Take away the skepticism and realism necessary for ambition and he falls back on imagination. His world becomes rooted in the fantastic, the unbelievable, and his emotions lie close to the surface. Life on the Southern frontier was harsh but free — it could make a man’s ego feel large.

The Southern landscape, too, had an unreal quality, “itself,” wrote Cash, “a sort of cosmic conspiracy against reality in favor of romance.” In this country of “extravagant color, of proliferating foliage and bloom, of flooding yellow sunlight, and, above all, perhaps, of haze,” the “pale blue fogs [that] hang above the valleys in the morning,” the outlines of reality blur. The atmosphere smokes “rendering every object vague and problematical.” A soft languor creeps through the blood and into the brain, wrote Cash, and the mood of the South becomes like a drunken reverie, where facts drift far away. “But I must tell you also that the sequel to this mood,” wrote Cash, “is invariably a thunderstorm. For days — for weeks, it may be — the land lies thus in reverie and then …”

The romanticism of the South, wrote W.J. Cash, was one that tended toward violence. It was a violence the Southern man often turned toward himself as much as those around him. The reverie turns to sadness and the sadness to a sense of foreboding and the foreboding to despair. Nerves start to wilt under the terrifying sun, questions arise that have no answers, and “even the soundest grow a bit neurotic.” When the rains break, as they will, and the South becomes a land of fury, the descent into unreality takes hold. Pleasure becomes sin, and all are stripped naked before the terror of truth.

by Stefany Anne Golberg, The Smart Set |  Read more:
Image: uncredited

The Drug Technology Boom

Cannabis is joining the coca leaf in the ranks of drugs improved by technology.

Concentrates are made from the entire plant, while the smokable flower is only a part. To sell flowers, plants must be cut, trimmed, and dried. The process takes weeks and a lot of manpower. For refined product, the entire plant can be processed without having to wait. Solvents turn live plants into BHO (butane hash oil) on the spot. Those who produce concentrates have a sellable product within 24 hours, whereas it would take weeks to properly prepare buds.

Dabs are refined marijuana with highly concentrated doses of THC, which explains why, when I tried it, I could feel my thoughts. About 50 percent of the marijuana flower is vegetative plant matter, which is weeded out in clarification. What’s left are the cannabinoids and terpenes (the good stuff with the medical applications and flavor).

Marijuana is oil- and fat-soluble, which is why edibles (like brownies) work. When the flowers are simmered in butter, the butter takes on the intoxicating components of the plant, which can then be strained out; boiling it in water would just get it hot and wet. Once ingested (however they’re ingested) and absorbed into the bloodstream, fat-soluble drugs collect in fatty tissue like the brain.

Whether a substance is fat- or water-soluble depends on polarity. Fat-soluble materials are nonpolar, meaning they lack electrical charge. Nonpolar molecules are held together by covalent bonds, in which electrons are shared between connected atoms. Because cannabinoids are fat-soluble, concentrates are made with oil-based solvents (often butane) to suspend them. This separates the essential oils from the plant matter.

For those of us who weren’t in AP Chemistry, a solvent is a material that dissolves another, chemically different material (the solute). There are water-based solvent-extraction techniques, but they are less popular.

“Blasting” (extracting concentrated THC from the plant) is complicated, dangerous, and easy to find on YouTube.

The solvent is introduced to the solute (dank, loud, nugs, and other stupid names), resulting in a yellow liquid, as long as ventilation has been good enough to prevent suffocation by fire.

The product of the last chemical reaction becomes the reactant in the next one. The liquid is heated to evaporate the remaining harsh chemicals until it resembles delicious crème brûlée. The material is then moved into a vacuum chamber where the pump bubbles and fluffs it some more. (The strict need for a vacuum chamber is a point of some Reddit dispute, but it remains the favored process.)

When the fluff looks like gooey home insulation, it returns to the heat source to take its final form. Once isolated, the beneficial compounds can take a variety of structures. A few variables determine the consistency, but temperature is the big one.

“Shatter,” as it’s referred to in this state, looks like the meth seen in Breaking Bad, but yellow. It’s solid yet fragile and breaks apart easily. Shatter is subjected to an additional process to extract the lipids, fats, waxes, and terpenes. It’s the purest of the refined products. Good shatter can reach over 80 percent THC.

“Wax” looks like a mother extracted it from a 14-year-old boy’s ear. It still has the terpenes, which makes it more flavorful but less potent than shatter—usually 70 to 80 percent. Of the three most popular concentrates, it’s definitely the one I’d most like to get under a microscope.

“Honey oil” looks a bit like shatter and feels a lot like maple syrup. It’s the least refined and most flavorful of the three. (...)

Since concentrates are better vaped than smoked, their growing popularity is changing the apparatuses used to consume them. The first vaporizer I ever used was a direct-inhale box. If you tried to get someone stoned with that today, they’d laugh at you. They were analog, complicated, and hard to use. I never once felt that I got stoned, and oh lord did I try. (...)

The two technologies are evolving alongside one another. Since that pioneering stoner first invented the apple pipe, potheads have been seeking to optimize consumption.

“Some people lack the skills to become accomplished stoners,” Greene laughed. “If you hold the lighter the wrong way, you burn your thumb; everyone can push a button.”

I don’t know if I agree. Ritual is part of addiction. Don’t get me wrong: I think pot is great and everyone should get stoned, but I am, without a doubt, addicted. My dad hasn’t smoked since the ’80s, but he still prides himself on rolling great joints (and put to the test, he will). Similarly, I get a rush out of tearing open a new bag of Agent Orange (or Green Crack or another unfortunately named strain), ripping apart the bud, and packing the bowl. The texture of flowers crumbling between my fingers is an intoxicating part of the experience.

With my vape, I’m beginning to appreciate a new set of rituals. I play with the temperature settings, pack the oven with a particular gold pencil, and meticulously scrape the lid clean. These aren’t the same rituals that make me feel like I’m in my childhood bathroom blowing smoke into the ass end of a fan while listening to Belle and Sebastian’s Tigermilk, but there’s magic in the new as well.

by Cece Lederer, The Kernel | Read more:
Images: Imgur, Andres Rodriguez/Flickr (CC BY 2.0), Vjiced/Wikimedia Commons (CC BY-SA 3.0)

Television


[ed. Full album here.]

The Glacier Priest


[ed. Amazing how everything looks much the same. The Wikipedia entry doesn't mention it, but I assume the Hubbard Glacier was named after the Rev. Bernard Hubbard, remembered by Newsweek after he died as: "the Glacier Priest, a tireless Jesuit who led 32 expeditions to Alaska and once listed the requisites of an explorer as "a strong back, a strong stomach, a dumb head, and a guardian angel."]

Winter Forever


The New York Times has something to say about the season:
Long stretches of painfully frigid weather, brief respites, then more snow, ice and freezing rain. Freeze, thaw, repeat — the cycle expands in the mind to unbearable infinities, the unyielding sensation of being trapped. But check the calendar: This week means we are officially in late February, which means March. March means daffodils, which means this all must eventually end.
It’s a nice thought, but I am here to tell you that, like all hope, this optimism is deeply misplaced. Sure, the calendar will change. The birds will sing again. The scent of fresh urine will alight on the nose as you wander the streets. Perhaps it will even once more grow so warm that we shall shed the heavy layers of clothing with which we bundle ourselves before each undertaking outdoors. But mark my words: What you’ve seen this winter and the winter before can never be erased. There is no return from such raw horror. However hard the sun shines down on you in future days, you will always carry this winter around in your cold, barren heart. The light that once danced and played behind your eyes has been permanently dimmed and replaced by a mean, dull glare that stares shivers into anything it surveys. If the human race lasts another hundred years, each member of the species will carry within its shattered soul a darkness so intense that all the trees of the fields shall bend their branches away in fear from its frigid malevolence. Winter will never end, nor will you ever rid yourself of it. It is in you, and of you. It is you. There is no turning back, ever. Your stock of sorrows will freeze and, having frozen, crack into a thousand tiny icicles that stab sadness into any spare bit of joy that threatens to melt your bitter, broken spirit. When you walk you carry with you the icy frost of death which, with the wind chill factor, feels like negative five degrees icy frost of death. The only warmth left for you will come when you are lowered for the last time into the ground’s final embrace.

by Alex Balk, The Awl | Read more:
Image: via:

Monday, February 23, 2015

Donna Summer



[ed. Hey, didn't Lady Gaga kill it at the Oscars last night? Sound of Music covers. Huh. Who would have thought?]

How Pop Made a Revolution

Yeah! Yeah! Yeah!: The Story of Pop Music from Bill Haley to Beyoncé, Bob Stanley, W.W. Norton, 624 pages

I wish I could say that my love of pop music began when my middle school music teacher showed me a documentary called “The Compleat Beatles.” That would be the socially acceptable, hipster-sanctioned origin story. But truthfully, the affair began a couple years earlier in 1986, when I conspired with some friends to flood our local top 40 station with requests for the song “Rock Me Amadeus.” Dismayed that Falco’s masterwork had slipped in the charts, we resolved to do whatever we could to reverse its fate. This was either true love or something equally intense—a force that could drive a 12-year-old boy to cold-call a radio station and then sit next to the stereo for hours with finger poised over the tape-record button, enduring songs by Mister Mister and Starship, just waiting for that descending synth motif to issue forth from the speakers.

I’m not terribly surprised that “Rock Me Amadeus” receives no mention in Bob Stanley’s new book. While the song embodies the very essence of pop—it is quirky, flamboyant, goofily ambitious, yet so very of its moment—it was ultimately a failed experiment, a novelty hit. (Though, to be fair, it was no less kitschy than The Timelords’ “Doctorin’ The Tardis,” which does receive mention.) I listen to it now and wonder what the hell my 12-year-old self was thinking. But that’s love, right? It rarely makes sense after it has passed. Stanley clearly knows something about the fever dream of the besotted pop fan, and much of his book is written from that headspace.

What a joy it is to find a music writer who didn’t get the rock-critic memo—the one that says you’re supposed to worship at the altar of punk rock, praise Radiohead, and hate the Eagles. Stanley has plenty of nice things to say about the Eagles, the Bee Gees, Hall and Oates, and Abba. Conversely, he has nothing but contempt for The Clash, those self-anointed exemplars of punk rock. “The Clash set out parameters,” he writes, “and then squirmed like politicians when they were caught busting their own manifesto.” (Stanley prefers the more self-aware Sex Pistols.) Radiohead fare even worse; he describes these critical darlings as “dad rock.” Vocalist Thom Yorke sings “as if he was in the fetal position.”

Of Bob Dylan, a figure as close to a saint as we get in the annals of rock lit, Stanley writes: “along with the Stones he sealed the concept of snotty behavior as a lifestyle, snarled at the conventional with his pack of giggling lickspittle dogs, and extended Brando’s ‘What have you got?’ one-liner into a lifelong party of terse putdowns.” For those of us who grew up reading far too many issues of Rolling Stone for our own good, this is bracing tonic indeed.

What gives Stanley the edge over so many other music journalists is the fact that he is a songwriter himself, and a fairly successful one at that: his band Saint Etienne had a string of UK Top 20 hits in the 1990s. It is easier for musicians than for non-musician critics, I believe, to see beyond genre boundaries and appreciate tunefulness wherever it may reside. Stanley, who I’m pretty sure would rather be known as a “musician who writes” than a “writer who plays music,” takes a more expansive view of the term “pop” than a lot of other writers might do. In his view, pop simply means “popular.” It is not, as is typically imagined, a specific sound—say that of a Britney Spears or Katy Perry. Under Stanley’s definition, Nirvana qualifies as pop. So do Pink Floyd, Black Sabbath, and Glen Campbell.

At 624 pages, Yeah! Yeah! Yeah! is a doorstop of a book, but Stanley’s enthusiasm for the material keeps the narrative moving briskly. He can get inside a song and describe its magic to outsiders like no one else I have ever come across. Consider the following highlights: Of the “clattering, drum-heavy” mix of Bill Haley’s “Rock Around the Clock,” he writes, “It sounded like jump blues, only with someone dismantling scaffolding in the studio.” On Abba: “No one musician stands out on any of their hits because they don’t sound like anyone played an instrument on them; they all sound like a music box carved from ice.” On the Police: “Singer Sting had a high, mewling voice that, appropriately, sounded a little like the whine of a police siren.” And, as is probably apparent already, Stanley is very effective with the terse putdown. My favorite concerns The Cure—a band that has spawned an entire cottage industry of mopey imitators: “It was all somehow powdery and a little slight,” he writes. “The Cure were more about stubbing your toe than taking your life.” Ouch.

Stanley’s two preoccupations throughout the book are innovation and craft, in that order. He gives a lot of space to sonic pioneers like Joe Meek, Phil Spector, and later the architects of early hip-hop and house music, detailing how each wave of experimentation inevitably made its way into the heart of the mainstream sound, eventually becoming calcified until the next upheaval came along to shake things up.

by Robert Dean Lurie, The American Conservative | Read more:
Image: The Beatles / Wikimedia Commons

War Porn

In the age of the all-volunteer military and an endless stream of war zone losses and ties, it can be hard to keep Homeland enthusiasm up for perpetual war. After all, you don't get a 9/11 every year to refresh those images of the barbarians at the airport departure gates. In the meantime, Americans are clearly finding it difficult to remain emotionally roiled up about our confusing wars in Syria and Iraq, the sputtering one in Afghanistan, and various raids, drone attacks, and minor conflicts elsewhere.

Fortunately, we have just the ticket, one that has been punched again and again for close to a century: Hollywood war movies (to which the Pentagon is always eager to lend a helping hand). American Sniper, which started out with the celebratory tagline “the most lethal sniper in U.S. history” and now has the tagline “the most successful war movie of all time,” is just the latest in a long line of films that have kept Americans on their war game. Think of them as war porn, meant to leave us perpetually hyped up. Now, grab some popcorn and settle back to enjoy the show.

There’s Only One War Movie

Wandering around YouTube recently, I stumbled across some good old government-issue propaganda. It was a video clearly meant to stir American emotions and prepare us for a long struggle against a determined, brutal, and barbaric enemy whose way of life is a challenge to the most basic American values. Here's some of what I learned: our enemy is engaged in a crusade against the West; wants to establish a world government and make all of us bow down before it; fights fanatically, beheads prisoners, and is willing to sacrifice the lives of its followers in inhuman suicide attacks. Though its weapons are modern, its thinking and beliefs are 2,000 years out of date and inscrutable to us.

Of course, you knew there was a trick coming, right? This little U.S. government-produced film wasn’t about the militants of the Islamic State. Made by the U.S. Navy in 1943, its subject was “Our Enemy the Japanese.” Substitute “radical Islam” for “emperor worship,” though, and it still makes a certain propagandistic sense. While the basics may be largely the same (us versus them, good versus evil), modern times do demand something slicker than the video equivalent of an old newsreel. The age of the Internet, with its short attention spans and heightened expectations of cheap thrills, calls for a higher class of war porn, but as with that 1943 film, it remains remarkable how familiar what’s being produced remains.

Like propaganda films and sexual pornography, Hollywood movies about America at war have changed remarkably little over the years. Here's the basic formula, from John Wayne in the World War II-era Sands of Iwo Jima to today's American Sniper:

*American soldiers are good, the enemy bad. Nearly every war movie is going to have a scene in which Americans label the enemy as “savages,” “barbarians,” or “bloodthirsty fanatics,” typically following a “sneak attack” or a suicide bombing. Our country’s goal is to liberate; the enemy's, to conquer. Such a framework prepares us to accept things that wouldn’t otherwise pass muster. Racism naturally gets a bye; as they once were “Japs” (not Japanese), they are now “hajjis” and “ragheads” (not Muslims or Iraqis). It’s beyond question that the ends justify just about any means we might use, from the nuclear obliteration of two cities of almost no military significance to the grimmest sort of torture. In this way, the war film long ago became a moral free-fire zone for its American characters.

*American soldiers believe in God and Country, in “something bigger than themselves,” in something “worth dying for,” but without ever becoming blindly attached to it. The enemy, on the other hand, is blindly devoted to a religion, political faith, or dictator, and it goes without saying (though it’s said) that his God -- whether an emperor, Communism, or Allah -- is evil. As one critic put it back in 2007 with just a tad of hyperbole, “In every movie Hollywood makes, every time an Arab utters the word Allah… something blows up.”

*War films spend no significant time on why those savages might be so intent on going after us. The purpose of American killing, however, is nearly always clearly defined. It's to “save American lives,” those over there and those who won’t die because we don't have to fight them over here. Saving such lives explains American war: in Kathryn Bigelow’s The Hurt Locker, for example, the main character defuses roadside bombs to make Iraq safer for other American soldiers. In the recent World War II-themed Fury, Brad Pitt similarly mows down ranks of Germans to save his comrades. Even torture is justified, as in Zero Dark Thirty, in the cause of saving our lives from their nightmarish schemes. In American Sniper, shooter Chris Kyle focuses on the many American lives he’s saved by shooting Iraqis; his PTSD is, in fact, caused by his having “failed” to have saved even more. Hey, when an American kills in war, he's the one who suffers the most, not that mutilated kid or his grieving mother -- I got nightmares, man! I still see their faces!

*Our soldiers are human beings with emotionally engaging backstories, sweet gals waiting at home, and promising lives ahead of them that might be cut tragically short by an enemy from the gates of hell. The bad guys lack such backstories. They are anonymous fanatics with neither a past worth mentioning nor a future worth imagining. This is usually pretty blunt stuff. Kyle’s nemesis in American Sniper, for instance, wears all black. Thanks to that, you know he’s an insta-villain without the need for further information. And speaking of lack of a backstory, he improbably appears in the film both in the Sunni city of Fallujah and in Sadr City, a Shia neighborhood in Baghdad, apparently so super-bad that his desire to kill Americans overcomes even Iraq's mad sectarianism.

by Peter Van Buren, TomDispatch |  Read more:
Image: via: