Sunday, March 1, 2015

Ronnie Earl & The Broadcasters


[ed. Repost]

Frans Post, Brazilian Landscape with Anteater, 1649
via:

Gillian Welch, David Rawlings

[Repost]

When Your Punctuation Says It All (!)

I went out with a guy based on his use of dashes once. Within moments of our first interaction — over text message — I was basically in love.

He didn’t just use the lazy singular dash (“-”) as a pause between his thoughts, or even the more time-consuming double-dash (“--”). Nope. This man used a proper em dash.

That is, the kind that required him to hold down the dash button on his iPhone for that extra second, until the “—” appeared, then choose it from among three options. I don’t remember what his messages actually said. But he obviously really liked me.

I’m a writer; it’s natural I’d have a thing for grammar. But these days, it’s as if our punctuation is on steroids.

It’s not just that each of us is more delicately choosing our characters, knowing that an exclamation point or a colon carries more weight in our 140-character world. Or even that our punctuation suddenly feels like hyperbole (right?!?!?!?!?!) because we’ve lost all audible tone.

Those things are true. But it’s also as if a kind of micro-punctuation has emerged: tiny marks in the smallest of spaces that suddenly tell us more about the person on the other end than the words themselves (or, at least, we think they do).

Take the question mark. Recently, a friend I had dinner plans with sent a text to ask “what time” we were meeting. We’d been organizing this meal for weeks; a half-dozen emails back and forth. And yet the question — sans the mark — felt indifferent, almost cold. Couldn’t she at least bother to insert the necessary character?

Of course, had she inserted too many marks, that may have been a problem, too, as there is suddenly a very fine line between appearing overeager (too much punctuation) and dismissive (not enough).

Even the period, once the most benign of the punctuation spectrum, now feels aggressive. And the exclamation point is so ubiquitous that “when my girlfriends don’t use an exclamation point, I’m like ‘What’s wrong, you O.K.?’ ” said Jordana Narin, a 19-year-old student in New York.

“Girlfriends” may be a key word there, as women are more likely to use emotive punctuation than men are. Yet lately I’ve tried to rein my own effusiveness in, going as far as to insert additional punctuation into existing punctuation in an effort to soften the marks themselves.

So instead of responding to a text with “Can’t wait!!” I’ll insert a space or two before the mark — “Can’t wait !!” — for that extra little pause. Sometimes I’ll make the exclamation point a parenthetical, as a kind of afterthought (“Can’t wait (!)”). A friend inserts an ellipsis — “Can’t wait … !!” — so, as she puts it, “it’s less intense.”

“At this point, I’ve basically suspended judgment,” said Ben Crair, an editor at the New Republic who recently wrote a column about the new aggression of the period. “You could drive yourself insane trying to decode the hidden messages in other people’s punctuation.”

by Jessica Bennett, NY Times |  Read more:
Image: Ron Barrett

Saturday, February 28, 2015

You Are Listening To...


...a soothing mix of police radio chatter and ambient music. Choose from Los Angeles, New York, San Francisco, Chicago, or my personal recommendation, Montréal. French police chat really blends into the music nicely. You may need to adjust the balance of each stream a bit to find the right mix.
via:
[ed. Repost]

photo: markk
[ed. Repost]

Erik Satie


[ed. Repost]

Los Amigos Invisibles


Have some fun.
[ed. Repost]

Our Date with Miranda

[ed. One of my favorite movies: Me and You and Everyone We Know.]

I first met Miranda July years ago at a faraway literary conference in Portland, Oregon. Along with Rick Moody and others we were on a panel that was supposed to converse authoritatively about narrative structure. When it came time for July to speak, she stood up and started singing. She was large-eyed and lithe. I don’t remember what song it was—something she had written herself, I believe. I was startled. Who was this woman? (Her performances and short films had not appeared widely enough to have caught my notice.) I was then mortified, not for her, since she seemed completely at ease and the audience was enthralled, but mortified for narrative structure, which had clearly been given the bum’s rush. (Well, fiction writers will do anything to avoid this topic: it is the one about which they are the most clueless and worried and improvisational.)

Sitting next to Ms. July was the brilliant Denis Johnson, who, inspired by his neighbor, when it was his turn (figuring out one’s turn can be the most difficult part of a panel) also began to sing. Also something he had written himself. I may have laughed, thinking it was all supposed to be funny, realizing too late my mistake. There was a tragic aspect to one verse in the Johnson song. I believe he did not sit down because he had not stood to begin with.

Then it was clearly, or unclearly, my turn. If not the wallflower at the orgy then I was the mute at the a cappella operetta (a condition typical of many a July character though not of July herself): I refused to sing. I don’t remember what I said—I believe I read from some notes, silently vowing never to be on another panel. (The next panel I was on, in Boston, I thought went well by comparison. That is, no one burst into random song. But when I said as much to the person sitting next to me, the editor of a prominent literary journal, he said, “Really? This was the worst panel I’ve ever participated in.”) So my introduction to July was one at which I watched her redefine boundaries and hijack something destined to be inert and turn it into something uncomfortably alive, whether you wanted her to or not. This has been my experience of her work ever since.

July’s first feature-length film, the now-famous independent Me and You and Everyone We Know, also upends expectations. July writes, directs, and stars in all her films. In many ways, while remaining a love story, the film is about the boundary-busting that is ruleless sexuality—stalking and sexual transgression—though here the predators and perpetrators are gentle and female. A boy is coercively fellated by two slightly unpleasant teenage girls devising a competition. Low-level sexual harassment is everywhere and July sometimes plays it for laughs. Two kids in a chat room lead someone on a wild goose chase, writing scatological comments in the language of very young children, and despite all this it is hilarious. A shoe salesman named Richard who has set his hand on fire in front of his sons is hounded by a woman named Christine (played by July herself) who does not know him but who is erotically obsessed with him. She has psychically and perhaps correctly marked him as her mate (the telepathic heart is at the center of much of July’s work).

Another character, a middle-aged woman seeking a partner online, finds herself hooked up with a five-year-old boy in the park. Images of flame and precariousness recur—the burning sun, the burning hand, a bright goldfish riding in a plastic bag on a car roof. And yet all is put forward with tenderness and humor. The desire for human love goes unquestioned and its role in individual fate is assumed to be essential. July’s Christine, a struggling artist who works as a driver for ElderCab, possesses a thin-skinned empathy for everyone, and her love for the shoe salesman (who is played in convincingly addled fashion by John Hawkes) is performed with both vulnerability and purity of passion.

In her two feature-length films the chemistry with her male leads is quite strong: they as well as July are like openly soulful children, attaching without reason or guile, and July is quite focused on this quality of connective vulnerability, as well as on children themselves. Her work also engages with the criterion offered up by the character of a museum curator looking at Christine’s own works: “Could this have been made in any era or only now?” With July it is a little of both. She focuses on people living “courageously with grace,” while also quietly arguing with a culture that asks us to do that.

by Lorrie Moore, NY Review of Books |  Read more:
Image: Nick Wall/Eyevine/Redux

A Glorious Distraction

During the two weeks before the Super Bowl there were more than 10,000 news articles written about the slight deviation in air pressure of the footballs used by the New England Patriots in their American Football Conference Championship victory over the Indianapolis Colts. The Patriots quarterback Tom Brady, in an attempt to defuse conspiracy allegations, joked in a press conference, “Things are fine—this isn’t ISIS.”

He was right: it wasn’t ISIS. During those two weeks, the Islamic State of Iraq and Syria was the subject of only seventy-nine articles in The New York Times. “Deflate-gate” was the subject of eighty. These included interviews with football players, who explained why a deflated ball was easier to throw and catch; physicists, who suggested that the deflation might have occurred due to climate effects; logisticians, who opined on the time necessary to deflate a football; and a seamstress of Wilson footballs who vowed, “It’s not Wilson’s fault.” Even the leader of the free world felt obliged to make a statement. “Here’s what I know,” said President Obama on Super Bowl Sunday. “The Patriots were going to beat the Colts regardless of what the footballs looked like.”

In that period Andy Studebaker’s name appeared in only nine articles, all published in sports blogs. Studebaker is the twenty-nine-year-old backup linebacker for the Colts who, while defending a punt return, was blindsided with a gruesome hit to the chest by the Patriots’ backup running back Brandon Bolden. Studebaker’s head jerked back and he landed on his neck. On the sideline after the play Studebaker was seen coughing up blood.

Nor was much made of the fine levied on professional monster Clay Matthews of the Green Bay Packers for illegally smashing into the defenseless head of Seattle Seahawks quarterback Russell Wilson in the National Football Conference Championship game. Matthews’s fine was $22,050, or approximately what he earns every ninety seconds of game play. There was also little attention given to the fact that, in the second half of that game, Seattle cornerback Richard Sherman injured his left arm so badly that he couldn’t straighten it; he played the final quarter with it bent and pressed tightly to his chest like a chicken wing.

Was it broken? Badly sprained? Was he given shockingly powerful illegal or legal drugs in order to endure the pain? The league, and Seattle, were mum on these points. When asked ten days later about the injury, Sherman said, “It’s a little sore, but not too bad.” Then, with a wink: “That’s my story and I’m sticking to it.” Minutes after the Super Bowl ended it was revealed that Sherman had torn ligaments in his elbow and will have to undergo reconstructive surgery. (...)

NFL Commissioner Roger Goodell might have been grateful for the deflation controversy because it distracted from what would otherwise have been the season’s two dominant storylines: the league’s reluctance to discipline players who commit domestic violence and its failure to protect its players from brain damage. But Goodell didn’t need the help. Every thinking fan must, in order to enjoy any NFL game, consent to participate in a formidable suspension of disbelief. We must put aside our knowledge that nearly every current NFL player can expect to suffer from chronic traumatic encephalopathy, a degenerative disease that leads to memory loss, impaired judgment, depression, and dementia.

Football players are also four times more likely both to die from ALS (a fact that Goodell, despite participating in this past year’s ALS ice-bucket challenge, refuses to acknowledge) and to develop Alzheimer’s disease. An NFL player can expect to live twenty years less than the average American male. The average NFL career lasts 3.3 years. By that measure, each season costs an NFL player about six years of his life. Football fans, in other words, must ignore the fact that we are watching men kill themselves.

by Nathaniel Rich, New York Review of Books |  Read more:
Image: Jim Davis/Boston Globe/Getty Images

Hans Erni, Le Dessinateur or Kybernetes, Lithograph in 5 colours, 1956
via:

Plastilina Mosh/El Guincho/Odisea





[ed. Repost]

The Dress That Melted The Internet


The mother of the bride wore white and gold. Or was it blue and black?

From a photograph of the dress the bride posted online, there was broad disagreement. A few days after the wedding last weekend on the Scottish island of Colonsay, a member of the wedding band was so frustrated by the lack of consensus that she posted a picture of the dress on Tumblr, and asked her followers for feedback.

“I was just looking for an answer because it was messing with my head,” said Caitlin McNeill, a 21-year-old singer and guitarist.

Within a half-hour, her post attracted some 500 likes and shares. The photo soon migrated to Buzzfeed and Facebook and Twitter, setting off a social media conflagration that few were able to resist.

As the debate caught fire across the Internet — even scientists could not agree on what was causing the discrepancy — media companies rushed to get articles online. Less than a half-hour after Ms. McNeill’s original Tumblr post, Buzzfeed posted a poll: “What Colors Are This Dress?” As of Friday afternoon, it had been viewed more than 28 million times. (White and gold was winning handily.)

At its peak, more than 670,000 people were simultaneously viewing Buzzfeed’s post. Between that and the rest of Buzzfeed’s blanket coverage of the dress Thursday night, the site easily smashed its previous records for traffic. So did Tumblr.

Everyone, it seems, had an opinion. And everyone was convinced that he, or she, was right. (...)

In an era when just about everyone seems to be doing anything they can to ignite interest online, the great dress debate went viral the old-fashioned way. It just happened. (...)

At its center was a simple yet bedeviling mystery with an almost old-fashioned, trompe l’oeil quality: How could different people see the same article of clothing so differently? The simplicity of the debate, the fact that it was about something as universal as the color of a dress, made it all the more irresistible.

“This definitely felt like a special thing,” said Buzzfeed’s editor in chief, Ben Smith. “It sort of erased the line between web culture and real culture.”

by Jonathan Mahler, NY Times |  Read more:
Images: New Yorker and Wired

Friday, February 27, 2015

William S. Burroughs, The Art of Fiction No. 36

Firecrackers and whistles sounded the advent of the New Year of 1965 in St. Louis. Stripteasers ran from the bars in Gaslight Square to dance in the street when midnight came. Burroughs, who had watched television alone that night, was asleep in his room at the Chase Park Plaza Hotel, St. Louis's most elegant.

At noon the next day he was ready for the interview. He wore a gray lightweight Brooks Brothers suit with a vest, a blue-striped shirt from Gibraltar cut in the English style, and a deep-blue tie with small white polka dots. His manner was not so much pedagogic as didactic or forensic. He might have been a senior partner in a private bank, charting the course of huge but anonymous fortunes. A friend of the interviewer, spotting Burroughs across the lobby, thought he was a British diplomat. At the age of fifty, he is trim; he performs a complex abdominal exercise daily and walks a good deal. His face carries no excess flesh. His expression is taut, and his features are intense and chiseled. He did not smile during the interview and laughed only once, but he gives the impression of being capable of much dry laughter under other circumstances. His voice is sonorous, its tone reasonable and patient; his accent is mid-Atlantic, the kind of regionless inflection Americans acquire after many years abroad. He speaks elliptically, in short, clear bursts.

On the dresser of his room sat a European transistor radio; several science fiction paperbacks; Romance, by Joseph Conrad and Ford Madox Ford; The Day Lincoln Was Shot, by Jim Bishop; and Ghosts in American Houses, by James Reynolds. A Zeiss Ikon camera in a scuffed leather case lay on one of the twin beds beside a copy of Field & Stream. On the other bed were a pair of long shears, clippings from newspaper society pages, photographs, and a scrapbook. A Facit portable typewriter sat on the desk, and gradually one became aware that the room, although neat, contained a great deal of paper.

Burroughs smoked incessantly, alternating between a box of English Ovals and a box of Benson & Hedges. As the interview progressed, the room filled with smoke. He opened the window. The temperature outside was seventy degrees, the warmest New Year's Day in St. Louis's history; a yellow jacket flew in and settled on the pane. The bright afternoon deepened. The faint cries of children rose up from the broad brick alleys in which Burroughs had played as a boy. (...)

INTERVIEWER

When and why did you start to write?

BURROUGHS

I started to write in about 1950; I was thirty-five at the time; there didn't seem to be any strong motivation. I simply was endeavoring to put down in a more or less straightforward journalistic style something about my experiences with addiction and addicts.

INTERVIEWER

Why did you feel compelled to record these experiences?

BURROUGHS

I didn't feel compelled. I had nothing else to do. Writing gave me something to do every day. I don't feel the results were at all spectacular. Junky is not much of a book, actually. I knew very little about writing at that time.

INTERVIEWER

Where was this?

BURROUGHS

In Mexico City. I was living near Sears, Roebuck, right around the corner from the University of Mexico. I had been in the army four or five months and I was there on the GI Bill, studying native dialects. I went to Mexico partly because things were becoming so difficult with the drug situation in America. Getting drugs in Mexico was quite easy, so I didn't have to rush around, and there wasn't any pressure from the law.

INTERVIEWER

Why did you start taking drugs?

BURROUGHS

Well, I was just bored. I didn't seem to have much interest in becoming a successful advertising executive or whatever, or living the kind of life Harvard designs for you. After I became addicted in New York in 1944, things began to happen. I got in some trouble with the law, got married, moved to New Orleans, and then went to Mexico.

INTERVIEWER

There seems to be a great deal of middle-class voyeurism in this country concerning addiction, and in the literary world, downright reverence for the addict. You apparently don't share these points of view.

BURROUGHS

No, most of it is nonsense. I think drugs are interesting principally as chemical means of altering metabolism and thereby altering what we call reality, which I would define as a more or less constant scanning pattern. (...)

INTERVIEWER

The visions of drugs and the visions of art don't mix?

BURROUGHS

Never. The hallucinogens produce visionary states, sort of, but morphine and its derivatives decrease awareness of inner processes, thoughts, and feelings. They are painkillers, pure and simple. They are absolutely contraindicated for creative work, and I include in the lot alcohol, morphine, barbiturates, tranquilizers—the whole spectrum of sedative drugs. As for visions and heroin, I had a hallucinatory period at the very beginning of addiction, for instance, a sense of moving at high speed through space. But as soon as addiction was established, I had no visions—vision—at all and very few dreams. (...)

INTERVIEWER

You regard addiction as an illness but also a central human fact, a drama?

BURROUGHS

Both, absolutely. It's as simple as the way in which anyone happens to become an alcoholic. They start drinking, that's all. They like it, and they drink, and then they become alcoholic. I was exposed to heroin in New York—that is, I was going around with people who were using it; I took it; the effects were pleasant. I went on using it and became addicted. Remember that if it can be readily obtained, you will have any number of addicts. The idea that addiction is somehow a psychological illness is, I think, totally ridiculous. It's as psychological as malaria. It's a matter of exposure. People, generally speaking, will take any intoxicant or any drug that gives them a pleasant effect if it is available to them. In Iran, for instance, opium was sold in shops until quite recently, and they had three million addicts in a population of twenty million. There are also all forms of spiritual addiction. Anything that can be done chemically can be done in other ways, that is, if we have sufficient knowledge of the processes involved. Many policemen and narcotics agents are precisely addicted to power, to exercising a certain nasty kind of power over people who are helpless. The nasty sort of power: white junk, I call it—rightness; they're right, right, right—and if they lost that power, they would suffer excruciating withdrawal symptoms. The picture we get of the whole Russian bureaucracy, people who are exclusively preoccupied with power and advantage, this must be an addiction. Suppose they lose it? Well, it's been their whole life.

by Conrad Knickerbocker, Paris Review | Read more:
Image: via:

What Long-Distance Trains Teach Us About Public Space in America


"What people don’t like about the train is the time lapse. People don’t have time to tie their own shoes these days.” Trent, a fellow passenger on Amtrak’s Sunset Limited from New Orleans to Los Angeles, was philosophizing about the train. Trent is a middle-aged African-American man from California with whom I struck up a conversation in the observation car, which, for those of you who aren’t versed in the lingo of the rails, is the living room of a long-distance train, heavily windowed and designed for friendly interaction. Rolling through the desert, Trent and I talked for more two hours about family, spirituality, and all the other things that come up when you have opted to ride across the country with strangers and without WiFi.

“People [usually] just want to get from point A to point B as quickly as possible. We don’t give ourselves the chance to be in the moment,” Trent says. “Whether it’s good or bad, you grow. It’s about experiences.” He was describing something special about the long-distance train: It is a place to slow down and experience the present. (...)

Michel Foucault coined the term “heterotopia” to describe what he called “placeless places,” autonomous zones where societal rules are reinterpreted — like the train. (...)

The physical qualities that help to facilitate this sense of connection are human-scale design, a clean and safe environment, and an aesthetic that is straightforward and not overly fanciful. The dimensions of the car make it (generally speaking) cozy and comfortable, but spacious enough that you aren’t on top of the person seated next to you. (When people are too physically close they tend to retreat emotionally and mentally, as anyone who has ever ridden the 1 train during rush hour in Manhattan can attest.)

That long-distance trains aren’t designed with one specific aesthetic, demographic or psychographic in mind means that the ride is more about what’s unfolding within the space rather than the materiality of the car. It also frames the passing landscape in a way that makes it easy to use as a conversation starter. This follows the concept of “triangulation,” which William H. Whyte, a famous public space researcher and advocate, coined to describe a third element that gives people something easy to talk about.

Another important element encouraging interaction is what I will call the “together alone” factor. Riders are in the same space — and apart from everything and everyone else — for an extended period of time. Being in a shared physical space that’s also outside of one’s normal environment for an extended duration facilitates a special sense of focus and an enhanced sense of accountability, which can lead to conversations we wouldn’t normally have with strangers. (Online, the standard for conversation is a different ball game but we aren’t talking about that here.) Democratic theorists since the Ancient Greeks have celebrated public discourse. But where in contemporary offline America does this occur? Housing policies have segregated us by race, class and political leanings. It is increasingly difficult to have open, face-to-face conversations about important topics. Yet on train rides I observed conversations between total strangers about race, religion, sexuality and other taboo topics. Unlike the flame wars of the Internet, these conversations were civil.

by Danya Sherman, Next City |  Read more:
Image: Nikki Yanofsky, YouTube

Why 40-Year-Old Tech Is Still Running America’s Air Traffic Control

On Friday, September 26, 2014, a telecommunications contractor named Brian Howard woke early and headed to Chicago Center, an air traffic control hub in Aurora, Illinois, where he had worked for eight years. He had decided to get stoned and kill himself, and as his final gesture he planned to take a chunk of the US air traffic control system with him.

Court records say Howard entered Chicago Center at 5:06 am and went to the basement, where he set a fire in the electronics bay, sliced cables beneath the floor, and cut his own throat. Paramedics saved Howard's life, but Chicago Center, which controls air traffic above 10,000 feet for 91,000 square miles of the Midwest, went dark. Airlines canceled 6,600 flights; air traffic was interrupted for 17 days. Howard had wanted to cause trouble, but he hadn't anticipated a disruption of this magnitude. He had posted a message to Facebook saying that the sabotage “should not take a large toll on the air space as all comms should be switched to the alt location.” It's not clear what alt location Howard was talking about, because there wasn't one. Howard had worked at the center for nearly a decade, and even he didn't know that.

At any given time, around 7,000 aircraft are flying over the United States. For the past 40 years, the same computer system has controlled all that high-altitude traffic—a relic of the 1970s known as Host. The core system predates the advent of the Global Positioning System, so Host uses point-to-point, ground-based radar. Every day, thousands of travelers switch their GPS-enabled smartphones to airplane mode while their flights are guided by technology that predates the Speak & Spell. If you're reading this at 30,000 feet, relax—Host is still safe, in terms of getting planes from point A to point B. But it's unbelievably inefficient. It can handle a limited amount of traffic, and controllers can't see anything outside of their own airspace—when they hand off a plane to a contiguous airspace, it vanishes from their radar.

The FAA knows all that. For 11 years the agency has been limping toward a collection of upgrades called NextGen. At its core is a new computer system that will replace Host and allow any controller, anywhere, to see any plane in US airspace. In theory, this would enable one air traffic control center to take over for another with the flip of a switch, as Howard seemed to believe was already possible. NextGen isn't vaporware; that core system was live in Chicago and the four adjacent centers when Howard attacked, and this spring it'll go online in all 20 US centers. But implementation has been a mess, with a cascade of delays, revisions, and unforeseen problems. Air traffic control can't do anything as sophisticated as Howard thought, and unless something changes about the way the FAA is managing NextGen, it probably never will.

This technology is complicated and novel, but that isn't the problem. The problem is that NextGen is a project of the FAA. The agency is primarily a regulatory body, responsible for keeping the national airspace safe, and yet it is also in charge of operating air traffic control, an inherent conflict that causes big issues when it comes to upgrades. Modernization, a struggle for any federal agency, is practically antithetical to the FAA's operational culture, which is risk-averse, methodical, and bureaucratic. Paired with this is the lack of anything approximating market pressure. The FAA is the sole consumer of the product; it's a closed loop.

The first phase of NextGen is to replace Host with the new computer system, the foundation for all future upgrades. The FAA will finish the job this spring, five years late and at least $500 million over budget. Lockheed Martin began developing the software for it in 2002, and the FAA projected that the transition from Host would be complete by late 2010. By 2007, the upgraded system was sailing through internal tests. But once installed, it was frighteningly buggy. It would link planes to flight data for the wrong aircraft, and sometimes planes disappeared from controllers' screens altogether. As timelines slipped and the project budget ballooned, Lockheed churned out new software builds, but unanticipated issues continued to pop up. As recently as April 2014, the system crashed at Los Angeles Center when a military U-2 jet entered its airspace—the spy plane cruises at 60,000 feet, twice the altitude of commercial airliners, and its flight plan caused a software glitch that overloaded the system.

Even when the software works, air traffic control infrastructure is not prepared to use it. Chicago Center and its four adjacent centers all had NextGen upgrades at the time of the fire, so nearby controllers could reconfigure their workstations to see Chicago airspace. But since those controllers weren't FAA-certified to work that airspace, they couldn't do anything. Chicago Center employees had to drive over to direct the planes. And when they arrived, there weren't enough workstations for them to use, so the Chicago controllers could pick up only a portion of the traffic. Meanwhile, the telecommunications systems were still a 1970s-era hardwired setup, so the FAA had to install new phone lines to transfer Chicago Center's workload. The agency doesn't anticipate switching to a digital system (based on the same voice over IP that became mainstream more than a decade ago) until 2018. Even in the best possible scenario, air traffic control will not be able to track every airplane with GPS before 2020. For the foreseeable future, if you purchase Wi-Fi in coach, you're pretty much better off than the pilot.

by Sara Breselor, Wired |  Read more:
Image: Valero Doval

'Pics or It Didn’t Happen'

Our social networks have a banality problem. The cultural premium now placed on recording and broadcasting one’s life and accomplishments means that Facebook timelines are suffused with postings about meals, workouts, the weather, recent purchases, funny advertisements, the milestones of people three degrees removed from you. On Instagram, one encounters a parade of the same carefully distressed portraits, well-plated dishes and sunsets gilded with smog. Nuance, difference, and complexity evaporate as one scrolls through these endless feeds, vaguely hoping to find something new or important but mostly resigned to variations on familiar themes.

In a digital landscape built on attention and visibility, what matters is not so much the content of your updates but their existing at all. They must be there. Social broadcasts are not communications; they are records of existence and accumulating metadata. Rob Horning, an editor at the New Inquiry, once put it in tautological terms: “The point of being on social media is to produce and amass evidence of being on social media.” This is further complicated by the fact that the feed is always refreshing. Someone is always updating more often or rising to the top by virtue of retweets, reshares, or some opaque algorithmic calculation. In the ever-cresting tsunami of data, you are always out to sea, looking at the waves washing ashore. As the artist Fatima Al Qadiri has said: “There’s no such thing as the most recent update. It immediately becomes obsolete.”

Why, then, do we do it? If it’s so easy to become cynical about social media, to see amid the occasionally illuminating exchanges or the harvesting of interesting links (which themselves come in bunches, in great indigestible numbers of browser tabs) that we are part of an unconquerable system, why go on? One answer is that it is a byproduct of the network effect: the more people who are part of a network, the more one’s experience can seem impoverished by being left out. Everyone else is doing it. A billion people on Facebook, hundreds of millions scattered between these other networks – who wants to be on the outside? Who wants to miss a birthday, a friend’s big news, a chance to sign up for Spotify, or the latest bit of juicy social intelligence? And once you’ve joined, the updates begin to flow, the small endorphin boosts of likes and re-pins becoming the meagre rewards for all that work. The feeling of disappointment embedded in each gesture, the sense of “Is this it?”, only advances the process, compelling us to continue sharing and participating.

The achievement of social-media evangelists is to make this urge – the urge to share simply so that others might know you are there, that you are doing this thing, that you are with this person – second nature. This is society’s great phenomenological shift, which, over the last decade, has occurred almost without notice. Now anyone who opts out, or who feels uncomfortable about their participation, begins to feel retrograde, Luddite, uncool. Interiority begins to feel like a prison. The very process of thinking takes on a kind of trajectory: how can this idea be projected outward, towards others? If I have a witty or profound thought and I don’t tweet or Facebook it, have I somehow failed? Is that bon mot now diminished, not quite as good or meaningful as it would be if laid bare for the public? And if people don’t respond – retweet, like, favourite – have I boomeranged back again, committing the greater failure of sharing something not worth sharing in the first place? After all, to be uninteresting is a cardinal sin in the social-media age. To say “He’s bad at Twitter” is like saying that someone fails to entertain; he won’t be invited back for dinner.

In this environment, interiority, privacy, reserve, introspection – all those inward-looking, quieter elements of consciousness – begin to seem insincere. Sharing is sincerity. Removing the mediating elements of thought becomes a mark of authenticity, because it allows you to be more uninhibited in your sharing. Don’t think, just post it. “Pics or it didn’t happen” – that is the populist mantra of the social networking age. Show us what you did, so that we may believe and validate it.

by Jacob Silverman, The Guardian | Read more:
Image: Peter Macdiarmid/PA

What It Means to be Made in Italy

My Italian has gotten good enough that I can understand pretty much everything the locals say to me. The only words I consistently miss are the English words that they insert into conversation like french fries stuck in a spaghetti carbonara. WTF is “Nike” when it rhymes with “hike”? “Levi’s” when it rhymes with “heavies”? “Ee Red Hot Keelee Pepper?” But one English phrase comes up so often in conversation, at least within the rag trade, that I can pick it up on the first take: “Made In Italy.”

Cosa Vuol Dire “Made In Italy”? (What Does “Made In Italy” Mean?)

To understand the meaning of “Made In Italy,” you have to go back to the genesis of the Italian nation, in the second half of the 19th century. Before that, Italy was a geographic concept, but not a political or cultural one. There was no real sense of an “Italian people” in the same way as there was already for the Germans, who formed a nation around the same time. Italy became one country not through collaboration, but through conquest by the Piedmont in the far north, which might as well have been Sweden as far as many Italians were concerned. If you think of Italy as a boot, the Piedmont would be the knee. A knee the rest of the peninsula would feel at their throats.

Citizens of the newly formed Italian state had little shared history, so newly-crowned propagandists created one, often relying on Roman iconography. Over the following decades, nationalistic myths hypertrophied into fascism - also largely a Northern phenomenon. Italy’s defeat in World War II broke this fever, but at a huge cost. The War was, for Italy, also a civil war, mostly pitting North against South, breaking open all the fissures that had been plastered over at the nation’s birth.

Two industries recreated Italian identity following the war - the film industry, and the fashion industry. Film helped the country understand its experience with the war and the poverty that followed. Fashion gave Italians a new nationalistic myth. Its appeals were more to the artistic achievements of the Italian Renaissance than the empire-building of the Roman era, and it helped that the industry’s first successes were in Tuscany, birthplace of Michelangelo. The Sala Bianca in the Pitti Palace hosted the first Italian fashion show in 1951, as well as Brioni’s men’s fashion show, famously the first of its kind, in 1952. Italian designers were able to capture something of the uniquely Italian approach to luxury and craft that had eluded the stuffy couturiers and tailors of Paris and Savile Row. As post-war realist film gave way to Fellini’s surrealist fantasies, Marcello Mastroianni became the guy everyone wanted to look, dress, and act like. And he wore Italian suits.

Allure, but Insecure

By 1980, the industry had grown tremendously, but had become something different. It had mostly moved to Milan, the industrial behemoth of the North. And it had begun to shift its focus from brands like Brioni to emerging giants like Armani and Ferré. It was at this point that the “Made In Italy” campaign began, with the ambitious goal of branding an entire country. As one politico at Pitti’s “Opening Ceremony” said this year, “‘Made In Italy’ is not just about selling fashion - it’s about selling Italian quality of life.” “Made In Italy” was intended to convey not just the country of origin but elegance, sophistication, craftsmanship - as if Leonardo da Vinci himself had blessed every stitch.

The campaign has been a massive success. Armani remains one of the most valuable brands in all of fashion. Gucci, Prada, and Zegna aren’t far behind. The manufacturing infrastructure that supports these brands is now also used by brands from Huntsman to Tom Ford to Ralph Lauren Purple Label, all of which are Made In Italy.

But the future is uncertain. At the Pitti’s Opening Ceremony, politician after politician announced their full support for the Italian fashion industry, for Pitti as a trade show, and their belief in the enduring allure of Italian luxury. Each one pledged a re-investment in “Made In Italy”. Which is what you do when you’re worried that a good idea’s time is running out.

by David Isle, Styleforum |  Read more:
Image: uncredited