Wednesday, May 7, 2014

Young Blood

[ed. Good news Boomers! Another avenue to parasitize the young (and you thought the resurgence of vampire themes in recent culture was just a coincidence?)]
  
Two teams of scientists published studies on Sunday showing that blood from young mice reverses aging in old mice, rejuvenating their muscles and brains. As ghoulish as the research may sound, experts said that it could lead to treatments for disorders like Alzheimer’s disease and heart disease.

“I am extremely excited,” said Rudolph Tanzi, a professor of neurology at Harvard Medical School, who was not involved in the research. “These findings could be a game changer.”

The research builds on centuries of speculation that the blood of young people contains substances that might rejuvenate older adults.

In the 1950s, Clive M. McCay of Cornell University and his colleagues tested the notion by delivering the blood of young rats into old ones. To do so, they joined rats in pairs by stitching together the skin on their flanks. After this procedure, called parabiosis, blood vessels grew and joined the rats’ circulatory systems. The blood from the young rat flowed into the old one, and vice versa.

Later, Dr. McCay and his colleagues performed necropsies and found that the cartilage of the old rats looked more youthful than it would have otherwise. But the scientists could not say how the transformations happened. There was not enough known at the time about how the body rejuvenates itself.

It later became clear that stem cells are essential for keeping tissues vital. When tissues are damaged, stem cells move in and produce new cells to replace the dying ones. As people get older, their stem cells gradually falter.

In the early 2000s, scientists realized that stem cells were not dying off in aging tissues.

“There were plenty of stem cells there,” recalled Thomas A. Rando, a professor of neurology at Stanford University School of Medicine. “They just don’t get the right signals.”

Dr. Rando and his colleagues wondered what signals the old stem cells would receive if they were bathed in young blood. To find out, they revived Dr. McCay’s experiments.

The scientists joined old and young mice for five weeks and then examined them. The muscles of the old mice had healed about as quickly as those of the young mice, the scientists reported in 2005. In addition, the old mice had grown new liver cells at a youthful rate.

The young mice, on the other hand, had effectively grown prematurely old. Their muscles had healed more slowly, and their stem cells had not turned into new cells as quickly as they had before the procedure.

The experiment indicated that there were compounds in the blood of the young mice that could awaken old stem cells and rejuvenate aging tissue. Likewise, the blood of the old mice had compounds that dampened the resilience of the young mice.

by Carl Zimmer, NY Times |  Read more:
Image: Getty Images/Zoonar RF via:

Kinetic Beauty


"Beauty is not the goal of competitive sports, but high-level sports are a prime venue for the expression of human beauty. The relation is roughly that of courage to war.

The human beauty we’re talking about here is beauty of a particular type; it might be called kinetic beauty. Its power and appeal are universal. It has nothing to do with sex or cultural norms. What it seems to have to do with, really, is human beings’ reconciliation with the fact of having a body.

Of course, in men’s sports no one ever talks about beauty or grace or the body. Men may profess their “love” of sports, but that love must always be cast and enacted in the symbology of war: elimination vs. advance, hierarchy of rank and standing, obsessive statistics, technical analysis, tribal and/or nationalist fervor, uniforms, mass noise, banners, chest-thumping, face-painting, etc. For reasons that are not well understood, war’s codes are safer for most of us than love’s.”

David Foster Wallace
via:

Lab Rats


Tom Tomorrow
via:

Why the Mona Lisa Stands Out

In 1993 a psychologist, James Cutting, visited the Musée d’Orsay in Paris to see Renoir’s picture of Parisians at play, “Bal du Moulin de la Galette”, considered one of the greatest works of impressionism. Instead, he found himself magnetically drawn to a painting in the next room: an enchanting, mysterious view of snow on Parisian rooftops. He had never seen it before, nor heard of its creator, Gustave Caillebotte.

That was what got him thinking.

Have you ever fallen for a novel and been amazed not to find it on lists of great books? Or walked around a sculpture renowned as a classic, struggling to see what the fuss is about? If so, you’ve probably pondered the question Cutting asked himself that day: how does a work of art come to be considered great?

The intuitive answer is that some works of art are just great: of intrinsically superior quality. The paintings that win prime spots in galleries, get taught in classes and reproduced in books are the ones that have proved their artistic value over time. If you can’t see they’re superior, that’s your problem. It’s an intimidatingly neat explanation. But some social scientists have been asking awkward questions of it, raising the possibility that artistic canons are little more than fossilised historical accidents.

Cutting, a professor at Cornell University, wondered if a psychological mechanism known as the “mere-exposure effect” played a role in deciding which paintings rise to the top of the cultural league. In a seminal 1968 experiment, people were shown a series of abstract shapes in rapid succession. Some shapes were repeated, but because they came and went so fast, the subjects didn’t notice. When asked which of these random shapes they found most pleasing, they chose ones that, unbeknown to them, had come around more than once. Even unconscious familiarity bred affection.

Back at Cornell, Cutting designed an experiment to test his hunch. Over a lecture course he regularly showed undergraduates works of impressionism for two seconds at a time. Some of the paintings were canonical, included in art-history books. Others were lesser known but of comparable quality. These were exposed four times as often. Afterwards, the students preferred them to the canonical works, while a control group of students liked the canonical ones best. Cutting’s students had grown to like those paintings more simply because they had seen them more.

Cutting believes his experiment offers a clue as to how canons are formed. He points out that the most reproduced works of impressionism today tend to have been bought by five or six wealthy and influential collectors in the late 19th century. The preferences of these men bestowed prestige on certain works, which made the works more likely to be hung in galleries and printed in anthologies. The kudos cascaded down the years, gaining momentum from mere exposure as it did so. The more people were exposed to, say, “Bal du Moulin de la Galette”, the more they liked it, and the more they liked it, the more it appeared in books, on posters and in big exhibitions. Meanwhile, academics and critics created sophisticated justifications for its pre-eminence. After all, it’s not just the masses who tend to rate what they see more often more highly. As contemporary artists like Warhol and Damien Hirst have grasped, critical acclaim is deeply entwined with publicity. “Scholars”, Cutting argues, “are no different from the public in the effects of mere exposure.”

The process described by Cutting evokes a principle that the sociologist Duncan Watts calls “cumulative advantage”: once a thing becomes popular, it will tend to become more popular still. A few years ago, Watts, who is employed by Microsoft to study the dynamics of social networks, had a similar experience to Cutting in another Paris museum. After queuing to see the “Mona Lisa” in its climate-controlled bulletproof box at the Louvre, he came away puzzled: why was it considered so superior to the three other Leonardos in the previous chamber, to which nobody seemed to be paying the slightest attention?

When Watts looked into the history of “the greatest painting of all time”, he discovered that, for most of its life, the “Mona Lisa” languished in relative obscurity. In the 1850s, Leonardo da Vinci was considered no match for giants of Renaissance art like Titian and Raphael, whose works were worth almost ten times as much as the “Mona Lisa”. It was only in the 20th century that Leonardo’s portrait of his patron’s wife rocketed to the number-one spot. What propelled it there wasn’t a scholarly re-evaluation, but a burglary.

by Ian Leslie, Intelligent Life |  Read more:
Image: Eyevine

Early-Life Crisis

I was born a friendless virgin.

During those first months, it was clear that I was depressed. I spent each day at home, lying flat on my back, looking up at the ceiling and thinking, I should really go out and meet people. But I never did. In fact, I don’t think I made a single friend in my first months alive. I was such a loser.

Instead of making connections, I distracted myself with meaningless games. I slept poorly and cried all the time. My life was nothing like “Entourage.” I had trouble meeting women but refused to use Tinder. Looks-wise, I didn’t bring a lot to the table: I had no muscle definition, a chubby face, and a very tiny penis. People would call my naked pictures “cute.”

I’ll never forget the day my mother introduced me to her friend’s daughter, Chelsea. I felt a connection from the moment she peed herself. We had a lot in common—we were both bald and androgynous. Neither of us had teeth. I thought to myself, She might be the one.

Later that night, we were lying side by side on my bed. I wanted to tell her how I felt, but suddenly I was unable to speak, or even to lift my head. My therapist says that’s right—I was literally unable to do those things, and I know what he means: I’m always sabotaging myself.

After Chelsea left, I began worrying that I might be alone forever. Everyone I knew was married—my mom, my dad, and my grandparents. I had started experiencing the pains that came with aging, many of which involved my molars. I felt my mortality. My molar-tality.

by Ben Jurney, New Yorker | Read more:
Image: Jessica Peterson

Tuesday, May 6, 2014

Sudden Death: A Eulogy

In 1958, John Kenneth Galbraith’s Affluent Society reminded Americans that, for the first time in human history, we lived in a civilization where a majority of people did not have to worry about basic subsistence. More than five decades later, we find ourselves belonging to the first human civilization where sudden death is the glaring exception, not the expectation. The novelty of our position is all too easy to forget; it is even easier to assume without questioning that the present state of affairs reflects progress. After all, which of us wouldn’t rather die well-prepared at ninety than suddenly at fifty-five? And yet, the more I see of death, the less convinced I become that, in this medical and social revolution, we have not lost something of considerable value. I certainly don’t mean to glorify premature death: I suspect both “dying with one’s boots on” and “living fast, loving hard and dying young” are highly overrated feats. I do not believe that it is either dulce or decorum to die at twenty-five for one’s country. My concern is also not with the economic effects of the long goodbye: the percent of Medicare dollars spent in the last six months of life, the prospect of every gainfully-employed worker supporting two retirees. Rather, my disquiet is principally for lost human dignity. Canadian right-to-die activist Gloria Taylor, who suffers from Lou Gehrig’s disease, recently wrote: “I can accept death because I recognize it as a part of life. What I fear is a death that negates, as opposed to concludes, my life.” Sudden death is a conclusion. Too often, I fear, the long goodbye devolves into a negation.

The contrast between the death of my grandmother’s father and that of her husband fifty-eight years later is highly revealing. Grandpa Leo, a Belgian refugee who earned a comfortable living in the jewelry business, developed prostate cancer in his early seventies, survived a mild heart attack at age seventy-seven, and by his mid-eighties had trouble remembering the names of his sisters. And then, at eighty-six, he developed a metastatic lesion on the surface of his brain. In 1950, the cancer would have killed him in a matter of months. In 2006, a skilled neurosurgeon managed to scoop out the bulk of the tumor, enabling my grandfather to survive to a series of small strokes a full year later. Once again, these cerebral insults—as the medical chart termed them—would certainly have ended an octogenarian’s life in his own father-in-law’s generation. But after a two-month long hospital stay and tens of thousands of dollars in high tech imaging, modern anti-coagulants enabled Grandpa Leo to roll into a nursing home that he actually believed to be his mother’s apartment in prewar Antwerp. I visited him one afternoon and he announced how much he loved his wife—but he was actually referring to the young West African woman assigned to change his bed linens. It took two intubations, weeks on a ventilator, multiple courses of dialysis and a month of unconsciousness before my grandmother finally cried uncle and brought the process of her husband’s dying to a halt. By then, the man I’d worshiped as a child for his vigor and independence had gone nearly a half a year without responding to his own name. When Grandpa Leo died—after the best nursing care imaginable—his entire torso had become one enormous bedsore, his back and shoulders assuming the color of a side of tenderized beef. Is my grandfather’s longevity a triumph or a tragedy? On the one hand, I am grateful that I had an opportunity to know my grandfather well into my own adulthood—an opportunity that my father never had. On the other hand, faced with the prospect of following in my grandfather’s footsteps, I’d much rather drop dead in front of a firehouse at sixty.

by Jacob M. Appel, The Kenyon Review | Read more:

Arne Svenson - The Neighbors (2013)
via:

Debunking the Bunk Police

The parking lot after a Phish concert is a notoriously dirty drug scene. I used to feel less disturbed by it than I do now, but I was younger then and I think I glorified the drugs. Now all I see are nitrous-filled balloons that sell for twenty dollars apiece and alcoholics who will grab your ass and may or may not end up passing out in a stranger’s tent. When Phish plays the Gorge in George, Washington, and everyone camps overnight, the lot becomes a virtual no man’s land. The people that you enjoyed the concert with turn into zombies, wobbling around high on cat tranquilizers (Ketamine) or stalking the sunrise on a slow comedown from their long, winding acid trip.

At the Gorge this past July there were two Shakedown Streets, makeshift roads lined with people selling things: food, clothing, fairy wings, hula hoops, ceramic masks, stone jewelry—though it’s a destination best known for the endless array of mind-altering possibilities for sale or trade. Drug dealers will approach your campsite, trying to barter hash for tickets or a gram of Molly—MDMA, or ecstasy, the most popular drug—for around $100 cash. It’s a place where anything can be bought, sold or traded, and for years it operated according to mutual trust; when someone sold you a drug you assumed they weren’t trying to poison you, and that it was in fact what they claimed it to be. Though always illegal, LSD was in fact LSD, opium wasn’t black tar heroin, and you weren’t likely to get crack instead of the cocaine you were promised.

It’s over a mile’s hike from the Gorge concert venue back to the campground, and the re-acclimation from show to campsite can be as harsh as the unexpected flare of the fluorescent lights. I made the walk back with my friend Ashley, who I’d met a few weeks before at a Phish show in Saratoga Springs, New York. Ashley is beautiful, her parents are prominent government employees, and having recently graduated college, she decided to follow Phish that summer. That night she was wearing a long, button-down black tunic with a pattern of red roses and a $300 leather cowboy hat. I remember because it was my birthday, and though it had only been a few weeks, we were fast friends by the time we got to Washington. Walking back that night, we held a mutual dread of the nitrous hustlers we would have to pass, less because we’re opposed to nitrous itself than because in general it makes for a gross scene; they’re selling something most people want, at an absurdly inflated price, and with the Nitrous Mafia can come deaths, unnecessary aggression, and indifferent campsite neighbors who stay up until sunrise giggling and inhaling, keeping you awake with the hiss of their tank. Before we entered the campground (and thus the nitrous), we saw a booth situated at the entrance, the blur of a pink and purple jellyfish-like tent, all good vibes and chill tunes. We went closer, and there, nestled among food vendors, with no line and an austere aura, we found the Bunk Police, selling something entirely different: drug-testing kits.

We knew of the Bunk Police from Saratoga Springs, and also because they’d maintained a constant presence on the Phish tour all summer. They had been at other festivals, from the more mainstream Coachella, Bonnaroo, and Wakarusa, to obscure electronic gatherings like Lightning in a Bottle and Firefly. The anonymous organization, run by volunteers, preaches harm reduction through education about misrepresented substances. The kits vary depending on the kind of drug you’re testing, but in principle they’re all the same: you dissolve a minuscule amount of your substance in the chemicals provided in your kit (one is good for about 50-100 tests), and depending upon the color change, you know what drug you’re dealing with. The test kits are essentially the same as what a cop would use if he were trying to test someone’s drugs. The Bunk Police sell kits for $20 apiece to drug users so that they can increase their safety and call out fraudulent (or simply ignorant) dealers.

Jeffrey Bryan Chambers spent the summer following the Bunk Police and filming their experience for his forthcoming documentary What’s in my Baggie? Chambers and his crew traveled across the country shooting the Bunk Police’s work, largely through BP volunteers’ interactions with customers, and the campsites where most of the testing occurs (the BP only sell the kits; they leave the testing up to you). When I asked Chambers what he and his crew found doing this work, he hesitated: “It’s hard to say an actual percentage. It’s difficult to say or even quantify all the drugs even at a festival, say like Bonnaroo, where there’s upwards of 80,000 people camping in one area.” I pressed him to be more specific. “I can confidently say that from what we saw, over half of the substances were misrepresented, most commonly bath salts being sold as MDMA,” he said. Of the cocaine samples his crew tested over the summer, only one in over 30 cases even contained cocaine. As a population, we’ve been dealing with cocaine for decades, and when it’s cut, it’s most often with methamphetamine. Bath salts and other research chemicals that masquerade as Molly and LSD—many of which are legal and available in bulk on the Internet—pose a more serious threat.

by Kiran Herbert, The Weeklings | Read more:
Image: via:

Is Green the New Brown?

I do a lot of driving, most of it highly tedious. Two miles to the grocery store. Six miles to the mall. Twelve miles to work. The sort where every minute seems to count because the whole trip is so wearisome. In that context, it doesn’t take much to piss me off. I start stereotyping. Big pick-up trucks are driven by reckless assholes; European sedans by condescending elitists.

And then there are the bumper stickers, which can drive me batty even when I mostly agree with the political worldview they promote. Does the world really need another “Coexist” message? Or a faded reminder that the owner once believed that Barack Obama was a metonym for change?

Sometimes, though, the stars align to produce a juxtaposition so perverse that it takes my breath away. The other day I was cut off by a Toyota Prius that then proceeded to slam on the brakes, making me miss a crucial left-turn arrow while it rolled through the intersection on red.

I was incensed. The drivers of hybrids are notoriously self-righteous, practically begging everyone else to praise them for saving the world, even though the giant batteries that save them so much money are far from ecologically sound. But in my experience, Prius owners are particularly egregious in this regard.

But the Prius also seems to be the car of choice for overly cautious drivers, the way Volvos were in the 1970s. If I see one in front of me, I change lanes as soon as I can. It’s almost as bad as having a bus ahead of you.

by Charlie Bertsch, Souciant | Read more:
Image: Charlie Bertsch

Monday, May 5, 2014

A Living Wage

The only socialist city councillor in the United States is torn.

On the one hand, Kshama Sawant has claimed an “historic victory” for a populist campaign that pressured Seattle’s mayor, politicians and business owners to embrace by far the highest across-the-board minimum wage in the US at $15 an hour.

On the other, the economics professor accuses the Democratic party establishment and corporate interests of colluding to compromise its implementation as the city council on Monday begins to hammer out the terms for setting pay at more than double the federal minimum wage. Sawant is gearing up to put the issue on the ballot in November’s election if the final legislation is not to her liking – a move Seattle’s mayor has warned could result in “class warfare” as it is likely to pit big business against increasingly vocal low-paid workers and to divide the trade unions.

The Socialist Alternative party’s sole elected representative hailed the looming debate on the legislation as evidence of a growing backlash across the country against the wealthy getting ever richer while working people endure decades of stagnant wages and deepening poverty.

“The fact that the city council of a major city in the US will discuss in the coming weeks raising the minimum wage to $15 is a testament to how working people can push back against the status quo of poverty, inequality and injustice,” she said.

One third of Seattle residents earn less than $15 an hour. A University of Washington study commissioned by the council said the increase would benefit 100,000 people working in the city and reduce poverty by more than one quarter. The pay of full-time workers on today’s minimum wage would increase by about $11,000 a year.

Sawant can claim a good share of the credit for forcing the agenda. Seattle fast-food workers got the movement off the ground early last year in joining nationwide strikes and protests that began in New York. But the Socialist Alternative candidate helped put the $15 demand at the fore of Seattle’s politics by making it the centrepiece of an election campaign she began as a rank outsider against a Democratic incumbent. Sawant won in November with more than 93,000 votes – socialist views, strong denunciations of capitalism and the occasional quoting of Karl Marx evidently no longer an immediate bar to election in the US.

by Chris McGreal, Guardian |  Read more:
Image: Elaine Thompson/AP

Under The Volcano


Americans love Mexican food. We consume nachos, tacos, burritos, tortas, enchiladas, tamales and anything resembling Mexican in enormous quantities. We love Mexican beverages, happily knocking back huge amounts of tequila, mezcal and Mexican beer every year. We love Mexican people—as we sure employ a lot of them. Despite our ridiculously hypocritical attitudes towards immigration, we demand that Mexicans cook a large percentage of the food we eat, grow the ingredients we need to make that food, clean our houses, mow our lawns, wash our dishes, look after our children. As any chef will tell you, our entire service economy—the restaurant business as we know it—in most American cities would collapse overnight without Mexican workers. Some, of course, like to claim that Mexicans are “stealing American jobs”. But in two decades as a chef and employer, I never had ONE American kid walk in my door and apply for a dishwashing job, a porter’s position—or even a job as prep cook. Mexicans do much of the work in this country that Americans, provably, simply won’t do.

We love Mexican drugs. Maybe not you personally, but “we”, as a nation, certainly consume titanic amounts of them—and go to extraordinary lengths and expense to acquire them. We love Mexican music, Mexican beaches, Mexican architecture, interior design, Mexican films.

So, why don’t we love Mexico?

We throw up our hands and shrug at what happens and what is happening just across the border. Maybe we are embarrassed. Mexico, after all, has always been there for us, to service our darkest needs and desires. Whether it’s dress up like fools and get pass-out drunk and sunburned on Spring break in Cancun, throw pesos at strippers in Tijuana, or get toasted on Mexican drugs, we are seldom on our best behavior in Mexico. They have seen many of us at our worst. They know our darkest desires.

by Anthony Bourdain |  Read more:
Image: uncredited

This is What Comes After Search

The average person with an Android smartphone is using it to search the web, from a browser, only 1.25 times per day, says Roi Carthy, CEO of Tel Aviv-based mobile startup Everything.Me. That isn’t just bad news for Google, which still relies on ads placed along search results for the bulk of its revenue—it also signals a gigantic, fundamental shift in how people interact with the web. It’s a shift upon which fortunes will be made and lost.

Carthy knows how often people use search on Android because once you install his company’s Everything.Me software, it replaces the home screen on an Android smartphone with one that is uniquely customized to you. And then Everything.Me collects data on how often you search, plus a whole lot else, including where you are, where you go, which apps you use, the contents of your calendar, etc.

This kind of data collection is key to how Everything.Me works, and if Carthy and his investors, who have already sunk $37 million into his company, are right, it’s the sort of thing many other companies will be doing on smartphones, all in the name of bringing people what comes after search.

Context is the new search

We’re accustomed to turning on our phones and seeing the same set of icons in the same place every time. But Everything.Me upends this interface convention, and shows people different icons depending on the context in which they find themselves. For example, if Everything.Me knows you’re in a new city, it will show you apps that could aid your navigation in that city—like Uber and Lyft—even if you’ve never downloaded them before. Or, based on apps you and people like you have enjoyed in the past, Everything.Me will show you games and entertainment apps under an “I’m bored” tab. (Tabs for different pages full of apps are one way Everything.Me allows users to tell the phone even more about their current context.)

If it’s time to eat, Everything.Me will show you restaurants nearby you might enjoy, and if it’s time to go out, it will show you activities and hotspots you’re likely to want to check out.

Carthy says that, in contrast to the paltry number of times users of Everything.Me are searching the web each day, they’re engaging in context-based interactions with their customized home screens dozens of times a day.

In other words, in the old days, if you wanted to do something—navigate to the restaurant where you’ve got a dinner reservation—you might open a web browser and search for its address. But in the post-search world of context—in which our devices know so much about us that they can guess our intentions—your phone is already displaying a route to that restaurant, as well as traffic conditions, and how long it will take you to get there, the moment you pull your phone out of your pocket.
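
To make the post-search, context-driven model a bit more concrete, here is a minimal sketch in Python of the general idea: a launcher reduces raw signals (location, time of day, calendar) to a set of context tags, then scores apps by how well they match the current moment. Everything here, from the tag names to the weighting, is an illustrative assumption rather than anything taken from Everything.Me’s actual software.

```python
# Hypothetical sketch (not Everything.Me's real code or API): rank home-screen
# apps against context "tags" that would be derived from signals like location,
# time of day and calendar, as the article describes.
from dataclasses import dataclass
from typing import List, Set


@dataclass
class App:
    name: str
    tags: Set[str]          # contexts the app is relevant to
    past_use: float = 0.0   # prior engagement with this app, 0..1


def score(app: App, context_tags: Set[str]) -> float:
    """Higher score means more relevant to the current moment."""
    overlap = len(app.tags & context_tags)   # how many context tags match
    return overlap + 0.5 * app.past_use      # small tiebreak for familiar apps


def suggest(apps: List[App], context_tags: Set[str], k: int = 4) -> List[App]:
    """Pick the k apps to surface on the home screen right now."""
    return sorted(apps, key=lambda a: score(a, context_tags), reverse=True)[:k]


# Example: signals suggest the user just arrived in a new city at dinner time.
context = {"traveling", "mealtime"}
apps = [
    App("Uber", {"traveling"}),
    App("Yelp", {"mealtime"}),
    App("Candy Crush", {"bored"}, past_use=0.9),
    App("Maps", {"traveling", "mealtime"}, past_use=0.4),
]
for app in suggest(apps, context):
    print(app.name)   # Maps, Uber, Yelp, Candy Crush
```

In this toy example, a mapping app and a ride-hailing app outrank a heavily used game the moment the user lands in an unfamiliar city at dinner time, which is the inversion the article describes: relevance to the moment, not habit or a typed search, decides what the phone shows first.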

Most consumer tech giants are piling into context

Context-aware software for smartphones is all the rage among tech giants. In just the past year, Twitter bought Android home screen startup Cover, Apple bought smart assistant Cue, Yahoo bought Cover competitor Aviate, and of course Google has pioneered the field of learning everything about a person so that it can push data to them before they even know they need it, with its Google Now service.

Yahoo CEO Marissa Mayer has been especially explicit about what this new age of context means. “Contextual search aims to give people the right information, at the right time, by looking at signals such as where they’re located and what they’re doing—such as walking or driving a car,” she said at a recent conference. “Mobile devices tend to provide a lot more of those signals…. When I look at things like contextual search, I get really excited.”

Notice that Mayer said “contextual search” and not just “context.” That’s a nod to the fact that software designed to deliver information based on context is still using search engines to get that information; it’s just that the user doesn’t have to interact with the search engine directly.

by Christopher Mims, Quartz | Read more:
Image: Chris Pizzello/AP

White-Collar World

In or around the year 1956, the percentage of American workers who were "white collar" exceeded the percentage that were blue collar for the first time. Although labor statistics had long foretold this outcome, what the shift meant was unclear, and little theoretical work had prepared anyone to understand it. In the preceding years, the United States had quickly built itself up as an industrial powerhouse, emerging from World War II as the world’s leading source of manufactured goods. Much of its national identity was predicated on the idea that it made things. But thanks in part to advances in automation, job growth on the shop floor had slowed to a trickle. Meanwhile, the world of administration and clerical work, and new fields like public relations and marketing, grew inexorably—a paperwork empire annexing whole swaths of the labor force, as people exchanged assembly lines for metal desks, overalls for gray-flannel suits.

It’s hard to retrieve what this moment must have been like: An America that was ever not dominated by white-collar work is pretty difficult to recall. Where cities haven’t fallen prey to deindustrialization and blight, they have gentrified with white-collar workers, expelling what remains of their working classes to peripheries. The old factory lofts, when occupied, play host to meeting rooms and computers; with the spread of wireless technology, nearly every surface can be turned into a desk, every place into an office. We are a nation of paper pushers.

What it means to be a paper pusher, of course, seems to have changed dramatically (not least because actual paper isn’t getting carted around as much as it used to). The success of a show like Mad Men capitalizes on our sense of profound distance from the drinking, smoking, serial-philandering executive egos idolized in the era of the organization man. Many of the problems associated with white-collar work in midcentury—bureaucracy, social conformity, male chauvinism—have, if not gone away, at least come into open question and been seriously challenged. It would be hard to accuse the colorful, open, dog-friendly campuses of Silicon Valley of the beehivelike sameness and drabness that characterized so many 1950s offices, with their steno pools and all-white employees. On the surface, contemporary office life exudes a stronger measure of freedom than it ever did: More and more women have come to occupy higher rungs of the corporate ladder; working from home has become a more common reality, helping to give employees more ostensible control over their workday; people no longer get a job and stick with it, leading to more movement between companies.

At the same time, we are undergoing one of the most prolonged and agonizing desiccations of the white-collar, middle-class ideal in American history. Layoffs feel as common to late capitalist offices as they were to Gilded Age factories; freedom in one’s choice of workplace really reflects the abrogation of a company’s sense of loyalty to its employees; and insecurity has helped to enforce a regime of wage stagnation. In universities, the very phrase "academic labor" has become a byword for dwindling job protection. White-collar workers report experiencing higher levels of stress than their blue-collar counterparts do, and many work long hours without overtime pay. The increasingly darkening mood of frantic busyness—punctuated by bouts of desperate yoga—that has settled over American life owes much to the country’s overall shift to a white-collar world, where the rules resemble very little those of the world it left behind.

In other words, what the office has done to American life should be a topic of central importance. But there is still only one book, now more than 60 years old, that has tried to figure out what the new dominance of white-collar work means for society: White Collar: The American Middle Classes, by C. Wright Mills.

Few books inaugurate a field of study and continue to tower over it in the way White Collar has; its title alone is authoritative. It sums up and it commands. Even if we are not all white-collar workers now, white-collar work has become central to social life in ways so ubiquitous as to be invisible. Mills was practically the first to notice this and to explore its ramifications. His findings not only stand alone in the literature on the subject but loom over the others in their eerie prescience and power.

It helped his book that, as a personality, Mills, in his mid-30s when the book came out, was far from any dry middle-manager drone he analyzed, let alone the tweedy sonorousness of his Columbia colleagues Lionel Trilling and Jacques Barzun. Students who witnessed his arrival at class would see him dismount a motorcycle and adjust his leather jacket, lugging a duffel bag crammed with books that he would fling onto the seminar table. His unprofessorial style corresponded to an intellectual nonconformism. A scourge of the blandly complacent, "value neutral" social theory that formed the academic consensus of his day, Mills was also hostile to the orthodox Marxist accents that had been fashionable in the speech of the 1930s. Unfortunately, the dominance especially of the latter made it impossible to understand what class position white-collar workers belonged to, and what it meant. Under the most popular (or "vulgar") version of Marxism, the various strata of clerical and professional workers grouped under the heading "white collar" were supposed to dissolve eventually into the working class: In the terms of left-wing German sociology, they were a Stehkragen, or "stiff collar," proletariat.

Mills was unimpressed by all that. The more he looked at white-collar workers, the more he saw that their work made their lives qualitatively different from those of manual workers. Where manual workers exhibited relatively high rates of unionization—solidarity, in other words—white-collar workers tended to rely on themselves, to insist on their own individual capacity to rise through the ranks—to keep themselves isolated. The kind of work they did was partly rationalized, the labor divided to within an inch of its life. Mills constantly emphasized the tremendous growth of corporations and bureaucracies, the sheer massiveness of American institutions—words like "huge" and "giant" seem to appear on every page of his book. At the same time, so much of their work was incalculably more social than manual labor, a factor that particularly afflicted the roles afforded to female white-collar workers: Salesgirls had to sell their personalities in order to sell their products; women in the office were prized as much for their looks or demeanor as for their skills or capabilities.

What Mills realized was that, where backbreaking labor was the chief problem for industrial workers, psychological instability was the trial that white-collar workers endured, and on a daily basis.

by Nikil Saval, Chronicle of Higher Education |  Read more:
Image: David Plunkert for The Chronicle Review

All the World’s an App

I used to ask the internet everything. I started young. In the late 1980s, my family got its first modem. My father was a computer scientist, and he used it to access his computer at work. It was a silver box the size of a book; I liked its little red lights that told you when it was on and communicating with the world. Before long, I was logging onto message boards to ask questions about telescopes and fossils and plots of science fiction TV shows.

I kept at it for years, buying new hardware, switching browsers and search engines as needed. And then, around 2004, I stopped. Social media swallowed my friends whole, and I wanted no part of it. Friendster and Myspace and Facebook—the first great wave of social networking sites—all felt too invasive and too personal. I didn’t want to share, and I didn’t want to be seen.

So now, 10 years on, Facebook, iMessaging, and Twitter have passed me by. It’s become hard to keep up with people. I get all my news—weddings, moves, births, deaths—second-hand, from people who saw something on someone else’s feed. I never know what’s going on. In return, I have the vain satisfaction of feeling like the last real human being in a world of pods. But I am left wondering: what am I missing out on? And is everyone else missing out on something I still have?

Virginia Woolf famously said that on or about December 1910 human character changed. We don’t yet know if the same thing happened with the release of the iPhone 5—but, as the digital and “real” worlds become harder to distinguish from each other, it seems clear that something is shifting. The ways we interact with each other and with the world have altered. Yet the writing on this subject—whether it’s by social scientists, novelists or self-styled “internet intellectuals”—still doesn’t seem to have registered the full import of this transformation. (...)

The behaviour of teens online can be baffling. But are they really more “risk-averse,” “dependent,” “superficial” and “narcissistic” than kids in the past? And are they in danger in some new, hard-to-track way? Danah Boyd, a researcher at New York University and Microsoft, isn’t so sure. In It’s Complicated, her detailed new anthropological inquiry into the internet habits of American teenagers, she does much to dispel many of the alarmist myths that surround young people and social media.

Boyd has spent over a decade interviewing teens about their use of social media, and in the process has developed a nuanced feel for how they live their online lives. Throughout It’s Complicated, she shows teens to be gifted at alternating between different languages and modes of self-presentation, assuming different personas for different audiences and switching platforms (say, between Facebook and Twitter and Ask.fm) based on their individual interests and levels of privacy. She also suggests that many of the fears associated with teens and the internet—from bullying to addiction—are overblown. She argues convincingly, for instance, that “Social media has not radically altered the dynamics of bullying, but it has made these dynamics more visible to more people.”

Social media may not lead to more bullying or addiction, but it does create lots of drama. Boyd and her sometime-collaborator Alice Marwick define drama as “performative, interpersonal conflict that takes place in front of an active, engaged audience, often on social media.” Essentially, “drama” is what keeps school from being boring, and what makes it such hell. It’s also the reason teenagers spend so much time online. The lure isn’t technology itself, or the utopian dream of a space in which anyone could become anything, which drew many young people to the internet in its early bulletin-board and newsgroup days; it’s socialising. Teens go online to “be with friends on their own terms, without adult supervision, and in public”—and Boyd argues that this is now much more difficult than it used to be. She portrays the US as a place in which teens are barred from public spaces such as parks and malls, and face constant monitoring from parents, teachers and the state. This is a paranoid country, in which parents try to channel all their children’s free time into structured activities and are so afraid of predators that they don’t allow their children outside alone. In this “culture of fear” social media affords teens one of their few avenues for autonomous expression.

Parents never understand; but Boyd makes the case that adult cluelessness about the multiple uses teens find for social media—everything from sharing jokes to showing off for university recruiters—can be especially harmful now. She tells the story of a teenager from south central Los Angeles who writes an inspiring college entrance essay about his desire to escape his gang-ridden neighbourhood. But when admissions officers at the Ivy League university to which he’s applying Google him, they are shocked to discover that his MySpace profile is filled with gang symbolism and references to gang activities. They do not consider that this might be a survival strategy instead of a case of outright deception.

by Jacob Mikanowski, Prospect |  Read more:
Image: uncredited