Monday, August 25, 2014

Did Congestion Charging Just Go Viral?

[ed. I'd never heard of congestion charging until today. Sounds like a pretty hard sell.]

Congestion charging, or congestion pricing, is the practice of setting up cordon tolls around a city on a large scale to charge drivers for entering during peak hours. Ideally this is done automatically, with cameras registering your license plate and billing you directly. It is different from a low-emissions zone, which is a specific area that restricts which types of vehicles can enter, and when.

City-scale congestion charging is picking up steam as a policy tool to free cities from crippling traffic. Singapore led the way starting in 1975, and London, Milan, and Stockholm have since followed suit. In 2008, Michael Bloomberg, then the mayor of New York City, led a valiant but ultimately doomed effort to install congestion charging around Manhattan. However, despite New York’s setback and otherwise sporadic progress, three news items make me wonder if congestion pricing is reaching a tipping point:

First, despite New York’s failed attempt, it looks as if a bottom-up plan could revive the city’s efforts. With crippling congestion and underfunded transit projects, New Yorkers are starting to rally to the cause. The key to success this time might be better consultation and more community engagement. So far so good.

Second, Stockholm’s initially shaky congestion pricing plan is now considered an unobtrusive part of everyday life. In fact, its popularity spurred Gothenburg to adopt a similar scheme, and there are now proposals to extend the system to all major Swedish cities [in Swedish].

Finally, we turn to the mother lode of traffic: China. Not only have Beijing and Shanghai studied the possibility of congestion charging for a while now, it appears that Beijing will institute it next year, using its many ring roads to its advantage.

by Tali Trigg, Scientific American | Read more:
Image: Stockholm Transport Styrelsen.

Sunday, August 24, 2014


Xiao Wen Ju fronts the Lane Crawford Spring/Summer 2014 Campaign
via:

Mutablend (on Flickr), No Communication No Love
via:

Every Insanely Mystifying Paradox in Physics: A Complete List


Today’s brain-melter: Every Insanely Mystifying Paradox in Physics. It’s all there, from the Greisen-Zatsepin-Kuzmin limit to quantum immortality to, of course, the tachyonic antitelephone.
A tachyonic antitelephone is a hypothetical device in theoretical physics that could be used to send signals into one’s own past. Albert Einstein in 1907 presented a thought experiment of how faster-than-light signals can lead to a paradox of causality, which was described by Einstein and Arnold Sommerfeld in 1910 as a means “to telegraph into the past”.
If you emerge with your brain intact, at the very least, you’ll have lost a couple of hours to the list.

by Cliff Pickover, Sprott Physics, Univ. of Wisconsin | Read more:
via: Kottke.org

Saturday, August 23, 2014

R.E.M.


[ed. Repost. This disappeared for a while but is now back up on YouTube. Great to see them so young and going for it.]

The Best Drones

Of the dozens of drones aimed at the aspiring aerial photographer/videographer, the $1,300 DJI Phantom 2 Vision+ is the one we recommend for most people: it’s the only one that is easy to control while also offering great battery life and range, terrific safety features, and a smartphone app that shows you what the on-drone camera sees, making both photography and piloting easier.

It was the obvious favorite going into this guide due to its numerous editorial accolades and positive user reviews, so I tried my best to find something better. But after over 25 hours of research, 10 hours of interviews with experts, and half a day of hands-on testing against its closest competition (on top of over 100 previous drone flights of my own), I had to agree with the crowd. Nothing comes close to the Phantom 2 Vision+. Its combination of ease-of-use and advanced features simply can’t be matched by anything currently available.

In addition to being easy to fly, the V+ comes equipped with a relatively high-quality camera that’s almost as good as the GoPro Hero 3+ (rare), a three-axis gimbal for image stabilization (rare), and a Wi-Fi extender that gives you the ability to see real-time stats and what you’re shooting from over 2,000 feet away on a smartphone you mount to your radio controller (also rare). It also has pre-programmed flight controls tailored to beginners and advanced pilots, a standout 2,000-foot range, a battery that lasts a stellar 25 minutes instead of the usual 12, the ability to fly autonomously (thanks to a recently announced Ground Station function), and the standard safety setting that prompts the drone to return to the launch pad if it loses connection with the radio transmitter.

In other words, it’s exactly what you’d want, expect, and need from a camera drone.

You can get most of these features on other drones if you have the technical know-how and are willing to figure things out, tools in hand, but unless you’re into tinkering for tinkering’s sake, it’s just not worth the time.

$1,300 sounds like a lot of money, but the Phantom 2 Vision+ (henceforth referred to as the “V+”) is a surprisingly great value if you run the numbers: in order to get similar capabilities from a cheaper drone using aftermarket parts, you’d have to spend over $1,500 and futz with the inside wiring of your drone. And you’d still wind up with lesser capabilities.

That said, $1,300 is a lot of money to spend on a thing that you could crash on its maiden voyage. If you’re unfamiliar with how to fly drones or just need to fine-tune your skills (and who doesn’t?), we highly suggest getting a cheapo trainer drone before putting your $1,300 investment aloft. For that, we recommend the highly touted $90 Blade Nano QX. It’s essentially a palm-sized quadcopter without the camera and fancy features like GPS-assisted position hold.

If you’re already confused by the terms in this guide, we’ve got you covered with a glossary. We explain any technical terms we use, but other sites don’t; we definitely recommend keeping it handy if you’re planning on clicking through to our sources.

by Eric Hansen, Wirecutter |  Read more:
Images: DJI Phantom 2 Vision+ and Blade Nano QX

John Piper, Eye and Camera: Red, Blue and Yellow (1980)
via:

The Truth We Won’t Admit: Drinking Is Healthy

Bob Welch, former star Dodgers pitcher, died in June from a heart attack at age 57. In 1981, Welch published (with George Vecsey) Five O’Clock Comes Early: A Cy Young Award-Winner Recounts His Greatest Victory, in which he detailed how he became an alcoholic at age 16: “I would get a buzz on and I would stop being afraid of girls. I was shy, but with a couple of beers in me, it was all right.”

In his early 20s, he recognized his “disease” and quit drinking. But I wonder whether, like most 20-something problem drinkers (as all epidemiological research shows), he would otherwise have outgrown his excessive drinking and gone on to drink moderately.

If he had, he might still be alive. At least, that’s what the odds say.

Had Welch smoked, his obituaries would have mentioned it by way of explaining how a world-class athlete might have died prematurely of heart disease. But no one would dare suggest that quitting drinking might be responsible for his heart attack.

In fact, the evidence that abstinence from alcohol is a cause of heart disease and early death is irrefutable—yet this is almost unmentionable in the United States. Even as health bodies like the CDC and the Dietary Guidelines for Americans (prepared by Health and Human Services) now recognize the decisive benefits from moderate drinking, each such announcement is met by an onslaught of opposition and criticism, and is always at risk of being reversed.

Noting that even drinking at non-pathological levels above recommended moderate limits gives you a better chance of a longer life than abstaining draws louder protests still. Yet that’s exactly what the evidence tells us. (...)

Given the multitude of studies of the effects of alcohol on mortality (since heart disease is the leading killer of men and women, drinking reduces overall mortality significantly), meta-analyses combining the results of the best-designed such studies can be generated. In 2006, the Archives of Internal Medicine, an American Medical Association journal, published an analysis based on 34 well-designed prospective studies—that is, research which follows subjects for years, even decades. This meta-analysis, incorporating a million subjects, found that “1 to 2 drinks per day for women and 2 to 4 drinks per day for men are inversely associated with total mortality.”

So the more you drink—up to two drinks a day for women, and four for men—the less likely you are to die. You may have heard that before, and you may have heard it doubted. But the consensus of the science is overwhelming: It is true.

Although I dispute many of the caveats offered against the life-saving benefits of alcohol, I will endorse two. First, these outcome data do not apply to women with the “breast-cancer gene” mutations (BRCA 1 or 2) or a first-degree (mother, sister) relation who has had breast cancer, for whom alcohol consumption is far riskier. Second, drinking 10 drinks Friday and Saturday nights does not convey the benefits of two or three drinks daily, even though your weekly totals would be the same: Frequent, heavy binge drinking is unhealthy. But then you knew that already, didn’t you? If you don’t distinguish binge drinking from daily moderate drinking, that would be due to Americans’ addiction-phobia, which causes them to interpret any daily drinking as addictive.

The global summary of alcohol’s benefits raises a key question: How much do you have to drink regularly before you become as likely to die as an abstainer? We’ll see below.

by Stanton Peele, Pacific Standard |  Read more:
Image: Ben Hussman/Flickr

William Kendall, Wipe Out (1998)
via:

How Surf Mania Was Invented

John Severson's path toward becoming surfing’s first editor-in-chief began with a stroke of good fortune. Upon being drafted into the US Army in 1956, Severson was told that he would serve out his active duty in Germany. However, after another draftee failed his Morse code exam, Severson’s presence was required elsewhere and he received new orders: “You’re going to Hawaii.”

To his surprise, Severson’s fellow troops were not charmed by the thought of spending two years in the middle of the Pacific. Almost fifty years later, he still recalls their complaints: “We’re going to Hawaii? There’s nothing to do there.” But for Severson, nothing could have been further from the truth. Like every other California surfer of the ’50s, he had grown up riding a redwood board and dreaming of Hawaii’s gargantuan waves.

While working for the Army as an illustrator, Severson was encouraged to surf daily as a member of the US Army Surf Team and fell in with a generation of surfers who pioneered new techniques in big-wave riding. After choosing to stay in Hawaii, he began selling his drawings and paintings on the beach, eventually acquiring the 16mm camera he used to make his notorious surf films.

Filled with DIY exuberance, Severson’s films of the early 60s were created as a way of celebrating the energy of surfing. Citing Leni Riefenstahl’s Olympia as an influence, Severson oriented his surf footage around the formal splendor of the body in motion. Captured in high contrast black-and-white, the remaining stills of these films depict bodies contorted and flexed against the enormous force of an ink-black ocean. Composed like scenes from another dimension, Severson’s films communicated the ineluctable verve of what was then a niche pastime, sending an invitation to those who had never surfed.

After witnessing the riot-like excitement at his screenings, Severson decided to produce a booklet called The Surfer that he sold during the premieres of his 1960 film Surf Fever. It featured black-and-white photos, writing, and cartoons, as well as surf maps and instructional articles for new surfers. After the booklet’s five thousand copies sold out, Severson decided to dedicate his attention to creating the magazine now known as SURFER.

As its title suggested, SURFER was a publication that aimed to express the culture of the person who surfed, rather than the sport itself. With hordes of newcomers being brought to surfing by the Beach Boys and Hollywood films such as Gidget, SURFER was on a mission to set the record straight. Its editorial program defined surfing as a way of looking out onto the world, an all-encompassing lifestyle that carried its own social responsibilities.

Since it was the first magazine of its kind, SURFER gave Severson the freedom to shape what has since become a massive genre. As surf writer Sam George would later say, “Before John Severson, there was no ‘surf media,’ no ‘surf industry,’ no ‘surf culture’ – not at least in the way we understand it today.”

by 032c |  Read more:
Image: Greg Noll at Pipeline, 1964. John Severson

What is the Great American Novel?

In Tracy Letts’s play Superior Donuts (2010), Arthur, a bakery owner, is presented with a bundle of notebooks by a new employee, Franco, who explains that they contain “the Great American Novel, my man. Authored by yours truly”. Franco attributes Arthur’s scepticism about this claim to racism: “You think I can’t write the Great American Novel ’cause I’m a black man”. Lawrence Buell’s study of the concept of the “Great American Novel” (or “GAN” as Buell, using Henry James’s acronym, calls it) explains Arthur’s reaction. Before the mid-twentieth century, only one critic believed that a GAN could be written by someone who was not white, and Buell’s survey of literature from Washington Irving to Jonathan Franzen suggests there has been little change since in perceptions of literary greatness.

One of the central claims of The Dream of the Great American Novel is that novels are uniquely well suited to the task of representing what is quintessentially American because they are “carriers and definers of evolving ‘national imaginaries’”. This long and detailed study considers the works of fiction that have, at various times, been deemed contenders for the crown, and attempts to explain why. Buell is also interested in those books that were lauded in their own moment but went out of fashion, and others whose merits are more evident with the benefit of hindsight. Along with exploring the nation’s most distinguished fiction, Buell considers why America might “dream” of locating the single novel that best expresses Americanness. He acknowledges the paradox that, although the concept of the Great American Novel seems to articulate “national swagger”, the most praised GANs are “anything but patriotic”. Rather, they convey “national self-criticism”, typically on the grounds of social inequality.

Introduced in print by John W. De Forest in January 1868, the phrase “Great American Novel” had already been used by P. T. Barnum to mock publishers for puffing their latest books, Buell writes, confirming that the GAN is at least as much a marketing device as a reliable measure of literary merit. The first novel to be named “the Greatest Book of the Age” was Uncle Tom’s Cabin (1852) by Harriet Beecher Stowe, which supposedly provoked the “great war” that ended slavery in the Southern states. For this reason, Buell deems it the preeminent American example of activist art: it “changed the world” and so its status endures despite criticism of its depiction of black people. It also shows that it is possible for a GAN to be written by a woman, although critical consensus suggests that hardly any have been. The heyday of serious debate over the Great American Novel ran from the 1860s to the 1920s, when the promise of the American Dream was equally prominent. After The Great Gatsby (1925) killed the Dream, along with its hero, interest in pinpointing GANs waxed and waned in popularity, perhaps because an increasingly heterogeneous nation found it hard to believe that a single novel – even a very long one – could represent America in all its variety.

Since the function of the GAN is to represent Americanness, Buell proposes that its aims are best fulfilled by a body of work rather than a single novel. The key works he identifies are famous and familiar. Rather than discussing these books in order of publication, Buell arranges them according to four themes, or “scripts”. This decision refreshes critical debate by showing how novels are “in conversation” with each other, and leads to some intriguing comparisons, such as pairing William Faulkner’s Absalom, Absalom! with Margaret Mitchell’s Gone with the Wind, both published in 1936.

by Sarah Graham, TLS |  Read more:
Image: Moby Dick

Friday, August 22, 2014

The Teddybears

Bang Data

What a Difference a Day Makes

Before a romantic Caribbean weekend with her new boyfriend, Amanda Sanders decided she needed a little lift. So she called her doctor, Dr. Norman M. Rowe, to help out.

Dr. Rowe, a plastic surgeon in Manhattan, offers a quick fix — temporary breast enlargement. Instead of surgery, he injects a saline solution into the breasts, which briefly expands them.

The procedure began as a way for women seeking breast enhancement to determine how they might look if they chose surgery. “We can take pictures and put them on computers, but those are sometimes unrealistic and can lead to false expectations,” Dr. Rowe said (giving new meaning, perhaps, to the term “falsies”). “So we said, if patients are unsure if they want implants, let’s put saline in the breast and let them live with it for 24 hours to see how they like it.”

It may come as no surprise that the injections were soon being requested as pick-me-ups for parties, weddings, bar mitzvahs, red-carpet events or, as with Ms. Sanders, a tropical vacation.

Ms. Sanders, 41, an image consultant in New York and a mother of two, had been toying with the idea of a breast lift to enhance her “very shallow C cup,” but she was a little reluctant. When she heard of the temporary saline option (cost: $3,500), she leapt at the chance. Twice.

“It was worth it,” she said. “I could wear halter tops and a string bikini and feel really sexy. I’m in the business of vanity. As an image consultant, I have to look the part and be the part.”

While “lunchtime lifts” using injectable fillers similar to Restylane or Juvéderm are available in Europe, they are not F.D.A.-approved in the United States. Macrolane, another filler, was banned in Britain as a breast injectable because it was thought to cloud mammogram readings, among other complications. Saline is essentially saltwater that is absorbed into the bloodstream in about 24 hours.

Breast enhancement surgeries are decidedly popular in the United States. According to the American Society for Aesthetic Plastic Surgery, 313,327 breast augmentations and 137,233 breast lifts were performed in 2013. A noninvasive procedure like a saline injection would seem to be just what the doctor ordered.

by Abby Ellin, NY Times |  Read more:
Image: Caryn Posnansky

The Art of Sprezzatura


[ed. See also: America's Message: At Ease Men]

Sprezzatura is a word bandied around the world of fashion and style with reckless abandon. Much like other trademark words we use, such as ‘essential’, ‘classic’ and ‘steez’ (which I hate, by the way), sprezzatura has come to mean many different things, thanks to general overuse and a lack of knowledge about the subject. Luckily, this is where FashionBeans comes in.

Regardless of whether you’re a beginner to the world of menswear or a seasoned pro, we can all benefit from a quick reminder of what sprezzatura really means and how it can be utilised in your own style. It’s an Italian word that first shows up in The Book of the Courtier by Baldassare Castiglione, where it is defined as: ‘a certain nonchalance, so as to conceal all art and make whatever one does or says appear to be without effort and almost without any thought about it.’ This essentially boils down to making difficult actions look easy while concealing the conscious effort that went into them. Or as Yeezy would say – ‘What? This old thing?’

So it boils down to making it seem like you don’t care then? Well, sort of. The easiest translation of sprezzatura is ‘artful dishevelment’ and there is a fine line between achieving it and simply being sloppy.

by Matt Allinson, FashionBeans |  Read more:
Image: uncredited

Wednesday, August 20, 2014


Antoni Clavé (Spanish, 1913–2005), Table Aux Fruits, 1966.
via:

America in Decay: The Sources of Political Dysfunction

The story of the U.S. Forest Service is not an isolated case but representative of a broader trend of political decay; public administration specialists have documented a steady deterioration in the overall quality of American government for more than a generation. In many ways, the U.S. bureaucracy has moved away from the Weberian ideal of an energetic and efficient organization staffed by people chosen for their ability and technical knowledge. The system as a whole is less merit-based: rather than coming from top schools, 45 percent of recent new hires to the federal service are veterans, as mandated by Congress. And a number of surveys of the federal work force paint a depressing picture. According to the scholar Paul Light, “Federal employees appear to be more motivated by compensation than mission, ensnared in careers that cannot compete with business and nonprofits, troubled by the lack of resources to do their jobs, dissatisfied with the rewards for a job well done and the lack of consequences for a job done poorly, and unwilling to trust their own organizations.”

WHY INSTITUTIONS DECAY

In his classic work Political Order in Changing Societies, the political scientist Samuel Huntington used the term “political decay” to explain political instability in many newly independent countries after World War II. Huntington argued that socioeconomic modernization caused problems for traditional political orders, leading to the mobilization of new social groups whose participation could not be accommodated by existing political institutions. Political decay was caused by the inability of institutions to adapt to changing circumstances. Decay was thus in many ways a condition of political development: the old had to break down in order to make way for the new. But the transitions could be extremely chaotic and violent, and there was no guarantee that the old political institutions would continuously and peacefully adapt to new conditions. (...)

The very stability of institutions, however, is also the source of political decay. Institutions are created to meet the demands of specific circumstances, but then circumstances change and institutions fail to adapt. One reason is cognitive: people develop mental models of how the world works and tend to stick to them, even in the face of contradictory evidence. Another reason is group interest: institutions create favored classes of insiders who develop a stake in the status quo and resist pressures to reform. (...)

Political decay thus occurs when institutions fail to adapt to changing external circumstances, either out of intellectual rigidities or because of the power of incumbent elites to protect their positions and block change. Decay can afflict any type of political system, authoritarian or democratic. And while democratic political systems theoretically have self-correcting mechanisms that allow them to reform, they also open themselves up to decay by legitimating the activities of powerful interest groups that can block needed change.

This is precisely what has been happening in the United States in recent decades, as many of its political institutions have become increasingly dysfunctional. A combination of intellectual rigidity and the power of entrenched political actors is preventing those institutions from being reformed. And there is no guarantee that the situation will change much without a major shock to the political order.

by Francis Fukuyama, Foreign Affairs |  Read more:
Image: Max Whittaker/ Reuters

Tuesday, August 19, 2014


Uta Barth

The Teaching Class

When Mary Margaret Vojtko died last September—penniless and virtually homeless and eighty-three years old, having been referred to Adult Protective Services because the effects of living in poverty made it seem to some that she was incapable of caring for herself—it made the news because she was a professor. That a French professor of twenty-five years would be let go from her job without retirement benefits, without even severance, sounded like some tragic mistake. In the Pittsburgh Post-Gazette op-ed that broke the story, Vojtko’s friend and attorney Daniel Kovalik describes an exchange he had with a caseworker from Adult Protective Services: “The caseworker paused and asked with incredulity, ‘She was a professor?’ I said yes. The caseworker was shocked; this was not the usual type of person for whom she was called in to help.” A professor belongs to the professional class, a professor earns a salary and owns a home, probably with a leafy yard, and has good health insurance and a retirement account. In the American imagination, a professor is perhaps disheveled, but as a product of brainy eccentricity, not of penury. In the American university, this is not the case.

Most university-level instructors are, like Vojtko, contingent employees, working on a contract basis year to year or semester to semester. Some of these contingent employees are full-time lecturers, and many are adjunct instructors: part-time employees, paid per class, often without health insurance or retirement benefits. This is a relatively new phenomenon: in 1969, 78 percent of professors held tenure-track positions. By 2009 this percentage had shrunk to 33.5. The rest of the professors holding jobs—whether part time or full time—do so without any job security. These are the conditions that left Vojtko in such a vulnerable position after twenty-five years at Duquesne University. Vojtko was earning between $3,000 and $3,500 per three-credit course. During years when she taught three courses per semester, and an additional two over the summer, she made less than $25,000, and received no health benefits through her employer. Though many universities limit the number of hours that adjunct professors can work each semester, keeping them nominally “part-time” employees, teaching three three-credit courses is certainly a full-time job. These circumstances are now the norm for university instructors, as the number of tenured and tenure-track positions shrinks and the ranks of contingent laborers swell.

A moment of full disclosure: I am an adjunct. I taught freshman composition at Columbia University for two years as a graduate student, then for a few semesters more as an adjunct after I finished my degree. I now tutor in a writing center in the City University of New York system. Many of my friends do this same kind of work at colleges around New York City, commuting from campus to campus, cobbling together more-than-full-time work out of multiple part-time jobs. We talk a lot about how to make adjuncting livable, comparing pay rates at different writing centers and English departments. We crowdsource answers to questions about how to go to the dentist, for example, since none of us has dental insurance—wait for a Groupon for a cleaning, or go to the student dentists at NYU for anything urgent. I do have health insurance at my current job, though I get an email a few times per year informing me that it may expire soon because negotiations between the union and the university over adjunct health insurance have stalled. This is mostly fine—my coverage has never actually been interrupted—but it is hard to swallow the notion that the university that employs me is constantly trying to get out of providing health insurance to teachers, particularly when it announces that it is giving our new chancellor an $18,000/month apartment for free.

So I have closely followed the news and op-ed coverage of the adjunct bubble that followed Vojtko’s death. And while I have been glad to see more attention being paid to the working conditions in higher education, I’ve been surprised that the issue is consistently framed as purely a workers’ rights problem. It is this, of course. But it is not only this.

by Rachel Riederer, Guernica | Read more:
Image: Zeke Berman