Tuesday, August 30, 2016

Can We Save Venice Before It’s Too Late?

[ed. Short answer, it's already too late (as far as industrial tourism is concerned). Longer term question is how much of Venice will still be above water in 20 years?]

A deadly plague haunts Venice, and it’s not the cholera to which Thomas Mann’s character Gustav von Aschenbach succumbed in the Nobel laureate’s 1912 novella “Death in Venice.” A rapacious tourist monoculture threatens Venice’s existence, decimating the historic city and turning the Queen of the Adriatic into a Disneyfied shopping mall.

Millions of tourists pour into Venice’s streets and canals each year, profoundly altering the population and the economy, as many native citizens are banished from the island city and those who remain have no choice but to serve in hotels, restaurants and shops selling glass souvenirs and carnival masks.

Tourism is tearing apart Venice’s social fabric, cohesion and civic culture, growing ever more predatory. The number of visitors to the city may rise even further now that international travelers are avoiding destinations like Turkey and Tunisia because of fears of terrorism and unrest. This means that the 2,400 hotels and other overnight accommodations the city now has no longer satisfy the travel industry’s appetites. The total number of guest quarters in Venice’s historic center could reach 50,000 and take it over entirely.

Just along the Grand Canal, Venice’s main waterway, the last 15 years have seen the closure of state institutions, judicial offices, banks, the German Consulate, medical practices and stores to make way for 16 new hotels.

Alarm at this state of affairs led to last month’s decision by the United Nations Educational, Scientific and Cultural Organization to place Venice on its World Heritage in Danger list unless substantial progress to halt the degradation of the city and its ecosystem is made by next February. Of the more than 1,000 sites on its list, Unesco has so far stripped only one city of its heritage status: Dresden, delisted in 2009 after German authorities ignored Unesco’s recommendations against building a bridge over the River Elbe that marred the Baroque urban ensemble. Will Venice be next to attain this ignominious status?

In its July report, Unesco’s committee on heritage sites expressed “extreme concern” about “the combination of ongoing transformations and proposed projects threatening irreversible changes to the overall relationship between the City and its Lagoon,” which would, in its thinking, erode the integrity of Venice.

Unesco’s ultimatum stems from several longstanding problems. First, there is the growing imbalance between the number of the city’s inhabitants (which plummeted from 174,808 in 1951 to 56,311 in 2014, the most recent year for which numbers are available) and the number of tourists. Second, proposed large-scale development, including new deepwater navigation channels and a subway running under the lagoon, would hasten erosion and strain the fragile ecological-urban system that has grown up around Venice.

For now, gigantic cruise liners regularly parade in front of Piazza San Marco, the city’s main public square, mocking the achievements of the last 1,500 years. To mention but one, the M.S.C. Divina is 222 feet high, twice as tall as the Doge’s Palace, a landmark of the city that was built in the 14th century. At times, a dozen liners have entered the lagoon in a single day.

by Salvatore Settis, NY Times |  Read more:
Image: Venice, uncredited

Monday, August 29, 2016

How to Make Omurice (Japanese Fried Rice Omelette)


There's a video on YouTube that I've watched several times over the past couple of years. In it, a chef in Kyoto makes a plate of omurice with a deftness and perfection of technique that may be unrivaled. He starts by frying rice in a carbon steel skillet, tossing it every which way until each grain is coated in a sheen of demi-glace and oil. Then he packs it into an oval mold and turns it out in a tight mound on a plate.

He then proceeds to make what is perhaps the greatest French omelette ever executed, cooking it in that same perfectly seasoned carbon steel skillet, stirring the egg with chopsticks, rolling it up, gently tossing it, rotating it, and finally tipping it out of the pan onto that mound of rice. He then grabs a knife and slices through the top of the omelette from end to end, unfurling it in a custardy cascade of soft-cooked egg curds. It's an act of such prowess, such beauty, such tantalizing food-porniness that it's easy to conclude there's no hope of ever making such a dish at home.

And that's where I want to step in. Because you absolutely can and should make this at home. I realized this while watching a cook make omurice on a trip to Japan back in July (my travel and lodging were paid for by the Tokyo Convention & Visitors Bureau). The cook was working with a flat griddle, not a carbon steel skillet. He fried the rice on that griddle, and, after mounding it on a plate, made the omelette on the griddle, too. Except that it wasn't a true rolled omelette. Instead, he poured the beaten eggs into a round on the griddle...and that was it. As soon as the eggs were set on the bottom and just slightly runny on top, he lifted the round with a couple of spatulas and set it down over the rice.

As fun as it is to master a French omelette, in this particular case, it's an unnecessary flourish that—while it makes for great showmanship—does little to improve the final dish, since you end up unrolling the omelette anyway. By not bothering to roll the omelette in the first place, you sidestep the entire technical challenge.

For those unfamiliar with omurice, it's a Japanese invention that combines an omelette with fried rice. You'll often hear it referred to as omuraisu (a contraction of the words "omuretsu" and "raisu," the Japanese pronunciations of "omelette" and "rice"), or omumeshi, which fully translates "rice" into Japanese. Some versions have the rice rolled up in the omelette; you can watch the very same Kyoto chef do that here.

by Daniel Gritzer, Serious Eats |  Read more: 
Image: YouTube

Yes We Scan

It’s the dead of winter in Stockholm and I’m sitting in a very small room inside the very inaptly named Calm Body Modification clinic. A few feet away sits the syringe that will, soon enough, plunge into the fat between my thumb and forefinger and deposit a glass-encased microchip roughly the size of an engorged grain of rice.

“You freaking out a little?” asks Calm’s proprietor, a heavily tattooed man named Chai, as he runs an alcohol-soaked cotton swab across my hand. “It’s all right. You’re getting a microchip implanted inside your body. It’d be weird if you weren’t freaking out a little bit.” Of Course It Fucking Hurts!, his T-shirt admonishes in bold type.

My choice to get microchipped was not ceremonial. It was neither a transhumanist statement nor the fulfillment of a childhood dream born of afternoons reading science fiction. I was here in Stockholm, a city that’s supposedly left cash behind, to see out the extreme conclusion of a monthlong experiment to live without cash, physical credit cards, and, eventually, later in the month, state-backed currency altogether, in a bid to see for myself what the future of money — as is currently being written by Silicon Valley — might look like.

Some of the most powerful corporations in the world — Apple, Facebook, and Google; the Goliaths, the big guys, the companies that make the safest bets and rarely lose — are pouring resources and muscle into the payments industry, historically a complicated, low-margin business. Meanwhile, companies like Uber and Airbnb have been forced to become payments giants themselves, helping to facilitate and process millions of transactions (and millions of dollars) each day. A recent report from the auditor KPMG revealed that global investment in fintech — financial technology, that is — totaled $19.1 billion in 2015, a 106% jump compared to 2014; venture capital investment alone nearly quintupled between 2012 and last year. In 2014, Americans spent more than $3.68 billion using tap-to-pay tech, according to eMarketer. In 2015, that number was $8.71 billion, and in 2019, it’s projected to hit $210.45 billion. As Apple CEO Tim Cook told (warned?) a crowd in the U.K. last November, “Your kids will not know what money is.”
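
[ed. A quick back-of-the-envelope check on those figures. The Python below is my own arithmetic from the numbers quoted above, not anything published by KPMG or eMarketer; all amounts are in billions of dollars.]

# Sanity-checking the growth figures quoted in the piece.
fintech_2015 = 19.1                   # global fintech investment, 2015 (KPMG)
fintech_2014 = fintech_2015 / 2.06    # back out 2014 from "a 106% jump"
print(f"Implied 2014 fintech investment: ${fintech_2014:.1f}B")   # ~$9.3B

tap_2015 = 8.71                       # US tap-to-pay spending, 2015 (eMarketer)
tap_2019 = 210.45                     # eMarketer's 2019 projection
cagr = (tap_2019 / tap_2015) ** (1 / 4) - 1   # four years of compound growth
print(f"Implied annual growth rate, 2015-2019: {cagr:.0%}")       # ~122%

In other words, the projection has tap-to-pay spending more than doubling every year for four straight years.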

To hear Silicon Valley tell it, the broken-in leather wallet is on life support. I wanted to pull the plug. Which is how, ultimately, I found myself in this sterile Swedish backroom staring down a syringe the size of a pipe cleaner. I was here because I wanted to see the future of money. But really, I just wanted to pay for some shit with a microchip in my hand.
------
The first thing you’ll notice if you ever decide to surrender your wallet is how damn many apps you’ll need in order to replace it. You’ll need a mobile credit card replacement — Apple Pay or Android Pay — for starters, but you’ll also need person-to-person payment apps like Venmo, PayPal, and Square Cash. Then don’t forget the lesser-knowns: Dwolla, Tilt, Tab, LevelUp, SEQR, Popmoney, P2P Payments, and Flint. Then you might as well embrace the cryptocurrency of the future, bitcoin, by downloading Circle, Breadwallet, Coinbase, Fold, Gliph, Xapo, and Blockchain. You’ll also want to cover your bases with individual retailer payment apps like Starbucks, Walmart, USPS Mobile, Exxon Speedpass, and Shell Motorist, to name but a few. Plus public and regular transit apps — Septa in Philadelphia, NJ Transit in New Jersey, Zipcar, Uber, Lyft. And because you have to eat and drink, Seamless, Drizly, Foodler, Saucey, Waitress, Munchery, and Sprig. The future is fractured.

This isn’t lost on Bryan Yeager, a senior analyst who covers payments for eMarketer. “This kind of piecemeal fragmentation is probably one of the biggest inhibitors out there,” he said. “I’ll be honest: It’s very confusing, not just to me, but to most customers. And it really erodes the value proposition that mobile payments are simpler.”

On a frigid January afternoon in Midtown Manhattan, just hours into my experiment, I found myself at 2 Bros., a red-tiled, fluorescent-lit pizza shop that operates with an aversion to frills. As I made my way past a row of stainless steel ovens, I watched the patrons in front of me grab their glistening slices while wordlessly forking over mangled bills, as has been our country’s custom for a century and a half. When my turn came to order, I croaked what was already my least-favorite phrase: “Do you, um, take Apple Pay?” The man behind the counter blinked four times before (wisely) declaring me a lost cause and moving to the next person in line.

This kind of bewildered rejection was fairly common. A change may be coming for money, but not everyone’s on board yet, and Yeager’s entirely correct that the “simple” value proposition hasn’t entirely come to pass. Paying with the wave of a phone, I found, pushes you toward extremes; to submit to the will of one of the major mobile wallets is to choose between big-box retailers and chain restaurants and small, niche luxury stores. The only business in my Brooklyn neighborhood that took Apple Pay or Android Pay was a cafe where a large iced coffee runs upwards of $5; globally, most of the businesses that have signed on as Apple Pay partners are large national chains like Jamba Juice, Pep Boys, Best Buy, and Macy’s.

Partially for this reason, the primary way most Americans are currently experiencing the great fintech boom isn’t through Apple or Android Pay at all, but through proprietary payment apps from chains such as Target, Walmart, and Starbucks — as of last October, an astonishing 1 in 5 of all Starbucks transactions in the U.S. were done through the company’s mobile app. It wouldn’t be all that hard to live a fully functional — if possibly boring — cash-free consumer life by tapping and swiping the proprietary apps of our nation’s biggest stores.

If that doesn’t feel revolutionary or particularly futuristic, it’s because it’s not really meant to. But the future of mobile retail is assuredly dystopian. Just ask Andy O’Dell, who works for Clutch, a marketing company that helps with consumer loyalty programs and deals with these kinds of mobile purchasing apps. “Apple Pay and the Starbucks payment app have nothing to do with actual payments,” he told me. “The power of payments and the future of these programs is in the data they generate.”

Imagine this future: Every day you go to Starbucks before work because it’s right near your house. You use the app, and to ensure your reliable patronage, Starbucks coughs up a loyalty reward, giving you a free cup of coffee every 15 visits. Great deal, you say! O’Dell disagrees. According to him, Starbucks is just hurting its margins by giving you something you’d already be buying. The real trick, he argued, is changing your behavior. He offers a new scenario where, this time, instead of a free coffee every 15 visits, you get a free danish — which you try, and then realize goes great with coffee. So you start buying a danish once a week, then maybe twice a week, until it starts to feel like it was your idea all along.

In that case, O’Dell said, Starbucks has “changed my behavior and captured more share of my wallet, and they’ve also given me more of what I want.”

“That’s terrifying,” I told him.

“But that’s the brave new world, man,” he shot back. “Moving payments from plastic swipes to digital taps is going to change how companies influence your behavior. That’s what you’re asking, right? Well, that’s how we’re doing it.”

In this sense, the payments rush is, in no small part, a data rush. Creating a wallet that’s just a digital version of the one you keep in your pocket is not the endgame. But figuring out where you shop, when you shop, and exactly what products you have an affinity for, and then bundling all that information in digestible chunks to inform the marketers of the world? Being able to, as O’Dell puts it, “drive you to the outcome they want you to have like a rat in a maze by understanding, down to your personality, who you are”? That’s disruption worth investing in.
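
[ed. O’Dell’s margin argument is easy to put in numbers. A toy model in Python; the prices, costs, and the size of the behavior change are all invented for illustration (nothing here comes from Starbucks or Clutch).]

COFFEE_PRICE, COFFEE_COST = 3.00, 0.60   # assumed per-cup price and cost
DANISH_PRICE, DANISH_COST = 2.50, 0.75   # assumed per-danish price and cost
VISITS = 15                              # one reward cycle

# Scheme A: free coffee on every 15th visit, behavior unchanged.
# The shop comps one coffee it would otherwise have sold.
profit_coffee = (VISITS - 1) * COFFEE_PRICE - VISITS * COFFEE_COST

# Scheme B: free danish instead, and the hooked customer buys,
# say, 4 more danishes over the cycle (the behavior change).
EXTRA_DANISHES = 4
profit_danish = (VISITS * COFFEE_PRICE - VISITS * COFFEE_COST
                 + EXTRA_DANISHES * (DANISH_PRICE - DANISH_COST)
                 - DANISH_COST)          # cost of the one free danish

print(f"Profit per cycle, free-coffee reward: ${profit_coffee:.2f}")  # $33.00
print(f"Profit per cycle, free-danish reward: ${profit_danish:.2f}")  # $42.25

Under these made-up numbers the danish scheme nets the shop about 28 percent more per cycle, which is the whole point: the reward that changes behavior beats the reward that merely subsidizes it.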

by Charlie Warzel, Buzzfeed | Read more:
Image: Katie Notopoulos / BuzzFeed News

Sunday, August 28, 2016

Toquinho & Gilberto Gil

Colin Kaepernick Is Righter Than You Know: The National Anthem Is a Celebration of Slavery

[ed. Personally, I vote for America the Beautiful.]

Before a preseason game on Friday, San Francisco 49ers quarterback Colin Kaepernick refused to stand for the playing of “The Star-Spangled Banner.” When he explained why, he only spoke about the present: “I am not going to stand up to show pride in a flag for a country that oppresses black people and people of color. … There are bodies in the street and people getting paid leave and getting away with murder.”

Twitter then went predictably nuts, with at least one 49ers fan burning Kaepernick’s jersey.

Almost no one seems to be aware that even if the U.S. were a perfect country today, it would be bizarre to expect African-American players to stand for “The Star-Spangled Banner.” Why? Because it literally celebrates the murder of African-Americans.

Few people know this because we only ever sing the first verse. But read the end of the third verse and you’ll see why “The Star-Spangled Banner” is not just a musical atrocity, it’s an intellectual and moral one, too:

No refuge could save the hireling and slave
From the terror of flight or the gloom of the grave,
And the star-spangled banner in triumph doth wave
O’er the land of the free and the home of the brave.

“The Star-Spangled Banner,” Americans hazily remember, was written by Francis Scott Key about the Battle of Fort McHenry in Baltimore during the War of 1812. But we don’t ever talk about how the War of 1812 was a war of aggression that began with an attempt by the U.S. to grab Canada from the British Empire.

However, we’d wildly overestimated the strength of the U.S. military. By the time of the Battle of Fort McHenry in 1814, the British had counterattacked and overrun Washington, D.C., setting fire to the White House.

And one of the key tactics behind the British military’s success was its active recruitment of American slaves. As a detailed 2014 article in Harper’s explains, the orders given to the Royal Navy’s Admiral Sir George Cockburn read:
Let the landings you make be more for the protection of the desertion of the Black Population than with a view to any other advantage. … The great point to be attained is the cordial Support of the Black population. With them properly armed & backed with 20,000 British Troops, Mr. Madison will be hurled from his throne.
Whole families found their way to the ships of the British, who accepted everyone and pledged no one would be given back to their “owners.” Adult men were trained to create a regiment called the Colonial Marines, who participated in many of the most important battles, including the August 1814 raid on Washington.

Then on the night of September 13, 1814, the British bombarded Fort McHenry. Key, seeing the fort’s flag the next morning, was inspired to write the lyrics for “The Star-Spangled Banner.”

So when Key penned “No refuge could save the hireling and slave / From the terror of flight or the gloom of the grave,” he was taking great satisfaction in the death of slaves who’d freed themselves. His perspective may have been affected by the fact he owned several slaves himself.

With that in mind, think again about the next two lines: “And the star-spangled banner in triumph doth wave / O’er the land of the free and the home of the brave.”

The reality is that there were human beings fighting for freedom with incredible bravery during the War of 1812. However, “The Star-Spangled Banner” glorifies America’s “triumph” over them — and then turns that reality completely upside down, transforming their killers into courageous freedom fighters.

by Jon Schwarz, The Intercept |  Read more:
Image: Peter Joneleit/Cal Sport Media/AP Images
“He tells it like it is.”

via:
[ed. Nice to see people put some effort into their work.]

Stadiums and Other Sacred Cows

[ed. See also: 5 Amazing Things About the Minnesota Vikings' New Stadium.]

There’s a strange sort of reverence that surrounds our relationship with sports. Jay Coakley first noticed it as a graduate student at the University of Notre Dame, in Indiana; he was studying sociology, so perhaps it was hard not to analyze the sport-centered culture that surrounded him. He observed the hype around football weekends, the mania of pep rallies, and the fundraising enthusiasm of booster clubs. He noticed that football players always seemed to have the nicest cars—and heard through his wife, who worked in the registrar’s office at the time, that transcripts were sometimes changed to keep players on the field.

He was so intrigued that he proposed doing his thesis and dissertation on the topic.

“My faculty advisor in the sociology department said, ‘Are you crazy? You have to focus on something serious, not sports,’ ” Coakley recalls. “I said, ‘How can anything be more serious than something that evokes almost 100 percent of the interest of 100 percent of the people on this campus for five to six weekends of the year at least?’ ”

Coakley ended up doing his dissertation on the racial and religious identities of black Catholic priests, and his master’s thesis on the racial violence seen around the country in 1968. Yet sport, laden as it was with many of the societal tensions he saw in his graduate work, continued to draw him back in. He proposed courses on sports and leisure; he conducted independent studies discussing what sports meant to various individuals; he worked with PTAs and parks and recreation departments; and he began focusing on coaching education. As the years passed, Coakley became one of the most respected authorities in the growing field of sports sociology—a much more serious field than his academic advisor might have ever expected.

Along the way, Coakley developed a theory that finally explained the strange behavior he had first seen at Notre Dame, and which he continued to see throughout the athletic world. He called it “The Great Sports Myth”: the widespread assumption that sport is, inherently, a force of good—despite the fact that it can both empower and humiliate, build bonds and destroy them, blur boundaries and marginalize.

Nautilus sat down with Coakley to talk about the unassailable mythos around sport, and the widespread impacts it can have on our society.

How did you come up with the idea of the Great Sports Myth?

I developed the Great Sports Myth when I was working here in Fort Collins, with a group that was opposing Colorado State University building a $220 million on-campus football stadium, the final cost of which—with just interest—would be over $400 million, and there will be cost overruns in addition to that. All when they have a stadium two miles from campus that needs some renovation, but nevertheless has been a decent place to play. I was working with a group that was opposing this—and by the way, 80 percent of the faculty opposed it, 65 percent of the students opposed it. But there were people who were talking about what this new stadium was going to do, and no matter what kinds of data you came up with to ask them to raise questions about their assumptions, they rejected the data. They rejected all the arguments.

It seemed to me that their position was grounded in something very much like religious faith. I was trying to figure out what was going on, and that’s when I came up with this notion of the Great Sports Myth, which they were using as a basis for rejecting facts, good studies, good logical arguments, and stating that: This is going to be good despite what anybody was saying in opposition.

What’s the historical context of this attitude?


It appears that sports became integrated into American society as a spinoff of what was going on in England. There was this sense that the sons of the elites in society needed something to make them into men, and sport was identified as the mechanism through which that could be done.

That idea was transferred to the United States, but in the United States the importance of sports was tied to a host of other factors as well: the need for productivity, the need to socialize and assimilate different immigrant groups, the need to create a military, the need to control young people running loose on the streets during the latter part of the 19th century. So what happened was that sports came to be identified as an important socializing mechanism for boys, a social control mechanism, and a developmental mechanism.

People became committed to sports because it was tied to their own interests as well. And it got put on a pedestal. We even revised Greek history to reify the purity and goodness of sports—talking about how sports were important for developmental purposes among the Greeks, and how they stopped wars to have the Olympic Games.

Sport then gets integrated into the schools in the United States, and all sorts of functions are attributed to it without our ever really examining whether those were valid or not. And so we’ve developed this sense that sport is beyond reproach. If there are any problems associated with sport, it has to be due to bad apples that are involved in it, who are somehow incorrigible enough that they can’t learn the lessons that sport teaches, so we have to get rid of them.

That fits into American culture as a whole, and our emphasis on individualism, personal choice and individual responsibility. (...)

How has this culture trickled down to other aspects of society?

Because sport is a source of excitement, pleasure, and joy, we are less likely to critique it. Sport has also served the interests of powerful people within our culture. It reifies competition and the whole notion of meritocracy, of distributing rewards to the winners, and that people who are successful deserve success. It becomes tied to all sorts of important factors within our culture.

There is this whole sense of the connection between sport and development, for example— both individual development and community development—that gets used by people who want to use sport to further their own interests. For example, by getting $500 million of public money for a stadium that they used to generate private profits.

by Brian J. Barth, Nautilus |  Read more:
Image: Gabriel Heusi / Brasil2016.gov.br/ Wikipedia

National Health Care Struggling

[ed. See also: Obamacare’s Faltering for One Simple Reason: Profit.]

With the hourglass running out for his administration, President Barack Obama's health care law is struggling in many parts of the country. Double-digit premium increases and exits by big-name insurers have caused some to wonder whether "Obamacare" will go down as a failed experiment.

If Democrat Hillary Clinton wins the White House, expect her to mount a rescue effort. But how much Clinton could do depends on finding willing partners in Congress and among Republican governors, a real political challenge.

"There are turbulent waters," said Kathleen Sebelius, Obama's first secretary of Health and Human Services. "But do I see this as a death knell? No."

Next year's health insurance sign-up season starts a week before the Nov. 8 election, and the previews have been brutal. Premiums are expected to go up sharply in many insurance marketplaces, which offer subsidized private coverage to people lacking access to job-based plans.

At the same time, retrenchment by insurers that have lost hundreds of millions of dollars means that more areas will become one-insurer markets, losing the benefits of competition. The consulting firm Avalere Health projects that seven states will only have one insurer in each of their marketplace regions next year.

Administration officials say insurers set prices too low in a bid to gain market share, and the correction is leading to sticker shock. Insurers blame the problems on sicker-than-expected customers, disappointing enrollment and a premium stabilization system that failed to work as advertised. They also say some people are gaming the system, taking advantage of guaranteed coverage to get medical care only when they are sick.

Not all state markets are in trouble. More important, most of the 11 million people covered through HealthCare.gov and its state-run counterparts will be cushioned from premium increases by government subsidies that rise with the cost.

But many customers may have to switch to less comprehensive plans to keep their monthly premiums down. And millions of people who buy individual policies outside the government marketplaces get no financial help. They will have to pay the full increases or go without coverage and risk fines. (People with employer coverage and Medicare are largely unaffected.)

Tennessee's insurance commissioner said recently that the individual health insurance market in her state is "very near collapse." Premiums for the biggest insurer are expected to increase by an average of 62 percent. Two competitors will post average increases of 46 percent and 44 percent.

But because the spigot of federal subsidies remains wide open, an implosion of health insurance markets around the country seems unlikely. More than 8 out of 10 HealthCare.gov customers get subsidies covering about 70 percent of their total premiums. Instead, the damage is likely to be gradual. Rising premiums deter healthy people from signing up, leaving an insurance pool that's more expensive to cover each succeeding year.

by Ricardo Alonso-Zaldivar, AP |  Read more:
Image: via:

Saturday, August 27, 2016


[ed. Ok, I've changed my mind about Trump (for World President). Warning! - you can't unwatch this.]

The World Wide Cage

I’d taken up blogging early in 2005, just as it seemed everyone was talking about ‘the blogosphere’. I’d discovered, after a little digging on the domain registrar GoDaddy, that ‘roughtype.com’ was still available (an uncharacteristic oversight by pornographers), so I called my blog Rough Type. The name seemed to fit the provisional, serve-it-raw quality of online writing at the time.

Blogging has since been subsumed into journalism – it’s lost its personality – but back then it did feel like something new in the world, a literary frontier. The collectivist claptrap about ‘conversational media’ and ‘hive minds’ that came to surround the blogosphere missed the point. Blogs were crankily personal productions. They were diaries written in public, running commentaries on whatever the writer happened to be reading or watching or thinking about at the moment. As Andrew Sullivan, one of the form’s pioneers, put it: ‘You just say what the hell you want.’ The style suited the jitteriness of the web, that needy, oceanic churning. A blog was critical impressionism, or impressionistic criticism, and it had the immediacy of an argument in a bar. You hit the Publish button, and your post was out there on the world wide web, for everyone to see.

Or to ignore. Rough Type’s early readership was trifling, which, in retrospect, was a blessing. I started blogging without knowing what the hell I wanted to say. I was a mumbler in a loud bazaar. Then, in the summer of 2005, Web 2.0 arrived. The commercial internet, comatose since the dot-com crash of 2000, was up on its feet, wide-eyed and hungry. Sites such as MySpace, Flickr, LinkedIn and the recently launched Facebook were pulling money back into Silicon Valley. Nerds were getting rich again. But the fledgling social networks, together with the rapidly inflating blogosphere and the endlessly discussed Wikipedia, seemed to herald something bigger than another gold rush. They were, if you could trust the hype, the vanguard of a democratic revolution in media and communication – a revolution that would change society forever. A new age was dawning, with a sunrise worthy of the Hudson River School. (...)

The millenarian rhetoric swelled with the arrival of Web 2.0. ‘Behold,’ proclaimed Wired in an August 2005 cover story: we are entering a ‘new world’, powered not by God’s grace but by the web’s ‘electricity of participation’. It would be a paradise of our own making, ‘manufactured by users’. History’s databases would be erased, humankind rebooted. ‘You and I are alive at this moment.’

The revelation continues to this day, the technological paradise forever glittering on the horizon. Even money men have taken sidelines in starry-eyed futurism. In 2014, the venture capitalist Marc Andreessen sent out a rhapsodic series of tweets – he called it a ‘tweetstorm’ – announcing that computers and robots were about to liberate us all from ‘physical need constraints’. Echoing Etzler (and Karl Marx), he declared that ‘for the first time in history’ humankind would be able to express its full and true nature: ‘we will be whoever we want to be.’ And: ‘The main fields of human endeavour will be culture, arts, sciences, creativity, philosophy, experimentation, exploration, adventure.’ The only thing he left out was the vegetables.

Such prophecies might be dismissed as the prattle of overindulged rich guys, but for one thing: they’ve shaped public opinion. By spreading a utopian view of technology, a view that defines progress as essentially technological, they’ve encouraged people to switch off their critical faculties and give Silicon Valley entrepreneurs and financiers free rein in remaking culture to fit their commercial interests. If, after all, the technologists are creating a world of superabundance, a world without work or want, their interests must be indistinguishable from society’s. To stand in their way, or even to question their motives and tactics, would be self-defeating. It would serve only to delay the wonderful inevitable.

The Silicon Valley line has been given an academic imprimatur by theorists from universities and think tanks. Intellectuals spanning the political spectrum, from Randian right to Marxian left, have portrayed the computer network as a technology of emancipation. The virtual world, they argue, provides an escape from repressive social, corporate and governmental constraints; it frees people to exercise their volition and creativity unfettered, whether as entrepreneurs seeking riches in the marketplace or as volunteers engaged in ‘social production’ outside the marketplace. As the Harvard law professor Yochai Benkler wrote in his influential book The Wealth of Networks (2006):
This new freedom holds great practical promise: as a dimension of individual freedom; as a platform for better democratic participation; as a medium to foster a more critical and self-reflective culture; and, in an increasingly information-dependent global economy, as a mechanism to achieve improvements in human development everywhere.
Calling it a revolution, he said, is no exaggeration.

Benkler and his cohort had good intentions, but their assumptions were bad. They put too much stock in the early history of the web, when the system’s commercial and social structures were inchoate, its users a skewed sample of the population. They failed to appreciate how the network would funnel the energies of the people into a centrally administered, tightly monitored information system organised to enrich a small group of businesses and their owners.

The network would indeed generate a lot of wealth, but it would be wealth of the Adam Smith sort – and it would be concentrated in a few hands, not widely spread. The culture that emerged on the network, and that now extends deep into our lives and psyches, is characterised by frenetic production and consumption – smartphones have made media machines of us all – but little real empowerment and even less reflectiveness. It’s a culture of distraction and dependency. That’s not to deny the benefits of having easy access to an efficient, universal system of information exchange. It is to deny the mythology that shrouds the system. And it is to deny the assumption that the system, in order to provide its benefits, had to take its present form.

Late in his life, the economist John Kenneth Galbraith coined the term ‘innocent fraud’. He used it to describe a lie or a half-truth that, because it suits the needs or views of those in power, is presented as fact. After much repetition, the fiction becomes common wisdom. ‘It is innocent because most who employ it are without conscious guilt,’ Galbraith wrote in 1999. ‘It is fraud because it is quietly in the service of special interest.’ The idea of the computer network as an engine of liberation is an innocent fraud.

I love a good gizmo. When, as a teenager, I sat down at a computer for the first time – a bulging, monochromatic terminal connected to a two-ton mainframe processor – I was wonderstruck. As soon as affordable PCs came along, I surrounded myself with beige boxes, floppy disks and what used to be called ‘peripherals’. A computer, I found, was a tool of many uses but also a puzzle of many mysteries. The more time you spent figuring out how it worked, learning its language and logic, probing its limits, the more possibilities it opened. Like the best of tools, it invited and rewarded curiosity. And it was fun, head crashes and fatal errors notwithstanding.

In the early 1990s, I launched a browser for the first time and watched the gates of the web open. I was enthralled – so much territory, so few rules. But it didn’t take long for the carpetbaggers to arrive. The territory began to be subdivided, strip-malled and, as the monetary value of its data banks grew, strip-mined. My excitement remained, but it was tempered by wariness. I sensed that foreign agents were slipping into my computer through its connection to the web. What had been a tool under my own control was morphing into a medium under the control of others. The computer screen was becoming, as all mass media tend to become, an environment, a surrounding, an enclosure, at worst a cage. It seemed clear that those who controlled the omnipresent screen would, if given their way, control culture as well.

by Nicholas Carr, Aeon |  Read more:
Image: Albert Gea/Reuters

Radical Flâneuserie

I started noticing the ads in the magazines I read. Here is a woman in an asymmetrical black swimsuit, a semitransparent palm tree superimposed on her head, a pink pole behind her. Here is a woman lying down, miraculously balanced on some kind of balustrade, in a white button-down, khaki skirt, and sandals, the same dynamic play of light and palm trees and buildings around her. In the top-right corner, the words Dans l’oeil du flâneur—“in the eye of the flâneur”—and beneath, the Hermès logo. The flâneur through whose “eye” we’re seeing seems to live in Miami. Not a well-known walking city, but why not—surely flânerie needn’t be confined to melancholic European capitals.

The theme was set by Hermès’s artistic director, Pierre-Alexis Dumas. While the media coverage of the campaign and the traveling exhibition that complemented it breathlessly adopted the term, Dumas gave a pretty illuminating definition of it. Flânerie, he explained, is not about “being idle” or “doing nothing.” It’s an “attitude of curiosity … about exploring everything.” It flourished in the nineteenth century, he continued, as a form of resistance to industrialization and the rationalization of everyday life, and “the roots of the spirit of Hermès are in nineteenth-century Flânerie.” This is pretty radical rhetoric for the director of a luxury-goods company with €4.1 billion in yearly revenue. Looking at the ads, as well as the merchandise—including an eight-speed bicycle called “The Flâneur” that retailed for $11.3k—it seems someone at Hermès didn’t share, or understand, Dumas’s vision.

There’s something so attractive about wandering aimlessly through the city, taking it all in (especially if we’re wearing Hermès while we do it). We all, deep down, want to detach from our lives. The flâneur, since everyone wants to be one, has a long history of being many different things to different people, to such an extent that the concept has become one of these things we point to without really knowing what we mean—a kind of shorthand for urban, intellectual, curious, cosmopolitan. This is what Hermès is counting on: that we will associate Hermès products with those values and come to believe that buying them will reinforce those aspects of ourselves.

The earliest mention of a flâneur is in the late sixteenth century, possibly borrowed from the Scandinavian flana, “a person who wanders.” It fell largely out of use until the nineteenth century, and then it caught on again. In 1806, an anonymous pamphleteer wrote of the flâneur as “M. Bonhomme,” a man-about-town who comes from sufficient wealth to be able to have the time to wander the city at will, taking in the urban spectacle. He hangs out in cafés and watches the various inhabitants of the city at work and at play. He is interested in gossip and fashion, but not particularly in women. In an 1829 dictionary, a flâneur is someone “who likes to do nothing,” someone who relishes idleness. Balzac’s flâneur took two main forms: the common flâneur, happy to aimlessly wander the streets, and the artist-flâneur, who poured his experiences in the city into his work. (This was the more miserable type of flâneur, who, Balzac noted in his 1837 novel César Birotteau, “is just as frequently a desperate man as an idle one.”) Baudelaire similarly believed that the ultimate flâneur, the true connoisseur of the city, was an artist who “sang of the sorry dog, the poor dog, the homeless dog, the wandering dog [le chien flâneur].” Walter Benjamin’s flâneur, on the other hand, was more feral, a figure who “completely distances himself from the type of the philosophical promenader, and takes on the features of the werewolf restlessly roaming a social wildness,” he wrote in the late 1930s. An “intoxication” comes over him as he walks “long and aimlessly through the streets.”

And so the flâneur shape-shifts according to time, place, and agenda. If he didn’t exist, we would have had to invent him to embody our fantasies about nineteenth-century Paris—or about ourselves, today.

Hermès is similarly ambiguous about who, exactly, the flâneur in their ads is. Is he the man (or woman?) looking at the woman on the balustrade? Or is she the flâneur, too? Is the flâneur the photographer, or the (male?) gaze he represents? Is there a flâneuse, in Hermès’ version? Are we looking at her? Are we—am I, holding the magazine—her?

But I can’t be, because I’m the woman holding the magazine, being asked to buy Hermès products. I click through the pictures of the exhibition Hermès organized on the banks of the Seine, Wanderland, and one of the curiosities on view—joining nineteenth-century canes, an array of ties, an Hermès purse handcuffed to a coatrack—is an image of an androgynous person crossing the road, holding a stack of boxes so high he or she can’t see around them. Is this flânerie, Hermès-style?

Many critics over the years have argued that shopping was at odds with the idle strolling of the flâneur: he walked the arcades, the glass-roofed shopping streets that were the precursor to the department store, but he did not shop. Priscilla Parkhurst Ferguson, writing on the flâneur in her book Paris as Revolution, argues that women could not flâner because women who were shopping in the grands magasins were caught in an economy of spectacle, being tricked into buying things, and having their desires stimulated. By contrast the flâneur’s very raison d’être was having no reason whatsoever.

Before the twentieth century, women did not have the freedom to wander idly through the streets of Paris. The only women with the freedom to circulate (and a limited freedom at that) were the streetwalkers and ragpickers; Baudelaire’s mysterious and alluring passante, immortalized in his poem “To a (Female) Passer-by,” is assumed to have been a woman of the night. Even the word flâneuse doesn’t technically exist in French, except, according to an 1877 dictionary entry, to designate a kind of lounge chair. (So Hermès’s woman reclining on a balustrade was right on the money, for the late nineteenth century.)

But why must the flâneuse be restricted to being a female version of a male concept, especially when no one can agree on what the flâneur is anyway? Why not look at what women were actually doing on the city streets? What could the flâneuse look like then?

by Lauren Elkin, Paris Review | Read more:
Image: John Singer Sargent, A Street in Venice

Thursday, August 25, 2016

U.S. Army Fudged Its Accounts By Trillions of Dollars

[ed. That's Trillions, with a capital "T". Thousands of billions. And this is just the Army. What about the rest of our military services? If you keep throwing insane amounts of money at government agencies with almost zero accountability, what would you expect?]

The United States Army’s finances are so jumbled it had to make trillions of dollars of improper accounting adjustments to create an illusion that its books are balanced.

The Defense Department’s Inspector General, in a June report, said the Army made $2.8 trillion in wrongful adjustments to accounting entries in one quarter alone in 2015, and $6.5 trillion for the year. Yet the Army lacked receipts and invoices to support those numbers or simply made them up.

As a result, the Army’s financial statements for 2015 were “materially misstated,” the report concluded. The “forced” adjustments rendered the statements useless because “DoD and Army managers could not rely on the data in their accounting systems when making management and resource decisions.”

Disclosure of the Army’s manipulation of numbers is the latest example of the severe accounting problems plaguing the Defense Department for decades.

The report affirms a 2013 Reuters series revealing how the Defense Department falsified accounting on a large scale as it scrambled to close its books. As a result, there has been no way to know how the Defense Department – far and away the biggest chunk of Congress’ annual budget – spends the public’s money.

The new report focused on the Army’s General Fund, the bigger of its two main accounts, with assets of $282.6 billion in 2015. The Army lost or didn’t keep required data, and much of the data it had was inaccurate, the IG said.

“Where is the money going? Nobody knows,” said Franklin Spinney, a retired military analyst for the Pentagon and critic of Defense Department planning.

The significance of the accounting problem goes beyond mere concern for balancing books, Spinney said. Both presidential candidates have called for increasing defense spending amid current global tension.

An accurate accounting could reveal deeper problems in how the Defense Department spends its money. Its 2016 budget is $573 billion, more than half of the annual budget appropriated by Congress.

The Army account’s errors will likely carry consequences for the entire Defense Department.

by Scot J. Paltrow, Reuters |  Read more:
Image: US Army, PD; US Money, Geralt, PD via

Nels Cline & Julian Lage

Inside Facebook’s (Totally Insane, Unintentionally Gigantic, Hyperpartisan) Political-Media Machine

[ed. If this is the Facebook experience these days (I'm not a user) then it must be a truly depressing/irritating place to visit.]

Open your Facebook feed. What do you see? A photo of a close friend’s child. An automatically generated slide show commemorating six years of friendship between two acquaintances. An eerily on-target ad for something you’ve been meaning to buy. A funny video. A sad video. A recently live video. Lots of video; more video than you remember from before. A somewhat less-on-target ad. Someone you saw yesterday feeling blessed. Someone you haven’t seen in 10 years feeling worried.

And then: A family member who loves politics asking, “Is this really who we want to be president?” A co-worker, whom you’ve never heard talk about politics, asking the same about a different candidate. A story about Donald Trump that “just can’t be true” in a figurative sense. A story about Donald Trump that “just can’t be true” in a literal sense. A video of Bernie Sanders speaking, overlaid with text, shared from a source you’ve never seen before, viewed 15 million times. An article questioning Hillary Clinton’s honesty; a headline questioning Donald Trump’s sanity. A few shares that go a bit too far: headlines you would never pass along yourself but that you might tap, read and probably not forget.

Maybe you’ve noticed your feed becoming bluer; maybe you’ve felt it becoming redder. Either way, in the last year, it has almost certainly become more intense. You’ve seen a lot of media sources you don’t recognize and a lot of posts bearing no memorable brand at all. You’ve seen politicians and celebrities and corporations weigh in directly; you’ve probably seen posts from the candidates themselves. You’ve seen people you’re close to and people you’re not, with increasing levels of urgency, declare it is now time to speak up, to take a stand, to set aside allegiances or hangups or political correctness or hate.

Facebook, in the years leading up to this election, hasn’t just become nearly ubiquitous among American internet users; it has centralized online news consumption in an unprecedented way. According to the company, its site is used by more than 200 million people in the United States each month, out of a total population of 320 million. A 2016 Pew study found that 44 percent of Americans read or watch news on Facebook. These are approximate exterior dimensions and can tell us only so much. But we can know, based on these facts alone, that Facebook is hosting a huge portion of the political conversation in America.

The Facebook product, to users in 2016, is familiar yet subtly expansive. Its algorithms have their pick of text, photos and video produced and posted by established media organizations large and small, local and national, openly partisan or nominally unbiased. But there’s also a new and distinctive sort of operation that has become hard to miss: political news and advocacy pages made specifically for Facebook, uniquely positioned and cleverly engineered to reach audiences exclusively in the context of the news feed. These are news sources that essentially do not exist outside of Facebook, and you’ve probably never heard of them. They have names like Occupy Democrats; The Angry Patriot; US Chronicle; Addicting Info; RightAlerts; Being Liberal; Opposing Views; Fed-Up Americans; American News; and hundreds more. Some of these pages have millions of followers; many have hundreds of thousands.

Using a tool called CrowdTangle, which tracks engagement for Facebook pages across the network, you can see which pages are most shared, liked and commented on, and which pages dominate the conversation around election topics. Using this data, I was able to speak to a wide array of the activists and entrepreneurs, advocates and opportunists, reporters and hobbyists who together make up 2016’s most disruptive, and least understood, force in media.
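
[ed. The article doesn’t show CrowdTangle itself, but the kind of ranking it describes is a simple aggregation. A Python sketch over hypothetical exported post data; the field names and figures below are assumptions for illustration, not CrowdTangle’s actual schema or API.]

from collections import defaultdict

# Hypothetical engagement records for a few of the pages named above;
# the numbers are made up for illustration.
posts = [
    {"page": "Occupy Democrats",  "shares": 41000, "likes": 98000, "comments": 7200},
    {"page": "The Angry Patriot", "shares": 26000, "likes": 54000, "comments": 9100},
    {"page": "Being Liberal",     "shares": 18000, "likes": 61000, "comments": 3300},
]

# Total engagement per page: shares + likes + comments.
engagement = defaultdict(int)
for post in posts:
    engagement[post["page"]] += post["shares"] + post["likes"] + post["comments"]

# List the pages that dominate the conversation, most-engaged first.
for page, total in sorted(engagement.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {total:,}")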

Individually, these pages have meaningful audiences, but cumulatively, their audience is gigantic: tens of millions of people. On Facebook, they rival the reach of their better-funded counterparts in the political media, whether corporate giants like CNN or The New York Times, or openly ideological web operations like Breitbart or Mic. And unlike traditional media organizations, which have spent years trying to figure out how to lure readers out of the Facebook ecosystem and onto their sites, these new publishers are happy to live inside the world that Facebook has created. Their pages are accommodated but not actively courted by the company and are not a major part of its public messaging about media. But they are, perhaps, the purest expression of Facebook’s design and of the incentives coded into its algorithm — a system that has already reshaped the web and has now inherited, for better or for worse, a great deal of America’s political discourse. (...)

This year, political content has become more popular all across the platform: on homegrown Facebook pages, through media companies with a growing Facebook presence and through the sharing habits of users in general. But truly Facebook-native political pages have begun to create and refine a new approach to political news: cherry-picking and reconstituting the most effective tactics and tropes from activism, advocacy and journalism into a potent new mixture. This strange new class of media organization slots seamlessly into the news feed and is especially notable in what it asks, or doesn’t ask, of its readers. The point is not to get them to click on more stories or to engage further with a brand. The point is to get them to share the post that’s right in front of them. Everything else is secondary.

While web publishers have struggled to figure out how to take advantage of Facebook’s audience, these pages have thrived. Unburdened of any allegiance to old forms of news media and the practice, or performance, of any sort of ideological balance, native Facebook page publishers have a freedom that more traditional publishers don’t: to engage with Facebook purely on its terms. These are professional Facebook users straining to build media companies, in other words, not the other way around.

From a user’s point of view, every share, like or comment is both an act of speech and an accretive piece of a public identity. Maybe some people want to be identified among their networks as news junkies, news curators or as some sort of objective and well-informed reader. Many more people simply want to share specific beliefs, to tell people what they think or, just as important, what they don’t. A newspaper-style story or a dry, matter-of-fact headline is adequate for this purpose. But even better is a headline, or meme, that skips straight to an ideological conclusion or rebuts an argument.

by John Herrman, NY Times |  Read more:
Image: Pablo Delcan

How Does the Language of Headlines Work? The Answer May Surprise You

"Headless Body Found in Topless Bar” (New York Post)
“Super Caley Go Ballistic Celtic Are Atrocious” (The Sun)
“Nature Sends Her Egrets” (San Jose Mercury)
“Why Doesn’t America Read Anymore?” (NPR)

Consider the headline: a bunch of words carefully crafted to grab your attention when you least expect it… and then entice you to spread it far and wide, sometimes in spectacular viral fashion. And that’s just for starters. Before you even get to all the news that’s fit to print, the headline is already way ahead of you, with succinct and surprising spoilers—that can only really be understood if you click. By the time you read a headline, you may already have become incensed by provocative questions, been amused by puns and wordplay or have had your faith restored in humanity by viral clickbait.

In an online age where attention spans are worn thin by information overload, these are remarkable feats for a bunch of words, yet headlines get little respect around here. From titillating tabloid titles to clickbait chicanery, headlines these days have often been derided as the empty calories of information, sensationalist trickery, “the art of exaggerating without actually lying” as Otto Friedrich put it.

Why do we even pay so much attention to headlines, when millions are made up and forgotten every day? Some are more memorable than others (particularly when clever wordplay is involved), yet good or bad, there is a common language of headlines. Visual placement aside, there’s a long history to how humble copy editors have developed the weird linguistic tricks that intrigue, shock, and amuse an otherwise cynical audience. What you’ll learn may surprise you (or not). (...)

By the end of the 19th century, editors had started playing around with the language of headlines, switching over to the present tense, even for events past, and promoting verbs, making action seem more immediate and palpable. A headline like “FIRES GEN. M’ARTHUR” (Chicago Tribune, 1951) fires on both of these cylinders, making the verb more prominent by removing the guy doing the firing altogether (a no name, no doubt), and making the action happen right NOW, as you’re reading it. Hurry and read, before the action ends!

Art form or not, short and sweet titles are often hard to figure out. Take a confusing example like “Dead Baby Names Racket,” which could be read a few ways, one much odder than the other (though who are we to question what dead babies like to name in their spare time?). It holds your attention because you read… and then have to reread. Short form headlines assume a lot of reader knowledge (who is Beatle John Lennon for instance?) to be understood, placing readers even closer to the news. We can see that headlines certainly don’t use language the way we might expect but their weird telegraphic forms still seem to condense and convey all the necessary information to readers. Headlines are often better understood and appreciated once stories are read, yet their powers of attraction are very much hinged on readers being able to anticipate what the headlines are referring to and develop some kind of emotion with respect to a story they haven’t even read yet. Once readers are curious enough to be led down the garden path and click/read, the headlines have won. How do they do this?

Linguist Deborah Schaffer shows that in lieu of real news or a respectable reputation, tabloids often make liberal use of “headlinese” to sensationalize stories. This means using an expressive, “connotation-rich” vocabulary that is attention-grabbing and promotes curiosity and a strong emotional connection for the reader, unsurprisingly, similar to advertising language, since “the average newspaper is simply a business enterprise that sells news and uses that lure to sell advertising space” (Otto Friedrich). Words like “sex,” “scandal,” “sizzling,” and “weird” can be used to sell anything, even if they’re unlikely stories like “Surgeon, 70, Makes 11 Nurses Pregnant,” “Marie Osmond puts her 5-yr-old son to work—and church is outraged,” “Lonely UFO Aliens Are Stealing Our Pets,” and “Michael J. Fox Outrages Hotel Guests During His Bizarre Island Honeymoon.” Readers are brought closer to the news in tabloid journalism, given a personal interest, asked to feel sympathy for the “heartbroken” in “tragic” circumstances, who may often be people we know intimately, on a first name or nickname basis—such as celebrities, our dear friends: “Test-Tube Baby for Burt & Loni: Friends Say It’s in the Works.”

You might also notice one weird trick (or several) in these tabloid headlines: the move toward longer, conversational sentences that often contain “pseudo-quotes,” and the emphasis on emotion and curiosity. These conventions have been picked up by another type of headline.

Enter clickbait, the scourge of the internet. No one likes clickbait, and for good reason, but it’s surprisingly effective at generating viral interest, fast, which is exactly what news publications like. You’ll be outraged at the five or six different ways clickbait actually resembles regular headlines, and vice versa, and it will change everything. Until it doesn’t.

Because of its propensity to generate clicks on poor quality, empty, or fake content, clickbait has been seen as a kind of headline spam, so much so that Facebook plans to save your clicks by detecting the clickbait patterns that outrage us, and banning it forever. (...)
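
[ed. Facebook hasn’t said exactly how its clickbait detection works; purely to show how formulaic the patterns are, here is a naive matcher in Python over phrases mentioned in this piece. The pattern list and the scoring are illustrative, not Facebook’s method.]

import re

# Stock clickbait phrases drawn from the examples discussed above.
CLICKBAIT_PATTERNS = [
    r"\byou won[’']?t believe\b",
    r"\bwill (surprise|shock) you\b",
    r"\byou may never have heard of\b",
    r"\bone weird trick\b",
    r"\bwhat happened next\b",
    r"^\d+ (ways|things|reasons)\b",
]

def clickbait_score(headline: str) -> int:
    """Count how many stock clickbait patterns a headline matches."""
    text = headline.lower()
    return sum(bool(re.search(pattern, text)) for pattern in CLICKBAIT_PATTERNS)

# One clickbait-era NPR headline, one classic tabloid headline.
print(clickbait_score("The World's Most Trafficked Mammal Is One You May Never Have Heard Of"))  # 1
print(clickbait_score("Headless Body Found in Topless Bar"))                                     # 0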

Meanwhile, it appears something must be working with the actual linguistics of clickbait, because more and more mainstream news publications have been observed using clickbait styles, such as NPR’s “The World’s Most Trafficked Mammal Is One You May Never Have Heard Of.” Readers do click on these headlines, and when they find real or satisfying information, they might even engage and share the news. When used on actual news content, it seems that clickbait headline conventions can be a powerful force for the news. At least until Facebook bans NPR and others who make use of these linguistic patterns, that if anything, appear to be part of the natural evolution of headlines.

So what is it about clickbait language that’s so special, given that it looks quite like the tabloid headlines of old? Clickbait headlines have a few different ways to grab your attention. Like tabloids, clickbait titles have become more conversational and less telegraphic, which promotes the human, emotional angle. Where a tabloid headline might reveal “…and church is outraged” or “Michael J. Fox Outrages…”, in a clickbait headline the experiencer of emotions becomes us, which is as close to the news as you can get: “you won’t believe what happened next,” “…you may never have heard of,” “… will surprise you…” The news builds a relationship directly with the reader, by anticipating how we might feel or what we might know about a situation and giving us a personal stake in the story.

by Chi Luu, JSTOR |  Read more:
Image: iStock