Sunday, May 29, 2016

Book Review: Age of Em

So, what is the Age of Em?

According to Hanson, AI is really hard and won’t be invented in time to shape the posthuman future. But sometime a century or so from now, scanning technology, neuroscience, and computer hardware will advance enough to allow emulated humans, or “ems”. Take somebody’s brain, scan it on a microscopic level, and use this information to simulate it neuron-by-neuron on a computer. A good enough simulation will map inputs to outputs in exactly the same way as the brain itself, effectively uploading the person to a computer. Uploaded humans will be much the same as biological humans. Given suitable sense-organs, effectors, virtual avatars, or even robot bodies, they can think, talk, work, play, love, and build in much the same way as their “parent”. But ems have three very important differences from biological humans.

First, they have no natural body. They will never need food or water; they will never get sick or die. They can live entirely in virtual worlds in which any luxuries they want – luxurious penthouses, gluttonous feasts, Ferraris – can be conjured out of nothing. They will have some limited ability to transcend space, talking to other ems’ virtual presences in much the same way two people in different countries can talk on the Internet.

Second, they can run at different speeds. While a normal human brain is stuck running at the speed physics allows, a computer simulating a brain can simulate it faster or slower depending on preference and hardware availability. With enough parallel hardware, an em could experience a subjective century in an objective week. Alternatively, if an em wanted to save hardware, it could process all its mental operations v e r y s l o w l y and experience only a subjective week every objective century.
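[ed. Rough arithmetic of my own, not a figure from the book: a subjective century per objective week works out to a speedup factor of roughly five thousand,

\[
\frac{100\ \text{years}}{1\ \text{week}} \;=\; \frac{100 \times 52.18\ \text{weeks}}{1\ \text{week}} \;\approx\; 5{,}200,
\]

and the slowed-down em in the second case runs at about 1/5,200 of real time.]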

Third, just like other computer data, ems can be copied, cut, and pasted. One uploaded copy of Robin Hanson, plus enough free hardware, can become a thousand uploaded copies of Robin Hanson, each living in their own virtual world and doing different things. The copies could even converse with each other, check each other’s work, duel to the death, or – yes – have sex with each other. And if having a thousand Robin Hansons proves too much, a quick ctrl-x and you can delete any redundant ems to free up hard disk space for Civilization 6 (coming out this October!).

Would this count as murder? Hanson predicts that ems will have unusually blasé attitudes toward copy-deletion. If there are a thousand other copies of me in the world, then going to sleep and not waking up just feels like delegating back to a different version of me. If you’re still not convinced, Hanson’s essay Is Forgotten Party Death? is a typically disquieting analysis of this proposition. But whether it’s true or not is almost irrelevant – at least some ems will think this way, and they will be the ones who tend to volunteer to be copied for short-term tasks that require termination of the copy afterwards. If you personally aren’t interested in participating, the economy will leave you behind.

The ability to copy ems as many times as needed fundamentally changes the economy and the idea of economic growth. Imagine Google has a thousand positions for Ruby programmers. Instead of finding a thousand workers, they can find one very smart and very hard-working person and copy her a thousand times. With unlimited available labor supply, wages plummet to subsistence levels. “Subsistence levels” for ems are the bare minimum it takes to rent enough hardware from Amazon Cloud to run an em. The overwhelming majority of ems will exist at such subsistence levels. On the one hand, if you’ve got to exist on a subsistence level, a virtual world where all luxuries can be conjured from thin air is a pretty good place to do it. On the other, such starvation wages might leave ems with little or no leisure time.

Sort of. This gets weird. There’s an urban legend about a “test for psychopaths”. You tell someone a story about a man who attends his mother’s funeral. He meets a really pretty girl there and falls in love, but neglects to get her contact details before she disappears. How might he meet her again? If they answer “kill his father, she’ll probably come to that funeral too”, they’re a psychopath – ordinary people would have a mental block that prevents them from even considering such a drastic solution. And I bring this up because after reading Age of Em I feel like Robin Hanson would be able to come up with some super-solution even the psychopaths can’t think of, some plan that gets the man a threesome with the girl and her even hotter twin sister at the cost of wiping out an entire continent. Everything about labor relations in Age of Em is like this. (...)

There are a lot of similarities between Hanson’s futurology and (my possibly erroneous interpretation of) the futurology of Nick Land. I see Land as saying, like Hanson, that the future will be one of quickly accelerating economic activity that comes to dominate a bigger and bigger portion of our descendants’ lives. But whereas Hanson’s framing focuses on the participants in such economic activity, playing up their resemblances with modern humans, Land takes a bigger-picture view. He talks about the economy itself acquiring a sort of self-awareness or agency, so that the destiny of civilization is consumed by the imperative of economic growth.

Imagine a company that manufactures batteries for electric cars. The inventor of the batteries might be a scientist who really believes in the power of technology to improve the human race. The workers who help build the batteries might just be trying to earn money to support their families. The CEO might be running the business because he wants to buy a really big yacht. And the whole thing is there to eventually, somewhere down the line, let a suburban mom buy a car to take her kid to soccer practice. Like most companies, the battery-making company is primarily a profit-making operation, but the profit-making-ness draws on a lot of not-purely-economic actors and their not-purely-economic subgoals.

Now imagine the company fires all its employees and replaces them with robots. It fires the inventor and replaces him with a genetic algorithm that optimizes battery design. It fires the CEO and replaces him with a superintelligent business-running algorithm. All of these are good decisions from a profitability perspective. We can absolutely imagine a profit-driven shareholder-value-maximizing company doing all these things. But it reduces the company’s non-masturbatory participation in an economy that points outside itself, limiting it to just a tenuous connection with soccer moms and maybe some shareholders who want yachts of their own.

Now take it further. Imagine there are no human shareholders who want yachts, just banks who lend the company money in order to increase their own value. And imagine there are no soccer moms anymore; the company makes batteries for the trucks that ship raw materials from place to place. Every non-economic goal has been stripped away from the company; it’s just an appendage of Global Development.

Now take it even further, and imagine this is what’s happened everywhere. There are no humans left; it isn’t economically efficient to continue having humans. Algorithm-run banks lend money to algorithm-run companies that produce goods for other algorithm-run companies and so on ad infinitum. Such a masturbatory economy would have all the signs of economic growth we have today. It could build itself new mines to create raw materials, construct new roads and railways to transport them, build huge factories to manufacture them into robots, then sell the robots to whatever companies need more robot workers. It might even eventually invent space travel to reach new worlds full of raw materials. Maybe it would develop powerful militaries to conquer alien worlds and steal their technological secrets that could increase efficiency. It would be vast, incredibly efficient, and utterly pointless. The real-life incarnation of those strategy games where you mine Resources to build new Weapons to conquer new Territories from which you mine more Resources and so on forever.

But this seems to me the natural end of the economic system. Right now it needs humans only as laborers, investors, and consumers. But robot laborers are potentially more efficient, companies based around algorithmic trading are already pushing out human investors, and most consumers already aren’t individuals – they’re companies and governments and organizations. At each step you can gain efficiency by eliminating humans, until finally humans aren’t involved anywhere.

True to form, Land doesn’t see this as a dystopia – I think he conflates “maximally efficient economy” with “God”, which is a hell of a thing to conflate – but I do. And I think it provides an important new lens with which to look at the Age of Em.

by Scott Alexander, Slate Star Codex |  Read more:
Image: Age of Em

The Fall of Salon.com

[ed. Reminds me of this piece of advice to young people: "But of course things did have to be this way; our Internet’s strange success contained within it the seeds of its destruction. Once people realize there’s money to be made in something, anything that was once good about it is not long for that world. This is probably where I should put a GIF of a character from a ’90s Nickelodeon show looking sad with the acronym LOL superimposed on it but I like to think I have conveyed that same sentiment using words."]

A Facebook page dedicated to celebrating the 20th anniversary of digital media pioneer Salon is functioning as a crowdsourced eulogy.

Dozens of Salon alumni have, over the past several months, posted their favorite stories from and memories of the once-beloved liberal news site described as a “left-coast, interactive version of The New Yorker,” a progressive powerhouse that over the years covered politics with a refreshing aggressiveness, in a context that left plenty of room for provocative personal essays and award-winning literary criticism.

“We were inmates who took over the journalistic asylum,” David Talbot, who founded the site in 1995, wrote on the Facebook page. “And we let it rip — we helped create online journalism, making it up as we went along. And we let nobody — investors, advertisers, the jealous media establishment, mad bombers, etc — get in our way.”

They are mourning a publication they barely recognize today.

“Sadly, Salon doesn’t really exist anymore,” wrote Laura Miller, one of Salon’s founding editors, who left the site for Slate last fall. “The name is still being used, but the real Salon is gone.”

Salon, which Talbot originally conceived of as a “smart tabloid,” began as a liberal online magazine and was quickly seen as an embodiment of the media’s future. For a while, particularly during the dot-com boom of the late 1990s, it even looked as though it might be a success story. It lured famous writers and tech-company investors and went public in 1999. At the time, Salon was valued at $107 million.

“I think it’s very similar to what a Vox or a Buzzfeed seems today,” said Kerry Lauerman, who joined Salon in 2000 and would serve as the site’s editor in chief from 2010 to 2013. “There was, at first, a lot of money and excitement about Salon. There was no one else, really, in that space. ... It was kind of a brave new world, and Salon was at the forefront.”

Over the last several months, POLITICO has interviewed more than two dozen current and former Salon employees and reviewed years of Salon’s SEC filings. On Monday, after POLITICO had made several unsuccessful attempts to interview Salon CEO Cindy Jeffers, the company dropped a bombshell: Jeffers was leaving the company effective immediately in what was described as an “abrupt departure.”

While the details of Salon’s enormous management and business challenges dominate the internal discussion at the magazine, in liberal intellectual and media circles it is widely believed that the site has lost its way.

“I remember during the Bush years reading them relatively religiously,” Neera Tanden, the president of the Center for American Progress, told POLITICO. “Especially over the last year, they seem to have completely jumped the shark in so many ways. They’ve become — and I think this is sad — they’ve definitely become like a joke, which is terrible for people who care about these progressive institutions.”

So, what happened? (...)

On social media, some have ridiculed the site for its questionable hot takes on the 2016 election and torqued-up lifestyle pieces that wouldn’t pass muster at any serious publication, like “Farewell, once-favorite organ: I am officially breaking up with my penis.” (...)

"We adopted a Huffington Post model, but we didn't have the resources to scale in a way that would've allowed for that kind of a model to actually work. We had 20 people, not 300,” one former staffer said. “I don't think Cindy ever realized that, and instead of modernizing within reason, while protecting the integrity of the brand — which was Salon's most valuable asset, by far — she decided to go full tilt for traffic, and it destroyed the brand."

by Kelsey Sutton and Peter Sterne, Politico |  Read more:
Image: Salon

Saturday, May 28, 2016

Boz Scaggs


[ed. See also: Loan Me A Dime]

Notifications Are Broken. Here's How Google Plans To Fix Them

Notifications suck. They're constantly disrupting us with pointless, ill-timed updates we don't need. True, sometimes they give us pleasure—like when they alert us of messages from real people. And sometimes they save our bacon, by reminding us when a deadline is about to slip by. But for the most part, notifications are broken—a direct pipeline of spam flowing from a million app developers right to the top of our smartphone screens.

During a frank session at the 2016 I/O developer conference, Google researchers Julie Aranda, Noor Ali-Hasan, and Safia Baig openly admitted that it was time for notifications to get a major design overhaul. "We need to start a movement to fix notifications," said Aranda, a senior UX researcher within Google.

As part of their research into the problem, Aranda and her colleagues conducted a UX study of 18 New Yorkers, to see how they interacted with notifications on their smartphones, what they hated about them, and what could be done to fix notifications in future versions of Android.

According to Google's research, the major problem with notifications is that developers and users want different things from them. Users primarily want a few things from notifications. First and foremost, they want to get notifications from people. "Notifications from other people make you feel your existence is important," said one of their research subjects, Rachael. And some people are more important than others, which is why notifications from people like your spouse, your mom, or your best friend are more important than a direct message on Twitter, or a group text from the people in your bowling league. In addition, users want notifications that help them stay on top of their life—a reminder of an upcoming deadline or doctor's appointment, for example.

But developers want something different from notifications. First and foremost, they design their notifications to fulfill whatever contract they feel they have with their users. So if you've designed an exercise app, you might alert someone when they haven't worked out that day; if you are a game developer, you might tell them when someone beats their high score. Yet according to Google, research suggests that the majority of users actively resent such notifications. And that's doubly true for the other kind of notifications developers want to send—notifications that essentially serve no function except to remind users that the app is installed on their phone. Google calls these "Crying Wolf" notifications and says they're the absolute nadir of notification design.
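[ed. The article stays at the level of UX research, but to make the developer-side tradeoff concrete, here is a minimal, hypothetical sketch (mine, not Google's) of how the exercise-app reminder described above could be posted as a quiet, low-priority notification rather than an interruptive one, using Android's standard NotificationCompat support library. The channel id, strings, and icon are placeholder assumptions; swapping PRIORITY_LOW for PRIORITY_HIGH in the same call is roughly what produces the buzzing, screen-topping interruptions the researchers describe.]

import android.content.Context
import androidx.core.app.NotificationCompat
import androidx.core.app.NotificationManagerCompat

// Post a daily "no workout yet" reminder that sits quietly in the notification
// shade: low priority requests no sound, vibration, or heads-up popup.
fun postWorkoutReminder(context: Context) {
    val notification = NotificationCompat.Builder(context, "reminders") // hypothetical channel id; on Android 8+ the channel must be created beforehand
        .setSmallIcon(android.R.drawable.ic_dialog_info)   // placeholder icon
        .setContentTitle("Time to move")
        .setContentText("No workout logged today.")
        .setPriority(NotificationCompat.PRIORITY_LOW)      // ask for unobtrusive delivery
        .setAutoCancel(true)                                // dismiss when tapped
        .build()
    // Note: Android 13+ also requires the POST_NOTIFICATIONS runtime permission.
    NotificationManagerCompat.from(context).notify(1001, notification)
}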

This disconnect between what users and developers want is so severe that users take extreme measures to get away from notifications. Google said that it's seeing more and more users forgoing installing potentially notification-spamming apps on their phones when they can access the same service through a website—where notifications will never be an annoyance. And this actually explains a lot about Google's interest in fixing the notifications problem, because people who aren't downloading Android apps aren't locked into the platform, and aren't spending money on Google Play. (...)

One surprising thing revealed by Google's research? Aranda says they found that people tended to open games, social networks, and news apps so often that notifications actually tended to drive users away from the apps, not vice versa.

by John Brownlee, Co-Design |  Read more:
Image: Google

Is Everything Wrestling?

The charms of professional wrestling — half Shakespeare, half steel-chair shots — may never be universally understood. Every adult fan of the sport has encountered those skeptics who cock their heads and ask, “You do know it’s fake, right?”

Well, sure, but that hasn’t stopped pro wrestling from inching closer and closer to the respectable mainstream. Last year, World Wrestling Entertainment announced a partnership with ESPN, leading to straight-faced wrestling coverage on “SportsCenter.” The biggest action star in the world, Dwayne Johnson, known as the Rock, got his start as an eyebrow-waggling wrestler. When the “Today” show needs a guest host, it enlists the WWE star John Cena to don a suit and crack jokes. No less an emblem of cultivated liberal intelligentsia than Jon Stewart recently hosted wrestling’s annual Summerslam, his first major gig since leaving “The Daily Show.” Wrestling may never be cool, but it is, at the very least, no longer seen as the exclusive province of the unwashed hoi polloi.

This is partly because the rest of the world has caught up to wrestling’s ethos. With each passing year, more and more facets of popular culture become something like wrestling: a stage-managed “reality” in which scripted stories bleed freely into real events, with the blurry line between truth and untruth seeming to heighten, not lessen, the audience’s addiction to the melodrama. The modern media landscape is littered with “reality” shows that audiences happily accept aren’t actually real; that, in essence, is wrestling. (“WWE Raw” leads to “The Real World,” which leads to “Keeping Up With the Kardashians,” and so forth.) The way Beyoncé teased at marital problems in “Lemonade” — writing lyrics people were happy to interpret as literal accusations of her famous husband’s unfaithfulness — is wrestling. The question of whether Steve Harvey meant to announce the wrong Miss Universe winner is wrestling. Did Miley Cyrus and Nicki Minaj authentically snap at each other at last year’s MTV Video Music Awards? The surrounding confusion was straight out of a wrestling playbook.

It’s not just in entertainment, either. For a while, it became trendy to insist that the 2016 presidential election, with all its puffed chests and talk of penis size, seemed more like a wrestling pay-per-view event than a dignified clash of political minds. In politics, as in wrestling, the ultimate goal is simply to get the crowd on your side. And like all the best wrestling villains — or “heels” — Donald Trump is a vivacious, magnetic speaker unafraid to be rude to his opponents; there was even a heelish consistency to his style at early debates, when he actively courted conflict with the moderator, Megyn Kelly, and occasionally paused to let the crowds boo him before shouting back over them. (The connection isn’t just implied, either: Trump was inducted to the WWE’s Hall of Fame in 2013, owing to his participation in several story lines over the years.) Ted Cruz’s rhetorical style, with its dramatic pauses, violent indignation and tendency to see every issue as an epic moral battleground, was sometimes reminiscent of great wrestling heels. The way Rick Perry called Trump’s candidacy a “cancer” that “will lead the Republican Party to perdition” before endorsing Trump and offering to serve as his vice president: this was a tacit admission that all his apocalyptic rhetoric was mainly for show. Pure wrestling, in other words. (...)

What the WWE does care about is keeping control of the way people experience “wrestling” — preferably not as the disreputable carny spectacle it once was, but as a family-friendly, 21st-century entertainment. When recapping wrestling history, it can completely elide the messier incidents: the sex scandals, shady deaths, neglected injuries, drug abuse and more. The audience, meanwhile, knows what the WWE cares about, and has enough knowledge of wrestling’s inner workings to analyze each narrative not just through its in-world logic (“this guy will win the championship because he seems more driven”) but by considering external forces (“this guy will win the championship because he is well-spoken enough to represent the company when he inevitably shows up on ‘Today’”). Parsing both those layers — the behavior and the meta-behavior, the story told and the story of why it’s being told that way — can be an entertainment in its own right, and speculating on creative decisions has long been a fascination for wrestling fans.

This is how a lot of fields work these days. The audiences and the creators labor alongside each other, building from both ends, to conceive a universe with its own logic: invented worlds that, however false they may be, nevertheless feel good and right and amusing to untangle. Consider the many ways of listening to the song “Sorry,” from Lemonade, in which Beyoncé takes shots at an unnamed woman referred to as “Becky with the good hair”: a person we’re led to believe is having a relationship with the singer’s husband. You can theorize about the real-world identity of “Becky with the good hair,” as the internet did. You can consider the context of the phrase (why “Becky”? why “good hair”?), as the internet did. You can think about why Beyoncé decided to make art suggesting that her real-life husband cheated on her. All of this will be more time-consuming, and thus be interpreted as more meaningful, than if she had said outright, “He did it.” (Or if we said, “It’s just a song.”)

The process of shaping a story by taking all these layers into account seems dangerously similar to what corporations do when they talk about “telling the story of our brand” — only as applied to real people and real events, instead of mascots and promotional stunts. On a spiritual level, it seems distasteful to imagine a living person as a piece being moved around on a narrative chessboard, his every move calculated to advance a maximally entertaining story line. But this is how it all too often works — whether plotted by the public figures themselves or by some canny handler (an adviser, a producer, a PR rep), everyone is looking to sculpt the narrative, to add just the right finishing touch.

So when I think of how politics and pop culture are often compared to wrestling, this is the element that seems most transferable: not the outlandish characters or the jumbo-size threats, but the insistence on telling a great story with no regard for the facts.

by Jeremy Gordon, NY Times |  Read more:
Image: Bill Pugliano/Getty Images

Randy Crawford & Joe Sample

David Sylvian & Takagi Masakatsu

The Blasé Surrealism of Haruki Murakami

[ed. I've read most of Murakami's books, including 1Q84.  As a stylist he's probably as good as anyone, but whenever I finish one of his books I invariably throw my hands up and go "Is that it?" The Atlantic had a pretty good summary a while ago of what a new reader might expect (slightly edited here):  Is the novel’s hero an adrift, feckless man in his mid-30s? Does he have a shrewd girl Friday who doubles as his romantic interest? Does the story begin with the inexplicable disappearance of a person close to the narrator? Is there a metaphysical journey to an alternate plane of reality? Are there gratuitous references to Western novels, films, and popular culture? Which eastern-European composer provides the soundtrack, and will enjoy skyrocketing CD sales in the months ahead—Bartók, Prokofiev, Smetana? Are there ominous omens, signifying nothing; dreams that resist interpretation; cryptic mysteries that will never be resolved? Check, check, check and check. In every book. I'm pretty much done with Murakami, which is too bad because I really do enjoy his writing.]

Three days ago, I began to read 1Q84 by Haruki Murakami. At 1157 pages, 1Q84 is a mammoth novel, a veritable brick of a book, similar in proportion to the unfinished copy of Infinite Jest that currently rests about 15 feet away from me.

In theory, 1Q84 is a poor choice of book for me right now. After all, 2015 has been a rather mild year for me, in terms of how many books I’ve read. In 2014, I think I read close to 30 books, but this year I’ve only finished 5 or 6 (I’ve partially completed several more).

I blame the Internet for this. Or, rather, I blame my own frequent inability to resist the gravity of Twitter or Facebook or Reddit or Instagram or Snapchat. I don’t even want to think about how many books I could have read this year, had my time on social media been replaced by time spent reading books. But, I guess, that’s what I chose to do, so I’ll take responsibility for my actions. I did, at least, manage to read a whole hell of a lot of great essays on the Internet (see this list and this list). (...)

But I’ve digressed somewhat from the intended topic of this essay: Haruki Murakami. Murakami is one such wizard whose works surround me, swallow me, permeate my being, and transport me to worlds that feel no less real than the one in which I’m typing these words.

And so far, his magnum opus, 1Q84, is no exception. As I mentioned earlier, 1Q84 was arguably a poor choice of book for me to begin reading at this time. (...)

So, yes, this renewed tripod of habits is helping me to read more. But that’s not the only catalyst. I correctly suspected that Murakami could draw me back into book-reading because he is a writer who seems unfailingly to write irresistible page-turners. There are a few reasons for this that I can see.

For one, I swear he’s discovered the Platonic ideal combination of steady pacing and incomplete-yet-tantalizing information. Having completed four of his novels now, I can tell you that his novels always seem to revolve around some mystery that needs to be solved, and he does an excellent job of hinting at the grandiose and ominous nature of the mystery within the first few pages, while providing almost no information regarding the mystery’s actual attributes or dimensions. As the novels progress, he gradually reveals the mystery’s shocking, sprawling architecture and all-penetrating implications, dispensing just enough detail at just the right intervals to keep his readers (or, me at least) hopelessly ensnared.

The protagonists in Murakami’s novels tend to be ordinary, solitary people who suddenly find themselves wrapped up in some sort of epic, supernormal circumstances and must undertake a quest that is as much a quest to the heart of their true identity as it is a quest through the external world.

Another trademark of Murakami’s is something I call Blasé Surrealism (let me know if you think of a better name or if one already exists). Blasé Surrealism is characterized by melding mundane, humdrum realism with elements of surrealism and magic realism, while also incorporating (in Murakami’s case, at least) abstract, metaphysical commentary/comparisons intermittently throughout the story. Murakami’s stories are told in a matter-of-fact tone, as if everything that is happening is quite commonplace, and much of it is. But then he’ll nonchalantly introduce a portal to an alternate reality or include a line like, “Hundreds of butterflies flitted in and out of sight like short-lived punctuation marks in a stream of consciousness without beginning or end.”

He’s so casual, so blasé, about this, that the flow of the story isn’t interrupted. Strange, surreal things happen in Murakami novels, but they seem completely natural because he acts like they are. The reader just goes right along with him. Thus Murakami manages effectively to marry normal and abnormal, real and surreal, conventional and magical. The best comparison I can make is to say that reading a Murakami novel is like being in a dream, in that things are clearly off, clearly not the way they typically are, and yet one doesn’t really notice or care, accepting things at face-value. This makes for a uniquely mind-stirring, almost psychedelic, reading experience.

by Jordan Bates, Refine the Mind |  Read more:
Image: 1Q84

Why I Bought a Chromebook Instead of a Mac

Chromebooks have surpassed sales of Mac laptops in the United States for the first time ever. And that doesn’t surprise me. Because roughly a year ago I made the same switch. Formerly a lifelong Mac user, I bought my first PC ever in the form of a Chromebook. And I’m never looking back.

Driven by the kind of passion that can only be found in the recently converted, I have aided and abetted friends in renouncing the sins of gluttony and pride uniquely found in the House of Apples. I have helped them find salvation with the Book of Chrome. Glory be the Kingdom of Chrome, for your light shines down upon us at a quarter of the price.

Make no mistake, I grew up on Macs. The first computer I remember my Dad bringing home when I was 5 years old was a Mac. Our family computer throughout the 1990s was a Mac. I used that Mac Performa throughout middle school, and it gave me treasured memories of playing Dark Forces and first discovering the internet. My high school graduation present from my parents in 2002 was my first Mac laptop. And I would continue to buy Mac desktops and laptops for the next decade and a half.

But something happened about a year ago when my MacBook Air was running on fumes. I looked at the Macs and gave my brain a half-second to entertain other options. I owned a functioning Mac desktop, which is my primary machine for heavy lifting. But I started to wonder why I wasn’t considering alternatives for my mobile machine.

The biggest consideration was price. When all was said and done, even the cheapest Mac laptop was going to set me back about $1,300 after taxes and AppleCare. And the siren song of a computer under $200 was calling my name. I got the Acer Chromebook with 2GB of RAM and a 16GB drive. It cost a shockingly low $173. And it was worth every penny. It even came with 100GB of Google Drive storage and twelve GoGo inflight internet passes. If you travel enough, the thing literally pays for itself in airline wifi access.

I rarely have to edit video and my photo manipulation needs are minimal. So when I walk down to the coffee shop to work, what the hell do I need to do that can’t be done on a Chromebook? Nothing, is the answer. Precisely nothing. And if you’re being totally honest with yourself, you should probably ask the same question.

Computers have essentially become disposable, for better and for worse. We’ve seen this trend in electronics over the past decade and it’s a great thing from the perspective of American consumers. More people can afford e-readers and tablets that now cost just $50. The mid-2000s dream of “one laptop per child,” which sought to bring the price of mobile computers down to $100, has become a reality thanks to Chromebooks and tablets made by companies like Acer, HP, and Amazon. And with more and more of our computing needs being met by web browsers alone, the average consumer is seeing less incentive to buy a Mac.

This trend should obviously terrify Apple. Computers have become fungible commodities, just like HDTVs before them. Which is to say that the average American doesn’t view a TV as a high-tech purchase that requires much homework these days. Any TV will do. Look at the screen and look at the price. Does it look like a TV? Yep. Is it cheap? Double yep. Whip out the credit card.

by Matt Novak, Gizmodo |  Read more:
Image: Shutterstock/Acer

Our Nightmare

Most of us, I imagine, are not consistent political optimists or pessimists. We instead react – and usually overreact – to the short-term political trends before us, unable to look beyond the next election cycle and its immediate impact on ourselves and our political movements. I remember, immediately after the re-election of George W. Bush in 2004, a particularly voluble conservative blogger arguing that it was time for conservatives to “curb stomp” the left, to secure the final victory over liberals and Democrats. Four years later, of course, a very different political revolution appeared to be at hand, and some progressives made the same kind of ill-considered predictions. Neither permanent political victory has come to pass, with Democrats enjoying structural advantages in presidential elections and Republicans making hay with a well-oiled electoral machine in Congressional elections. How long those conditions persist, who can say.

But partisan politics are only a part of the actual political conditions that dictate our lives. Politics, culture, and economics fuse together to create our lived experience. And that experience is bound up in vague but powerful expectations about success, what it means, and who it’s for. There is a future that appears increasingly likely to me, a bleak future, and one which subverts traditional partisan lines. In this future, the meritocratic school of liberalism produces economic outcomes that would be at home with laissez faire economic conservatives, to the detriment of almost all of us.

The future that I envision amounts, depending on your perspective, to either a betrayal of the liberal dream or its completion. In this future, the traditional foundations of liberalism in economic justice and redistribution are amputated from the push for diversity in terms of race, gender, sexual identity, and related issues. (...)

Traditionally, both equality and diversity have been important to liberalism. There are obvious reasons for this connection. To begin with, the persistent inequality and injustice that afflict people of color and women in our society are powerfully represented in economic outcomes, with black and Hispanic Americans and women all suffering from clear and significant gaps in income, wealth, and similar measures of economic success. Economic justice is therefore inseparable from our efforts to truly combat racial and gender inequality. What’s more, the moral case for economic justice stems from the same foundations as the case against racism and sexism, a profound moral duty to provide for all people and to ensure that they live lives of material security and social dignity. The traditional liberal message has therefore been to emphasize the need for diverse institutions and economic justice as intertwined phenomena.

In recent years, however, the liberal imagination has become far less preoccupied with economic issues. Real-world activism retains its focus on economic outcomes, but the media that must function as an incubator of ideas, in any healthy political movement, has grown less and less interested in economic questions as such. Liberal publications devote far less ink, virtual or physical, to core issues of redistribution and worker power than they once did. Follow prominent liberals on Twitter, browse through the world of social justice Tumblr, read socially and culturally liberal websites. You might go weeks without reading the word “union.” Economic issues just aren’t central to the political conceptions of many younger liberals; they devote endless hours to decoding the feminism of Rihanna but display little interest in, say, a guaranteed minimum income or nationalizing the banks. Indeed, the mining of pop-cultural minutiae for minimally plausible political content has become such a singular obsession within liberal media that it sometimes appears to be crowding out all other considerations. (...)

As The American Conservative’s Noah Millman once wrote, “the culture war turns politics into a question of identity, of tribalism, and hence narrows the effective choice in elections. We no longer vote for the person who better represents our interests, but for the person who talks our talk, sees the world the way we do, is one of us…. And it’s a good basis for politics from the perspective of economic elites. If the battle between Left and Right is fundamentally over social questions like abortion and gay marriage, then it is not fundamentally over questions like who is making a killing off of government policies and who is getting screwed.” The point is not that those culture war questions are unimportant, but that by treating them as cultural issues, our system pulls them up from their roots in economic foundations and turns them into yet another set of linguistic, symbolic problems. My argument, fundamentally, is that we face a future where strategic superficial diversity among our wealthy elites will only deepen the distraction Millman is describing. Such a future would be disastrous for most women and most people of color, but to many, would represent victory against racism and sexism.

by Fredrik deBoer |  Read more:
Image: Getty

The Persian Rug May Not Be Long for This World

For centuries, Iran’s famed carpets have been produced by hand along the nomad trail in this region of high plains around the ancient city of Shiraz.

Sheep, grazed in high mountain pastures and shorn only once a year, produce a thick, long wool ideal for the tough thread used in carpet making.

But high-quality production of hand-woven carpets is no longer sustainable on the migration route of the nomads, said Hamid Zollanvari, one of Iran’s biggest carpet makers and dealers.

Instead, he had built a factory with 16 huge cooking pots, where on a recent cool, sunny spring day men in blue overalls stirred the pots with long wooden sticks, boiling and coloring the thread. As the colored waters bubbled, they looked like live volcanos. The air smelled of sheep.

Another room was stacked with herbs. Eucalyptus leaves, indigo, black curd, turmeric, acorn shells and alum, ingredients for the different colors. “The Iranian carpet is 100 percent organic,” Mr. Zollanvari declared. “No machinery is involved.”

It is a scene that seems as ageless as the women who sit before the looms and weave the rugs, a process that can take as long as a year. And now even the factory is threatened. With six years of Western sanctions on the carpet business and punishing competition from rugs machine-made in China and India, these are hard times for the craft of Persian rug making. Many veterans wonder whether it can survive.

Over the centuries invaders, politicians and Iran’s enemies have left their mark on Iran’s carpets, said Prof. Hashem Sedghamiz, a local authority on carpets, sitting in the green courtyard of his restored Qajar-dynasty house in Shiraz. The outsiders demanded changes, started using chemicals for coloring and, most recently, imposed sanctions on the rugs. Those were blows, he said, damaging but not destructive.

But now, Mr. Sedghamiz said, the end is near. Ultimately he said, it is modernity — that all-devouring force that is changing societies at breakneck speed — that is killing the Persian carpet, Iran’s pride and joy. “People simply are no longer interested in quality.”

Or in paying for it, he might have added. (...)

One thing is for sure: Iran’s carpets are among the most complex and labor-intensive handicrafts in the world.

It is on the endless green slopes of Fars Province, in Iran’s heartland, that the “mother of all carpets,” among the first in the world, is produced: the hand-woven nomadic Persian rug.

The process starts with around 1.6 million sheep grazed by shepherds from the nomadic Qashqai and Bakhtiari tribes, who produce that tough, long-fibered wool so perfect for carpets.

Women take over from there, making thread from the wool by hand, twisting it with their fingers. The finished thread is bundled and then dyed, using natural ingredients like pomegranate peels for deep red or wine leaves for green. After days of boiling over a wood fire, the threads are dried by the cool winds that blow in from the north each afternoon.

Only then does the weaving start. Weavers, almost all of them women, spend several months to a year bent over a horizontally placed loom, stringing and knotting thousands of threads. Some follow established patterns, some create their own. When the carpet is finally done, it is cut, washed and put out in the sun to dry.

“It’s so time consuming, real hand work,” said Mr. Zollanvari, the carpet dealer. “A labor of love. And what does it cost?” he asked, before answering the question himself: “Almost nothing.” A 6-by-9-foot handwoven carpet costs around $400 in Shiraz, depending on the pattern and quality.

by Thomas Erdbrink, NY Times |  Read more:
Image: Newsha Tavakolian

Six True Things About Dinner With Obama


Bun Cha is a typical Hanoi dish, decidedly everyday, and much loved by locals. To the consternation, no doubt, of the Secret Service (who were very cool about it), I was recently joined for dinner by the leader of the free world in a working-class joint near the old quarter of town for an upcoming episode of Parts Unknown.

by Anthony Bourdain, Li.st | Read more:
Image: uncredited

Thursday, May 26, 2016

Trail Blazing

For a great long while, I thought there was only one kind of bud: whatever the fuck was available. The first time I smoked weed (and by “smoked weed” I mean “got high”), I was by most accounts pretty old — twenty-two. There had been two earlier, rather desultory attempts. Once, at a bonfire on Repulse Bay Beach in Hong Kong when I was fifteen (Hong Kong is renowned for several things, but marijuana is not one of them), and another time in Texas, in the garage of some skater dude who was a year older, very hot, and had an identical twin I would’ve gladly settled for. I was green, the weed less so.

The first time I ever smoked successfully, I was working in Brooklyn, in the dead of winter, for profoundly exploitative wages. On the upside, the job happened to come with a young, chill boss who daily smoked two blunts wrapped in Vanilla Dutch Masters, and was fairly generous about sharing. The weed was dopey, didn’t have a name, and helped temper the indignation I felt trekking ninety minutes with two train changes and a bus ride — in the snow — to get to work. That was thirteen years ago.

By the time I moved to California in my thirties, weed was becoming legal, and I secured a cannabis card for dubious medical reasons and credible recreational ones. I learned there was not only a dazzling kaleidoscope of marijuana strains to choose from, but that, depending on my hankering, I could calibrate the weed to my desired vibe. What a time to be alive! No more feeling catatonic on a dinner date or hyper-social and chatty at the movie theater — I was on the path to finding The Perfect High. Not, like, One High to Rule Them All, but more like, the superlative vibe for every chill sitch in my life. The perfect high, of course, is largely subjective. We’re all physiological snowflakes with wildly differing operating systems. It’s why some people can have a grand time on edibles (me) but other people (my best friend Brooke) go bat-shit crazy, curling up in the fetal position until the mania subsides.

There are significant differences in how the body metabolizes the nearly one hundred different cannabinoids present in cannabis. Phytocannabinoids, found in cannabis flowers, are the chemical compounds that we respond to. (We also produce cannabinoids in our bodies — called endogenous cannabinoids or endocannabinoids). The cannabinoid system is old, I mean ancient; even worms respond to cannabinoids. It regulates a bunch of basic processes in our bodies — the immune system, memory, sleep and inflammation. We have cannabinoid receptors in all sorts of places.

You guys: we’re basically designed to get high.

Of all the cannabinoids in cannabis, THC (Tetrahydrocannabinol) and CBD (Cannabidiol) are the most famous, with the prevailing agreement that THC is heady and CBD is about the body high. But it’s the ninety-odd other cannabinoids acting in concert with them that make each high unique. This synergistic effect — the harmonious interplay, and the permutations of cannabinoids — is what makes each strain so darned mysterious. Elan Rae, the in-house cannabis expert for Marley Natural (the official Bob Marley cannabis brand), described the “entourage effect,” as it’s called, as “the combined effect of the cannabinoid profile. It doesn’t allow you to specifically ascribe an effect to one cannabinoid.” To wit: it’s not the amount of THC that gets you high, but how it reacts with a slew of other cannabinoids.

So while you may not know the exact chemistry of why you’re getting a certain type of high, it stands to reason that you can use guidelines to land in the neighborhood of the high you’re after. Think of it this way: you want a kicky, effervescent vinho verde for picnics or beaches, a jigger of bourbon for cozy autumnal nights, and nineteen pitchers of pre-mixed margarita if you want a pernicious hangover to cap off an evening of homicidal mania and sexual regret. Similarly, you’ll want a playful, low-impact Sativa for an al fresco activity, and an Indica or Indica-dominant hybrid for cuffin’ season.

And what exactly is the difference between Indica and Sativa? Within the Cannabis genus, they are two separate species. Pretty much everything we smoke is one, the other, or a hybrid of the two. Indicas are mellower and harder-hitting, perfect for Olympiad-level chilling after a long day. They’re often prescribed to people who have trouble sleeping or need to manage pain. The plant phenotypically tends to be shorter and bushier, with thicker individual leaves. Sativas, on the other hand, tend to be neurologically wavier, generally better for a daytime high. They make most of us feel alert, and they’re excellent for idea generation, provided you don’t fall into too many disparate wormholes. The flower looks like the platonic ideal of weed; it’s the kind you get on a pair of Huf socks, or embroidered onto a red, gold, and green hat.

To say there’s a weed for every occasion is an understatement. Much as German has a noun for every feeling, there’s an exact cannabis strain to complement “sentimental pessimism” or the “anguish one feels when comparing the shortcomings of reality to an idealized state of the world.” Some weed is built for fucking, and other weed is for ugly-crying at 4AM at season two of BoJack Horseman because you relate way too hard to an anthropomorphized cartoon horse and his drinking problem. (No judgment.)

It is with this knowledge, clear eyes, and a full heart that I went to my reputable Los Angeles medical center (not to be confused with any old run-of-the-mill bongmonger) and secured eight strains to try: Platinum Jack, XJ13, Dutch Treat, Pineapple Express, J1, Gorilla Glue, Berner’s Cookies and NorCal OG.

by Mary H.K. Choi, The Awl |  Read more:
Image: Retinafunk