Friday, September 25, 2015
In Memoriam: Yogi Berra
As a boy of 8 and 9 and 10, growing up in the Bronx, I was a big New York Yankees fan. When you grow up in the Bronx, that’s really all there is to brag about. A zoo and the Yankees.
Nearly every game aired on channel 11 WPIX, and I watched as many as I could, which was nearly all of them.
The Yankees are by far the most successful team in the history of American sports. Not even close. They’re probably the most successful team in the world. For this reason, rooting for the Yankees has often been equated with rooting for a large, wealthy corporation like IBM or GM. I’ve always thought it’s a very poor analogy.
Rooting for the Yankees is actually like rooting for the United States. Each in their own way, the Yankees and United States are the 300 lb. gorilla, that most powerful of entities winning far more than anyone else. Their wealth creates many advantages. Supporters expect them to win, and they usually do. Opponents absolutely revel in their defeats.
All that success means you will be adored by some non-natives who are tired of losing and want to bask in your glory, even if it must be from afar. But mostly you are hated. Anywhere you go in America, some people love the Yankees and many more hate them. Just like the United States is either loved or hated everywhere else in the world.
Who hates IBM?
And just as U.S. history, so stuffed with victory, is chock full of famous figures, so too is Yankee lore replete with famous men in pinstripes.
There are 53 former Yankee players, managers, and executives in the Baseball Hall of Fame, just over 1/6 of the Hall’s total membership.
Can I name them all? Of course not. That’s like naming all the presidents. I have a Ph.D. in history and I still get bogged down once I reach the 1840s (who comes after Van Buren?), and can’t resume a steady line until I re-emerge with Buchanan in 1856; you know, the guy before Lincoln.
For the average person, there are the biggies: Washington, Lincoln, a couple of Roosevelts and so forth.
For Yankee fans, naming their club’s Hall of Famers is actually tougher than naming presidents. There have only been 43 presidents. So most fans know a bunch but not all of them, and then everyone knows the biggies, the Washingtons and Lincolns of baseball.
You don’t have to be a Yankees fan. Hell, you don’t even have to know anything about baseball. You’ve all heard of these guys because they transcend baseball. They’re part of American culture.
Babe Ruth, Lou Gehrig, Joe DiMaggio, Mickey Mantle, and Yogi Berra. Those five.
Ruth is probably the single greatest baseball player of all time and still the most famous American athlete who ever lived; we’ll see how famous Michael Jordan is nearly 70 years after his death. Gehrig’s got a disease named after him and hardly anyone knows its actual name (amyotrophic lateral sclerosis). DiMaggio became a memorable lyric in a seminal Simon and Garfunkel song thirty years after he was the topic of his own hit song. The Mick’s boyish good looks and runaway success made him a poster boy of mid-century American baby boomer aspirations. And Yogi had a cartoon bear named after him.
Yogi also said all that stuff. Things you’ve heard that you may or may not have realized he said. Or stuff you thought he said that he may not have said.
Best known is “It ain’t over til it’s over,” which is among the most famous of American axioms, and which he actually said, while managing the New York Mets in 1973. But there are a lot of others.
Or, as only Yogi could put it, speaking to the phenomenon of misattribution: “I really didn’t say everything I said.”
- When you come to a fork in the road, take it (giving directions to his home).
- You can observe a lot by just watching.
- No one goes there nowadays, it’s too crowded (speaking of the Copacabana nightclub).
- Baseball is ninety percent mental and the other half is physical.
- A nickel ain’t worth a dime anymore.
- Always go to other people’s funerals, otherwise they won’t come to yours.
- We made too many wrong mistakes.
But he really did say that.
by The Public Professor | Read more:
Image: uncredited
Thursday, September 24, 2015
Sex: The Kotaku Review
If you’re already a fan of Sex—and there are plenty of you out there—you probably don’t need this review. But if you find yourself on the fence about whether to try this much-heralded, much-argued-over activity, pull up a chair! We’ve got a lot to discuss.
Like many other extended franchise juggernauts, Sex has been around in some form or another for a long time. Originally released as an open-source application and carefully iterated upon over the years, it’s been through its fair share of reimaginings, reboots, and back-to-basics redesigns. Today’s Sex is the most technically advanced version yet, but as we all know, it takes more than eye-popping visuals and high-tech peripherals to make for a truly meaningful experience.
Sex is best understood as a freeform co-op experience where partners work together to achieve one or more user-defined goals. It’s most often played in groups of two, but sometimes more (or less). Broadly speaking, each match-up follows a similar structure–all players are helping one another to achieve a similar goal, and if they work well together, every player can “win.” Take a closer look, though, and you’ll see how creative Sex teams can be, combining inventive techniques with high-level mechanical mastery to achieve unusual but no less satisfying victories.
Aficionados will be pleased to hear that Sex’s visual presentation is as great as ever–even though it doesn’t seem to have progressed much as of late. Then again, why mess with something that’s already working so well? Today’s Sex features advanced graphical techniques like soft body physics and subsurface scattering; these were incredible when they were first introduced, and they stand the test of time. But with technological innovations coming faster than ever and innovative new VR technology on the horizon, it’ll be important for Sex to step up its technology in the coming years to keep pace.
As true gamers know, it’s gameplay that matters most. The mechanics undergirding Sex are deceptively simple–even if you’ve never played, you probably already understand the fundamentals. There’s some stroking, and sliding, and slapping, and smacking, and, well, you know. All of that. The beauty of Sex is that those basic actions can be combined in all sorts of interesting ways. Sex embraces what game designers call the property of “emergence,” i.e. the designed opportunity for varied combinations of simple components to create a complex end result.
Despite those strong fundamentals, Sex is not without its share of technical issues. Sex can, and often does, fall prey to many of the same kinds of bugs and glitches we’ve seen in other multiplayer games: synchronization errors, dropped connections, poor response times, and the like. Some people seem to wait around forever in the matchmaking lobby, never getting to the actual game.
by Matthew S. Burns, Kotaku | Read more:
Image: Shutterstock
Wednesday, September 23, 2015
Disconfirming Books
Yesterday The New York Times had a fascinating piece about how ebook sales, contra Aggregation Theory, are actually declining even as publishers and book stores are thriving on the back of print:
Five years ago, the book world was seized by collective panic over the uncertain future of print. As readers migrated to new digital devices, ebook sales soared, up 1,260 percent between 2008 and 2010, alarming booksellers that watched consumers use their stores to find titles they would later buy online. Print sales dwindled, bookstores struggled to stay open, and publishers and authors feared that cheaper ebooks would cannibalize their business…
But the digital apocalypse never arrived, or at least not on schedule. While analysts once predicted that ebooks would overtake print by 2015, digital sales have instead slowed sharply. Now, there are signs that some ebook adopters are returning to print, or becoming hybrid readers, who juggle devices and paper. Ebook sales fell by 10 percent in the first five months of this year, according to the Association of American Publishers, which collects data from nearly 1,200 publishers. Digital books accounted last year for around 20 percent of the market, roughly the same as they did a few years ago.
Ebooks’ declining popularity may signal that publishing, while not immune to technological upheaval, will weather the tidal wave of digital technology better than other forms of media, like music and television.
First off, I’m not necessarily surprised that publishers haven’t all gone bankrupt en masse. Much like the music labels, publishers have always provided more than distribution, including funding (using a venture capital-like process where one hit pays for a bunch of losers), promotion (discovery is the biggest challenge in a world of abundance, and breaking through is expensive), and expertise (someone needs to do the editing, layout, cover design, etc.). And, as long as there is any print business at all, distribution still matters to a degree given the economics of writing a book: very high fixed costs with minimal marginal costs, which dictates as wide a reach as possible.
Still, none of this explains why ebooks have been stopped in their tracks, and that’s where this discussion gets interesting: not only is it worth thinking about the ebook answer specifically, but also are there broader takeaways that explain what the theory got wrong, and how it can be made better?
EBOOK LESSONS TO BE LEARNED
I think there are three things to be learned from the plateauing in ebook sales:
Price: The first thing to consider about ebooks — and the New York Times’ article touches on this — is that they’re not any cheaper than printed books; indeed, in many cases they are more expensive. The Wall Street Journal wrote earlier this month:
When the world’s largest publishers struck e-book distribution deals with Amazon.com Inc. over the past several months, they seemed to get what they wanted: the right to set the prices of their titles and avoid the steep discounts the online retail giant often applies. But in the early going, that strategy doesn’t appear to be paying off. Three big publishers that signed new pacts with Amazon—Lagardere SCA’s Hachette Book Group, News Corp’s HarperCollins Publishers and CBS Corp.’s Simon & Schuster—reported declining e-book revenue in their latest reporting periods.
Pricing is certainly an art — go too low and you leave money on the table, go too high and you lose too many customers — and there is obviously a case to be made (and Amazon has made it) that in the case of books there is significant elasticity (i.e. price has a significant impact on purchase decisions). Then again, while e-book sales have fallen, they’ve stayed the same percentage of overall book sales — about 20% — which potentially means that the price change didn’t really have an effect at all (more on this in a bit).
What is more interesting about the pricing issue, though, is that the publishers have removed what is traditionally one of digital’s advantages: that it is cheaper. That means the chief advantage of ebooks is that they are more convenient to acquire and store, and that’s about it. And, by extension, that raises the question about just how much lower prices play a role in the success of other aggregators.
User Experience: Note what is lacking when it comes to ebooks’ advantages: the user experience. True, some people certainly prefer an e-reader (or their phone or tablet), but a physical book has its advantages as well: relative indestructibility, and little regret if it is destroyed or lost; tangibility, both in regards to feel and in the ability to notate; the ability to share or borrow; and, of course, the fact a book is an escape from the screens we look at nearly constantly. At the very best the user experience comparison (excluding the convenience factor) is a push; I’d argue it tilts towards physical books.
This is in marked contrast to many of the other industries mentioned above. When it comes to media, choosing a show on demand or an individual song is vastly preferable to a programming guide or a CD. Similarly, Uber is better than a taxi in nearly every way, particularly when it comes to payments; Airbnb offers far more selection and rooms that simply aren’t possible through hotel chains; Amazon has superior selection and superior prices, with delivery to your doorstep to boot. It’s arguable the user experience is undervalued in my Aggregation Theory analysis.
Modularization: Notice, though, that there is something in common to all of my user experience examples: what matters is not only that the aggregators are digital, but also that they broke up the incumbent offering to its atomic unit. Netflix offered shows, not channels; first iTunes then Spotify offered songs, not albums; Uber offered the ability to charter individual cars on-demand; Airbnb offered rooms, not hotels; Amazon offers every product, not just the ones that will fit in a bricks-and-mortar retail store.
Ebooks, on the other hand, well, they’re pretty much the same thing as physical books, except they need an expensive device to read them on, while books have their own built-in screen that is both disposable and of a superior resolution (no back-lighting though).
by Ben Thompson, Stratechery | Read more:
Image: Stratechery
When Dinner Proves Divisive: One Strategy, Many Dishes
[ed. See also: Best Weeknight Recipes]
Back when I cooked only to please myself and one or two other consenting adults, choosing recipes was a breeze. Nothing was off limits. Dishes with olives, stinky cheeses, bitter greens and mushrooms — sometimes all of the above — were on regular rotation. Then I began cooking for kids (picky, omnivorous and otherwise). With them came their nut-allergic friends, vegan guitar teachers and chile-fearing in-laws. Forced to adapt my NC-17 cooking style to a G-rated audience, I paged through cookbooks in search of “crowd pleasers” that proved elusive.

Eventually, I realized that the quest for a perfect recipe that pleases everyone at the table, including oneself, was fruitless.
But in the process, a workaround solution emerged: recipes that could be configured to produce many different dishes at one meal. Like Transformers or fantasy football teams, these meals are both modular and complete, constructed from parts that can be added or subtracted from at whim.
Suddenly, my weeknight repertoire increased exponentially. It’s easier on the cook when the week assumes a familiar pattern — pasta one night, a main-course salad another night, beans on a third — but to prevent boredom, the dishes themselves needn’t be exactly the same. (Unless, of course, the culinary conservative in your household demands otherwise.)
Just like taco night or baked-potato night, the meal starts with a base element: pasta, beans, fluffy greens. After that, it’s about piling on, or politely passing along, the garnishes.
The definition of a garnish may need some stretching: This is not a shy sprinkling of parsley or a scattering of sesame seeds. The garnish that makes a meal must be full-throated and filling. Half of a ripe avocado is a garnish. Likewise, a soft-yolk egg (boiled, poached or fried). Bacon lardons, shredded chicken and diced steak. Crushed chiles and leftover roasted vegetables. With enough garnishes, even the plainest of plain foods — pasta with butter and cheese — can balloon into a lively meal.
by Julia Moskin, NY Times | Read more:
Image: Melina Hammer
Tuesday, September 22, 2015
Death to the Internet! Long Live the Internet!
Net neutrality, cultural decay, the corporate web, classism, & the decline of western civilization — all in one convenient essay!
Even now it’s a struggle to clearly remember that ecstatic time of positive internet esprit de corps before money and narcissism utterly dominated the culture. Those ancient ‘90s to early oughts before endlessly aggressive advertising, encyclopedic terms of service, incessant tracking, the constant need to register everywhere, subversive clickbait, the legions of trolls, threats of doxxing, careers ended by a single tweet, and all those untiring spam bots which attempt to plague every digital inch of it.
Difficult to explain to anyone under twenty-five who did not directly witness the foundational times. Or anyone over twenty-five who did not participate. Or to anyone right now who uses only Facebook and Amazon. That lost age has become the Old West of the internet: a brief memory before once verdant lands were dominated and overrun by exploitative business interests and ignorant bumbling settlers. You can’t go back, and there’s no museum for an experience. That early culture was ineffable and fleeting. Not unlike, say: the concept of lifetime job security, which no longer even seems plausible.
Now, of course, plenty of happy and creative people still use the internet (at least, to like, buy an appliance or a book or something) but they don’t make up most of internet culture; that majority of online participation which sets the social standards, creates the original content, and is now broadly, inescapably corralled by social media. Those who spend more than 20 hours a week actively participating online (like me) are forced into the corporate tide, or relegated to the sluggish unknown hinterlands. (...)
Need we wonder why the book “Nineteen Eighty-Four” remains so relevant? Even thirty years after Steve Jobs commemorated the futuristic date by ironically pretending to destroy the entrenched corporate power structure. The same man who turned out to be one of the most proprietary-minded technologists ever to influence popular computing culture. The person who cemented the sale of style over utility, which continues to unendingly trick people. Selling the trappings of refined taste instead of core pragmatism. Like how the classic campaign to “Think Different” fetishized intellectual and artistic rebellion in order to ironically sell a mass-market consumer product. And it worked amazingly. People have been strongly influenced to desire a unique personal experience and an individualized version of success instead of a shared communal growth. So in this fragmented and increasingly de-localized culture, everyone becomes the protagonist of their own little narcissistic adventure instead of a powerful collective assisting each other for the greater good. And because not everyone can be that one-in-a-billion genius, much existential disappointment has been ingrained once it was set as the highest goal.
This is advantageous to business interests because unsatisfied people are more susceptible to the sale of solutions to combat unhappiness. And this emotional and cultural development also makes it easier to dehumanize others, to be jealous of their successes, and feel left out when not receiving high accolades. Creating the much lamented vicious cycle of kindergarten graduation ceremonies and participation trophies which has wrought the most egotistical generation ever recorded. It also has an oligarchic benefit of justifying power held in the small circles of the moneyed class, because success, even if born into, is often assumed to be deserved.
So it’s no coincidence that wealthy special interests have gained massive control over democracy by incentivizing and preaching the supremacy of individual gains over communal interests. Unlike a more simplistic fascism, this grants minority power to the upper class by motivating the populace to work hard towards individual goals and individual distractions without requiring the classic top-down crushing social conformity which is more obvious and easier to fight. Instead, the insidious dreams of grand individual success, in spite of all contrary indications, keep everyone’s broader rewards lowered. It’s like a lottery for human desires: many pay in and get essentially nothing while a tiny few win it all so as to demonstrate it is supposedly possible. Justified elite power is the cultural root of corruption, as Thomas Jefferson ironically understood, and must be fought with repeated revolution.
We all recognize a nebulous natural cynicism these days found not only in the post-apocalyptic and zombie fictions so symbolically appealing to our collective unconscious, but also in the simple facts of a historically deadlocked legislature, a rampantly scare-mongering media, the rise once again of an excessively wealthy upper class, and the corruption of debt-based higher learning. That last being perhaps the most intellectually disheartening, as the ivory tower repeatedly demonstrates its moral bankruptcy by a reliance on horrific levels of tuition, exploitive wasteful sporting, shoddy oversight of publishing, general lack of moral center, and a scattered vision for the future (pigeon-holed rather correctly by conservatives as often out of touch). Much could perhaps be excused by the inevitable corruption of institutionalization, but where is the forethought of previous generations? Why must we rely on impulsive social media and a polarized profit-oriented mass media for our appraisals of the future?
If Obama’s unpredictable election proved anything it’s that positive ideological movements are so frightening to the moneyed establishment they’ll foster complete obstruction to thwart even the simple belief that hope and change are actually possible. Generating cynicism aids complacency, because it’s difficult for a person dealing with all their own daily struggles to constantly study the complex system and renew the idealism required to force political change, especially during periods of nominally acceptable economic stability. Revolutions are motivated by hunger and heavy oppression, generally years after the slow and determined rise of a stratified class system (a pattern which has plagued us since the dawn of civilization).
For thirty years now capitalism’s trickle-down variant has been systematically attempting to recreate an intransigent system of wealth and privilege. Conservative propaganda has assured us that if the rich succeed, everyone benefits. But how long must this ludicrous delusion be perpetuated? Is not the entire history of civil humanity a testament to the popular misery of allowing an upper class minority to rule? This should be especially poignant in a country which was designed to break hereditary dominance and unrepresentative power. Yet here we are again, watching civilization repeat its famous pattern, locking the populace into hard work and distraction without sharing in the full rewards. America chugs along with its bread and circuses, like a late-season Happy Days episode, where the original magic is gone but the characters continue acting out a hollow version of the thing we used to love and cherish. So goes sitcoms… so goes the world wide web… so goes civilization…
The rise of an entirely corporate internet is just one more idealistic casualty of allowing the amoral dollar to inform every aspect of our lives. Market efficiencies, so touted by the right, can generate competition between otherwise possible monopolies, but function best only in fields of limited and uncoordinated resources. They are not necessitated to everything, and especially something as nearly immaterial and gigantic as cyberspace, where supply and demand do not function normally; a place where capitalism has often struggled to find what it can sell. Where demand has to be generated artificially with subtle and disguised viral marketing to trick and deceive us. The newest things you didn’t know you needed but all the cool kids have. Since wealth expands to dominate all emerging cultural forms, it works to control even the nearly limitless virtual environments formed of patterned energy and communal human consciousness.
In the same manner that liberty gets subsumed for security, creativity often dies upon the altar of sales. Advertising’s goal is convincing and deceiving, not compassion. It is the art of propaganda and should constantly be doubted. Excessive needs, worries, and calamities are fostered so that new cures and products can be sold. Just as rulers create fear to limit freedom, so corporations must generate the need for increased consumption.
Cultivating social anxiety can make warrantless wiretapping, indefinite detention, terrorist watchlists, illegal foreign prisons, preemptive perpetual war, pushbutton murder by drone, and being bathed in x-rays at every airport seem incrementally acceptable. If you pile on the impediments slowly, and each seems necessary at the time, they morph into those inevitable and accepted hassles of modern life. Such as how general anxiety generates the sale of status items, snake-oil cures, distracting entertainments, and self-help regimes — it’s the creep of supposed necessity. Just like websites becoming overrun with advertisements, click-bait, registering, tracking, profiling, and endless general noise. In return for which we get increasingly bland and controlled services. With all these small losses, the cultural whole is diminished.
by Nicholas Kerkhoff, Medium | Read more:
Image: uncredited
The Dimming of the Light
With its revolutionary heat and rational cool, French thought once dazzled the world. Where did it all go wrong?
There are many things we have come to regard as quintessentially French: Coco Chanel’s little black dress, the love of fine wines and gastronomy, the paintings of Auguste Renoir, the smell of burnt rubber in the Paris Métro. Equally distinctive is the French mode and style of thinking, which the Irish political philosopher Edmund Burke described in 1790 as ‘the conquering empire of light and reason’. He meant this as a criticism of the French Revolution, but this expression would undoubtedly have been worn as a badge of honour by most French thinkers from the Enlightenment onwards.
Indeed, the notion that rationality is the defining quality of humankind was first celebrated by the 17th-century thinker René Descartes, the father of modern French philosophy. His skeptical method of reasoning led him to conclude that the only certainty was the existence of his own mind: hence his ‘cogito ergo sum’ (‘I think, therefore I am’). This French rationalism was also expressed in a fondness for abstract notions and a preference for deductive reasoning, which starts with a general claim or thesis and eventually works its way towards a specific conclusion – thus the consistent French penchant for grand theories. As the essayist Emile Montégut put it in 1858: ‘There is no people among whom abstract ideas have played such a great role, and whose history is rife with such formidable philosophical tendencies.’
The French way of thinking is a matter of substance, but also style. This is most notably reflected in the emphasis on rhetorical elegance and analytical lucidity, often claimed to stem from the very properties of the French language: ‘What is not clear,’ affirmed the writer Antoine de Rivarol in 1784, somewhat ambitiously, ‘is not French.’ Typically French, too, is a questioning and adversarial tendency, also arising from Descartes’ skeptical method. The historian Jules Michelet summed up this French trait in 1846 in the following way: ‘We gossip, we quarrel, we expend our energy in words; we use strong language, and fly into great rages over the smallest of subjects.’ A British Army manual issued before the Normandy landings in 1944 sounded this warning about the cultural habits of the natives: ‘By and large, Frenchmen enjoy intellectual argument more than we do. You will often think that two Frenchmen are having a violent quarrel when they are simply arguing about some abstract point.’
Yet even this disputatiousness comes in a very tidy form: the habit of dividing issues into two. It is not fortuitous that the division of political space between Left and Right is a French invention, nor that the distinction between presence and absence lies at the heart of Jacques Derrida’s philosophy of deconstruction. French public debate has been framed around enduring oppositions such as good and evil, opening and closure, unity and diversity, civilisation and barbarity, progress and decadence, and secularism and religion.
Underlying this passion for ideas is a belief in the singularity of France’s mission. This is a feature of all exceptionalist nations, but it is rendered here in a particular trope: that France has a duty to think not just for herself, but for the whole world. In the lofty words of the author Jean d’Ormesson, writing in the magazine Le Point in 2011: ‘There is at the heart of Frenchness something which transcends it. France is not only a matter of contradiction and diversity. She also constantly looks over her shoulder, towards others, and towards the world which surrounds her. More than any nation, France is haunted by a yearning towards universality.’
This specification of a distinct French way of thinking is not rooted in a claim about Gallic ‘national character’. These ideas are not a genetic inheritance, but rather the product of specific social and political factors. The Enlightenment, for example, was a cultural phenomenon which spread rationalist ideas across Europe and the Americas. But in France, from the mid-18th century, this intellectual movement produced a particular type of philosophical radicalism, which was articulated by a remarkable group of thinkers, the philosophes. Thanks to the influence of the likes of Voltaire, Diderot and Rousseau, the French version of rationalism took on a particularly anti-clerical, egalitarian and transformative quality. These subversive precepts also circulated through another French cultural innovation, the salon: this private cultural gathering flourished in high society, contributing to the dissemination of philosophical and artistic ideas among French elites, and the empowerment of women.
This intellectual effervescence challenged the established order of the ancien régime during the second half of the 18th century. It also gave a particularly radical edge to the French Revolution, compared, notably, with its American counterpart. Thus, 1789 was not only a landmark in French thought, but the culmination of the Enlightenment’s philosophical radicalism: it gave rise to a new republican political culture, and enduringly associated the very idea of Frenchness with novelty and resistance to oppression. It also crystallised an entirely original way of thinking about the public sphere, centred around general principles such as the ‘Declaration of the Rights of Man’, the civic conception of the nation (resting on shared values as opposed to blood ties), the ideals of liberty, equality and fraternity, and the notions of the general interest and popular sovereignty.
One might object that, despite this common and lasting revolutionary heritage, the French have remained too diverse and individualistic to be characterised in terms of a general mind-set. Yet there are two decisive reasons why it is possible – and indeed necessary – to speak of a collective French way of thinking. Firstly, since the Enlightenment, France has granted a privileged role to thinkers, recognising them as moral and spiritual guides to society – a phenomenon reflected in the very notion of the ‘intellectual’, which is a late-19th-century (French) invention. Public intellectuals exist elsewhere, of course, but in France they enjoy an unparalleled degree of visibility and social legitimacy.
Secondly, to an extent that is also unique in modern Western culture, France’s major cultural bodies – from the State to the great institutions of secondary and higher education, the major academies, the principal publishing houses, and the leading press organs – are all concentrated in Paris. This cultural centralisation extends to the school curriculum (all high-school students have to study philosophy up to the baccalauréat), and this explains how and why French ways of thought have exhibited such a striking degree of stylistic consistency.
by Sudhir Hazareesingh, Aeon | Read more:
Image: Jean-Paul Sartre and Simone de Beauvoir having lunch at the "La Coupole" Brasserie, December 1973. Photo by Guy Le Querrec/Magnum
Ryan Adams
[ed. Ryan Adams reimagines Taylor Swift's 1989.]
People like you always want back the love they gave away
And people like me wanna believe you when you say you've changed
The more I think about it now
The less I know
All I know is that you drove us off the road
Stay
Hey, all you had to do was stay
Had me in the palm of your hand
Man, why'd you have to go and lock me out when I let you in
Stay, hey, now you say you want it
Back, now that it's just too late
Well could've been easy, all you had to do was stay
All you had to do was stay
Let's Sell Some Shit To These Millennials
Welcome, everyone, and thanks for choosing Market 2 Millennials Co. I’m thrilled to be working with you all from Hardwick Sandwich Bags. In front of each seat at this conference table is a cheese sandwich, in a bag, with a name written on it. Please, find yours. And as you do, I just have to say, and I’m speaking from the heart here: You folks make great bags. It was a joy bagging those sandwiches for you. So baggable.
But you have a problem. We conducted a survey for you, and, listen, there’s no easy way to say this, so I’m just going to be upfront: While millennials are Snapchatting, they are ninety-seven percent less likely to be bagging lunch. Less than one tenth of one percent of all tweets are about sandwich bags. Let that sink in. Nobody is tweeting about bags.
This won’t turn itself around. Millennials weren’t raised the way we were, with a passion for sandwich bags. To instill it in them, I’ve developed a customized, four-step plan that I call B.A.G.S. Let’s go through it.
Step “B”: Build The Base! First, we need millennials to understand sandwich bags. We need to make them relevant. The kids, they don’t eat sandwiches. They’ve never seen one. But their iPhone? That fits inside a bag. And the bag is clear. Do you see where I’m going with this? If you were all 24 years old, you would. Kids can take a selfie… with the phone… inside the bag! So fun, right? So we’ll get them started, and have them call it a baggie. They’ll hashtag it: #baggie. All the love millennials have for selfies will be transposed onto bags, and then it’s time for…
Step “A”: Activate! It’s off to the races. Millennials will do literally anything you ask, so long as it involves a catchy hashtag. We’ll tweet out #MyBagBrag, inviting millennials to show off their brand-new bags. We’ve got #FlyTheBagFlag, where a kid ties the corner of a bag to his finger, waves it around and Vines it. And you’ve got to go edgy, of course, so we’ll do #ShagBag, where the kids are encouraged to have sex with a bag. Don’t worry about logistics. They’ll figure it out.
Making you hungry yet? Feel free to open those bags in front of you. God, I love that sound of crinkly plastic.
Step “G”: Galvanize! Once we’ve shown millennials what a bag can do, they’ll need a bag of their own. This generation loathes anything from before 1997. Just look at the briefcase industry—absolutely murdered by millennials. And that’s why it’s critical, right now, that you launch a new line of sandwich bags very explicitly for young people. They need to feel involved. Catered to.
Here’s the rollout: First, we hire YouTube stars to walk into bars in New York and L.A., order beer, and pour it all directly into a bag. But not just any bag: a gold-tinted bag that says SWAG BAG in big, bold letters. All the millennials in the bar will crowd around and Instagram it. Everyone will want to know: What’s this bag, and where can I get it? Great buzz there. And then we announce the product with a big, splashy, sponsored content post on BuzzFeed called 14 Things You Can Put In Bags. By week’s end, millennials will be lining up overnight outside of supermarkets like they’re buying the Apple Watch.

by Jason Feifer, The Awl | Read more:
Image: Prismpak
Amazon Cuts Price of Prime Subscription For One Day Only

The retail giant announced on Tuesday that it's trimming the cost of Amazon Prime to $67 for one year from the usual $99. The one-day price cut will start Friday, September 25 at 12:00 a.m. ET and end at 11:59 p.m. PT. The deal is good only for new subscribers (sorry, all you existing Prime members) and will be available at the Amazon Prime sign-up page. (...)
Normally $99 a year, a Prime membership offers subscribers Prime instant video, free two-day shipping, unlimited music streaming with Prime Music, unlimited photo storage with Prime Photos and a Kindle Owners' Lending Library with more than 800,000 e-books.
by Lance Whitney, CNET | Read more:
Image: via:
Monday, September 21, 2015
Why a VR Game About Flirting is as Scary as a Horror Game
On the surface, the two PlayStation VR games on display at Sony’s Tokyo Game Show booth couldn’t be any more different. One is a horror scenario that drops you into a gruesome, terrifying predicament. The other puts you on a beautiful seaside next to an attractive young lady.
But after playing both, I couldn’t shake the feeling that I’d played the same demo twice. The sheer intimacy of these experiences—the feeling of having one’s personal space invaded, if you will—was unmistakable, and I can still feel it now, at a distance.
A long line stretches to the back of Sony’s Tokyo Game Show stand, people patiently awaiting their turn on the 20 or so demo stations of PlayStation VR (née Project Morpheus), Sony’s upcoming virtual reality headset for PlayStation 4. There were many different demos, but those that left the deepest impression on me were Capcom’s horror demo Kitchen and Bandai Namco’s romance sim Summer Lesson. (...)
My reaction to fake scary things, I have discovered by going to a few of those crazy haunted houses, is to laugh nervously. I was cracking up during Kitchen because Kitchen was some no-holds-barred scary stuff. At one point, I am ashamed to say, some part of the VR rig slipped down and tapped me on the shoulder, making me gasp quite loudly and reach around ready to kill whatever it was.
Then the ghoul left the room. And nothing happened for a while.
This was the scariest thing of all, because I knew something was going to jump out at me. But where? When? In a PlayStation VR demo, when time is of the essence because staff need to move people through as quickly as possible, Sony let this go on for a while. Nothing was happening—nothing—and I was as riveted as I’ve ever been in a game. When I died, following another very up close and personal encounter from which I could not look away, it came as something of a relief.
I didn’t expect to feel similarly uncomfortable during Summer Lesson, the pitch for which is that you are a teacher providing a private tutoring session to an attractive female student.
Things started off fairly benignly, with the student appearing and sitting down next to you, reading from a textbook. You could choose to teach English to a Japanese student in her bedroom, or Japanese to an American student at a beach house. At one point, and this happens in either scenario, the student leans in very close and asks, “Sensei, how do you read this word?” Then she places the book in front of your face, leaning into you in what can only be described as an extremely intimate manner.

by Chris Kohler, Wired | Read more:
Image: Bandai Namco
Perfect Genetic Knowledge
Genomics is about to transform the world.
In case you weren’t paying attention, a lot has been happening in the science of genomics over the past few years. It is, for example, now possible to read one human genome and correct all known errors. Perhaps this sounds terrifying, but genomic science has a track record of making science fiction reality. ‘Everything that’s alive we want to rewrite,’ boasted Austen Heinz, the CEO of Cambrian Genomics, last year.
It was only in 2010 that Craig Venter’s team in Maryland led us into the era of synthetic genomics when they created Synthia, the first living organism to have a computer for a mother. A simple bacterium, she has a genome just over half a million letters of DNA long, but the potential for scaling up is vast; synthetic yeast and worm projects are underway. (...)
Commensurate with their power to change biology as we know it, the new technologies are driving renewed ethical debates. Uneasiness is being expressed, not only among the general public, but also in high-profile articles and interviews by scientists. When China announced it was modifying human embryos this April, the term ‘CRISPR/Cas9’ trended on the social media site Twitter. CRISPR/Cas9, by the way, is a protein-RNA combo that defends bacteria against marauding viruses. Properly adapted, it allows scientists to edit strings of DNA inside living cells with astonishing precision. It has, for example, been used to show that HIV can be ‘snipped’ out of the human genome, and that female mosquitoes can be turned male to stop the spread of malaria (only females bite).
But one of CRISPR’s co-developers, Jennifer Doudna of the University of California, Berkeley, has ‘strongly discouraged’ any attempts to edit the human genome pending a review of the ethical issues. Well, thanks to China, that ship has sailed. Indeed, the technology now appears to be finding its way into the hands of hobbyists: Nature recently reported that members of the ‘biohacker’ subculture have been messing around with CRISPR, though the enthusiast they interviewed didn’t appear to have a clear idea of what he wanted to do with it.
Given that our genetic abilities appear to be reaching a critical threshold, it is worth taking a fairly hard-headed look at what the next few years promise. For instance, could DNA solve some of our pressing energy issues? One project hopes to engineer trees that glow in the dark. You can sign up to preorder one now – at least the weed version of it; trees take too long to mature to be good prototypes. Perhaps the day is not far off when our streets are lined with bioluminescent foliage. This would presumably drive electric streetlamps into obsolescence, like so many other energy-hungry ‘old-fashioned’ technologies.
But this is hardly the only potentially revolutionary project that aims to play out in the next five to 10 years. Venter is working on re-engineering pig lungs so that they can be used in human transplants. This could have a much larger impact than is immediately obvious: about one in 10 deaths in Europe is caused by lung disease. Farther afield, Venter is in the race to find life on Mars with DNA sequencers, and is developing methods of ‘biological teleportation’ – the idea is that you sequence microbial DNA on Mars and then reconstruct the genomes on Earth using 3D printing. The process could work the other way around, too. Venter and Elon Musk are talking of using this technology to terraform Mars with 3D-printed earthly microbes. The whole thing boggles the imagination, of course, but Venter and Musk do have form for pulling off amazing feats. Nevertheless, perhaps we should start our tour of the horizon closer to home.
By 2020, many hospitals will have genomic medicine departments, designing medical therapies based on your personal genetic constitution. Gene sequencers – machines that can take a blood sample and reel off your entire genetic blueprint – will shrink below the size of USB drives. Supermarkets will have shelves of home DNA tests, perhaps nestled between the cosmetics and medicines, for everything from whether your baby will be good at sports to the breed of cat you just adopted, to whether your kitchen counter harbours enough ‘good bacteria’. We will all know someone who has had their genome probed for medical reasons, perhaps even ourselves. Personal DNA stories – including the quality of the bugs in your gut – will be the stuff of cocktail party chitchat.
By 2025, projections suggest that we will have sequenced the genomes of billions of individuals. This is largely down to the explosive growth in the field of cancer genomics. Steve Jobs, the co-founder of Apple, became one of the early adopters of genomic medicine when he had the cancer that killed him sequenced. Many others will follow. And we will become more and more willing to act on what our genes tell us.
by Dawn Field, Aeon | Read more:
Image: Brett Baker/UTMSI/Cameron Thrash (LSU)/Olivia Mason (FSU)
Labels:
Critical Thought,
Environment,
Science,
Technology
Sunday, September 20, 2015