Saturday, September 13, 2014
Amazon, Publishers, and Readers
This skirmish will end, though, and when it does, we’ll be left with the larger questions of what the landscape of writing and reading will look like in the English-speaking world. On those questions, we should be backing Amazon, not because different principles are at stake, but because the same principle — Whose actions will benefit the reader? — leads to different conclusions. Many of the people rightly enraged at Amazon’s mistreatment of Hachette’s authors don’t understand how their complaint implicates the traditional model of publishing and selling as well.
Some of the strongest criticism of Amazon comes from authors most closely aligned with the prestigious parts of the old system, many of those complaints appearing as reviews of “The Everything Store”, Brad Stone’s recent book on Amazon and Jeff Bezos. Steve Coll, Dean of the Columbia Journalism School, wrote one such, “Citizen Bezos,” in The New York Review of Books:
At least two qualities distinguished Bezos from other pioneers of e-commerce and help to explain his subsequent success. The first was his gargantuan vision. He did not see himself merely chipping away at Barnes & Noble’s share of retail book sales; he saw himself developing one of the greatest retailers in history, on the scale of Sears Roebuck or Walmart. Secondly, Bezos focused relentlessly on customer service — low prices, ease of use on his website, boundless inventory, and reliable shipping. To this day, Amazon is remarkably successful at pleasing customers.
Coll does not intend any of this as a compliment.
He writes about book-making and selling as if there are only two possible modes: Either the current elites remain firmly in charge, or else Amazon will become a soul-crushing monopoly. The après nous, le déluge!-ness of this should be enough to convince anyone that the publishers are bullshitting, but if your worry is market manipulation, the publishing cartel we have today has already created decidedly non-hypothetical harms.
Back in 2007, when publishers began selling large numbers of books in digital format, they used digital rights management (DRM) to lock their books to a particular piece of hardware, Amazon’s new Kindle. DRM is designed to transfer pricing power from content owners to hardware vendors. The publishers clearly assumed they could hand Amazon consolidated control without ever having to conspire with one another, and that Amazon would reward them by passing cost-savings back as inflated profits. When Amazon instead decided to side with the customer, passing the savings on as reduced price, they panicked, and started looking around for an alternative conspirator.
Starting in 2009, five of the six biggest publishers colluded with Apple to re-inflate ebook prices. The model they worked out netted them less revenue per digital sale, because of Apple’s cut, but ebooks were not their immediate worry. They wanted (and want) to protect first editions; as long as ebook prices remained high, hardback sales could be protected. No one had any trouble seeing the big record companies as unscrupulous rentiers when they tried to keep prices for digital downloads as high as they had been for CDs; the book industry went further, violating anti-trust law as they attempted to protect their more profitable product.
Faced with evidence of their connivance, the publishers all settled with the Department of Justice. (Apple argued they’d done nothing wrong, took the case to court, and lost.) For all the worries about a future where Justice has to investigate Amazon, nothing that company has done comes close to conspiring against their customers. Coll concedes that these publishers did, in fact, break the law, but excuses them on the grounds that had they not colluded, they might make less money. (...)
As has been widely noted, the last time the industry panicked about increased access to reading material was with the original spread of paperbacks, an invention that occasioned similar hand-wringing about the economics and prestige of publishing. “Successful authors are not interested in original publishing at 25 cents,” said one publisher at the time, a sentiment as vain as it was wrong. Whole genres were born after the spread of low-cost publishing, a happy colloquium of new writers and new readers previously thwarted by high prices.
Although Hachette’s CEO recently claimed “The invention of mass-market paperbacks was great for all”, the real story is one of co-optation. When paperback publishers were independent, prices fell for the first two decades of the new format. Agitated publishers worried that the new format “could undermine the whole structure of publishing.” They finally figured out how to restore that structure in the early 1960s, through industry-wide consolidation. Over the next two decades, hardback publishers bought up the competition and increased paperback prices by almost 300%, while delaying their publication for a year or more.
All this had the effect of degrading paperbacks as a substitute for hardbacks. The industry’s idea of co-existence looks like a reduction in competition rather than a response to it. The same dynamics are playing out today. The big publishers complain about the Kindle, but they could create a competitive market for ebook readers tomorrow morning, by simply publishing without DRM (as Tor, O’Reilly, Baen, and other publishers currently do). This would make digital distribution more attractive, though, which is the last thing they want. (...)
Similarly, the idea that only the Big Five will fund speculative work for small audiences doesn’t jibe with the growth of niche publishing enabled by lower publishing costs. (A quarter-million titles have appeared on the Kindle in the last 90 days.) Nothing here is magic. Books are just large chunks of writing. Digital publishing creates many new ways for delivering those chunks from writer to reader. Only some of those new ways require the services of people who work in lower Manhattan.
by Clay Shirky, Medium | Read more:
Image: uncredited
Social Networks Are Like The Eye
One of the oft-repeated phrases on Edge is "New Technologies=New Perceptions". As we create tools, we recreate ourselves. In the digital information age, we have moved from thinking about silicon, transistors, and microprocessors to the edge of redefining, and even creating, life itself. As we have seen in recent editions of Edge — "Life: What A Concept!" (Freeman Dyson, Craig Venter, George Church, Robert Shapiro, Dimitar Sasselov, Seth Lloyd) at Eastover Farm in August; "Life: A Gene-Centric View" (Richard Dawkins and Craig Venter) in Munich in January; and "Engineering Biology" (Drew Endy) in our most recent edition — we are redefining who and what we are.
Such scientific explorations are not limited to biology. Recently, Harvard professor and sociologist Nicholas Christakis has shown that there's more to think about regarding social networks such as Facebook, MySpace, Flickr, and Twitter than considerations of advertising and revenue models. According to The New York Times ("On Facebook, Scholars Link Up With Data," by Stephanie Rosenbloom, 12.17.07):
Each day about 1,700 juniors at an East Coast college log on to Facebook.com to accumulate "friends," compare movie preferences, share videos and exchange cybercocktails and kisses. Unwittingly, these students have become the subjects of academic research. To study how personal tastes, habits and values affect the formation of social relationships (and how social relationships affect tastes, habits and values), a team of researchers from Harvard and the University of California, Los Angeles, are monitoring the Facebook profiles of an entire class of students at one college, which they declined to name because it could compromise the integrity of their research.
Christakis notes that he is "interested not in biological contagion, but in social contagion. One possible mechanism is that I observe you and you begin to display certain behaviors that I then copy. For example, you might start running and then I might start running. Or you might invite me to go running with you. Or you might start eating certain fatty foods and I might start copying that behavior and eat fatty foods. Or you might take me with you to restaurants where I might eat fatty foods. What spreads from person to person is a behavior, and it is the behavior that we both might exhibit that then contributes to our changes in body size. So, the spread of behaviors from person to person might cause or underlie the spread of obesity." (...)
Christakis, along with his colleague James Fowler, "have started with several projects that seek to understand the processes of contagion, and we have also begun a body of work looking at the processes of network formation — how structure starts and why it changes. We have made some empirical discoveries about the nature of contagion within networks. And also, in the latter case, with respect to how networks arise, we imagine that the formation of networks obeys certain fundamental biological, genetic, physiological, sociological, and technological rules."
"So we have been investigating both what causes networks to form and how networks operate."
Image: Nicholas A. Christakis
Friday, September 12, 2014
The World’s Slowest Motorcycle Racing Is Also the Craziest
Motorcycles are dangerous, even when the rider is skilled and the bike is outfitted with modern safety features. So what happens when you ditch the paved roads for natural terrain and, instead of simply avoiding the boulder in front of you, you decide to ride up and over it? You have trials motorcycle riding, either the pinnacle of two-wheeled badassery or the dangerous product of gearheads with more ambition than brains.
The idea is simple: Strip a motorcycle of every part possible until it’s basically a mountain bike with a small motor, and take it up a massively treacherous hill. Speed isn’t the goal here; the way to win is to keep your feet off the ground and make it to the top. Since all riders have the same amount of power at their disposal, the game is about the exquisite use of throttle, braking, and clutch, along with weight shifting. A good run requires a near-perfect performance from the rider. (...)
A trials motorcycling course is the antithesis of a high-speed circuit. Sanctioned runs typically take place on natural terrain cluttered with logs, streams, and rock walls, with no pavement in sight. In North American competitions, riders follow a set course under the scrutiny of a judge (the sport is also called “Observed Trials”). The goal is to stay on the bike at all times: riders pick up a point each time their feet touch the ground. Among those who finish within the time allowance, and without crashing, the rider with the fewest points wins.
The bikes don’t need big engines, so they run on spartan single-cylinders with small displacements, typically between 125- and 250-cc, occasionally as low as 50-cc. They do, however, need to be as light as possible. They’re stripped of anything that would make them even close to street-legal or civilized, all in the name of responsiveness. Cruise control? Nope. Aerodynamic fairings? None. A seat? Please. All told, they rarely break the 200-pound mark, nothing compared to a 452-pound Ducati Diavel, or even a street-legal 320-pound Honda CRF250L dual sport.
Riding a motorcycle slower than you walk is damn difficult, and it’s way tougher than going fast. Like on a bicycle, speed provides stability. At 5 mph, a motorcycle is liable to simply fall over, and knock its rider out of competition. Turning, for example, requires counter-balancing: Against your natural understanding of physics, you push your weight away from the turn, so the bike leans while you stay upright. “It can be frustrating if you’re not ready for it,” LaPlante says. “Your body is such a big portion of the overall weight.”
by Alexander George, Wired | Read more:
Image: Javier Santos Romero
How I Rebuilt Tinder And Discovered The Shameful Secret Of Attraction
Suppose you’re a straight woman thumbing through Tinder while waiting for the train, avoiding your homework, or bored at work. A picture of a deeply bronzed man pops up in your stream. How do you swipe? More interestingly, if someone asked you to explain why, how would you answer?
Say that it’s this guy:
His location is exotic. He’s doing something that requires a wetsuit. Chances are, he needed a good amount of money to do what he’s doing in the place he’s doing it. But the dark tan, large tattoo, long hair, and name like “Kip” indicate a lifestyle that is probably not that of an investment banker. You can’t really see his face, but surprisingly that doesn’t really matter, because the overwhelming reason hundreds of men and women gave for swiping “no” in a full-fledged Tinder simulation I unleashed on the internet had nothing to do with attractiveness. Instead, it had everything to do with the type of person Kip seemed to be:
“He probably calls himself a ‘humanist’ instead of a feminist and tries to impress people with how much he ‘made friends with the natives’ when he travels. Barf.” —straight/white
“I love the tattoo, but he seems too skeezy in a way I can’t put my finger on. Scuba is pretentious? Longer greasy hair?” —bi/Hapa/Japanese
“close call, but i hate his sunglasses and also i am imputing all sorts of things about him. like he probably says namaste to the barista at the coffee shop and has a profile picture of him with a bunch of african children” —bi/white
“Lol he’s too old and it looks like the sea is his mistress already I can’t compete with that.” —straight/white
It’s possible these respondents are “overthinking” their response to what, on the surface, is a very straightforward question: Am I attracted to this person or not? Indeed, some would argue that there’s no reason to even explain: You can’t argue with your genitals.
But maybe what we call the argument of one’s genitals is, in truth, incredibly — and both consciously and subconsciously — influenced by the cultures in which we grow up as well as our distinct (and equally culturally influenced) ideas of what a “good couple” or “good relationship” would look like. Put differently, we swipe because someone’s “hot,” but we find someone “hot” based on unconscious codes of class, race, education level, religion, and corresponding interests embedded within the photos of their profile.
Essentially, we’re constantly inventing narratives about the people who surround us — where he works, what he loves, whether our family would like him. And more than other dating services, which offer up comprehensive match dossiers, Tinder appears to encourage these narratives and crystallize the extrapolation process and package it into a five-second, low-stakes decision. We swipe, in other words, because of semiotics.
“Semiotics” is, quite simply, the study of signs. The field of semiotics tries to figure out how we come up with symbols — even as simple as the word in front of you — that stand in for a larger concept. Why does the word “lake” mean that massive blue watery thing? Or how does the stop sign, even without the word “stop,” make everyone understand not to go forward?
But signs aren’t always static in their meaning — it’s all about context. (...)
I first noticed this “crystallizing” tendency in Tinder when a friend, let’s call her Katie, started playing it for fun, three beers in, at a bar. She was thumbing through prospective matches’ profiles (usually comprising six Facebook pictures, authenticated Facebook age, and a brief bio line) for the table, yelling out her immediate reaction: too old, too manscaped, too short, too bald, too Jersey, HOT, too douchey, too finance-bro, too “ew,” too hipster, too boring, too CrossFit, TOTALLY HOT. (...)
Katie’s verdicts were often based on obvious, glaring “facts” of the profile: A 5-foot-7 male was “too short.” A 39-year-old guy was decidedly “too old” for Katie’s 33 years. Another was bald; she deemed him “too” much so. But other swipes relied upon a more vague, albeit immediate, calculus. To be “too douchey” is to have a bad goatee, a shiny shirt, an unfortunate facial expression, or a certain type of sunglasses. “Too ew” could be any blend of traits that, to white, straight, middle-class Katie, read as repugnant.
But some judgments are too secret — and shameful — to say out loud, or even admit to ourselves. Katie never said “too not-white,” “too poor,” or “too uneducated.” We cloak those judgments in language that generally circles the issue: “Nothing in common,” “he wouldn’t like me,” “I can’t see us together.” Those statements aren’t necessarily lies, but they’re also not always full truths either — and often rely on overarching assumptions about what differences in race, class, education, and religion dictate not only in a relationship, but any interaction, romantic or otherwise.
After watching Katie and tinkering around on the app myself in a game-like fashion, I wanted to see if, relying on anonymity, I could get at the heart of the subconscious snap judgments behind each swipe. Why do we swipe the way we swipe? And are those assumptions “just human,” or indicative of larger, enduring, and possibly destructive cultural divides?
by Anne Helen Petersen, BuzzFeed | Read more:
Image: Thinkstock/BuzzFeed
The Digital Wallet Revolution
This week Apple announced two new pieces of hardware, the iPhone 6 and a “smartwatch.” But as flashy as they are, neither item is as groundbreaking as a piece of software that will accompany them: a digital wallet, allowing users to eschew cash and credit cards for a quick swipe of their device at the register.
Apple’s digital wallet, if widely adopted, could usher in a new era of ease and convenience. But the really exciting part is the fast-emerging future that it points toward, in which virtual assets of all sorts — traditional currencies, but also Bitcoin, airline miles, cellphone minutes — are interchangeable, opening up enormous purchasing power for consumers and creating tough challenges for governments around the world. (...)
We don’t typically think of these as currency, because virtual money has traditionally been locked down, in the sense that its use was strictly limited: If you earned points from Amazon, only you could use them, and you could exchange them for dollars only within the Amazon marketplace. Meanwhile, up to now, the only currencies you could use everywhere in an economy were state-issued currencies, like the dollar.
But that distinction is eroding: After all, the value of a currency lies in what you can buy with it, not in the fact that a government says it’s worth something. So if I want to buy a widget, and the only thing I can use to buy it is Widgetcash, then I am willing to trade dollars or euros or anything else for Widgetcash. When I buy something with Widgetcash, it doesn’t go through any bank.
That’s why a digital wallet, loaded with your dollars, credit and loyalty points, is such a revolutionary technology — it makes those transfers and transactions seamless and safe. (...)
The revolution is what comes next: an exchange that connects and trades these different stores of value to find the most cost-efficient one to use, both within your wallet and between wallet users, worldwide. Let’s say you want to buy an audiobook from Best Buy. It costs $16, or 1,000 My Best Buy points, or M.B.B.P.s. Your wallet contains several hundred dollars and 200 Best Buy points. The wallet software automatically determines that, at the current exchange rate between M.B.B.P.s and dollars, it is better to buy using the points.
But then let’s say you only have 50 M.B.B.P.s. The wallet system searches its clients and finds someone — call her Hannah — with enough M.B.B.P.s for the transaction. It buys the audiobook with her points and sends it to you, and sends Hannah dollars from your account.
Following Bitcoin’s protocol, the wallet software broadcasts these transactions to the network, and every wallet in the world updates the M.B.B.P.-to-dollar exchange rate.
The idea is that you can buy anything, with anything. The wallet will find the best deal and execute it. In so doing, it will ignore the historical and cultural differences between dollars, points, coins and virtual property. It’s all bits anyway.
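[ed. To make the mechanics concrete, here's a minimal Python sketch of the "pay with whichever asset is cheapest" decision the authors describe. Everything in it (the MBBP-to-dollar rate, the function and field names, the broker step) is an illustrative assumption of mine, not a real wallet API.]

```python
# Hypothetical sketch of the wallet's core decision: given an item priced
# in several assets, pay with whichever one is cheapest in dollar terms.
# All names, rates, and the broker step are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Quote:
    asset: str     # "USD", "MBBP" (My Best Buy points), etc.
    amount: float  # price of the item denominated in that asset

# Network-wide exchange rates to dollars, updated (per the excerpt) each
# time a wallet broadcasts a transaction. Assume points trade at a discount:
# 1 MBBP = $0.014, so the 1,000-point price is worth $14 against $16 cash.
RATES_TO_USD = {"USD": 1.0, "MBBP": 0.014}

def dollar_cost(quote: Quote) -> float:
    """Convert a quote into dollars at the current network rates."""
    return quote.amount * RATES_TO_USD[quote.asset]

def choose_payment(quotes: list[Quote], balances: dict[str, float]):
    """Pick the cheapest quote; flag whether a broker is needed.

    If the wallet's own balance can't cover the cheapest quote, it would
    search the network for a counterparty (the article's "Hannah") who
    pays in that asset and is reimbursed in dollars from this account.
    """
    best = min(quotes, key=dollar_cost)
    broker_needed = balances.get(best.asset, 0.0) < best.amount
    return best, broker_needed

# The audiobook example: $16 cash or 1,000 My Best Buy points.
quotes = [Quote("USD", 16.00), Quote("MBBP", 1_000)]

# Enough points on hand: pay directly in points ($14 equivalent beats $16).
print(choose_payment(quotes, {"USD": 300.0, "MBBP": 1_200}))
# -> (Quote(asset='MBBP', amount=1000), False)

# Only 50 points on hand: points are still the cheapest route, so the
# wallet brokers the purchase through another user's points for dollars.
print(choose_payment(quotes, {"USD": 300.0, "MBBP": 50}))
# -> (Quote(asset='MBBP', amount=1000), True)
```

The notable design choice is that "cheapest" is computed against a shared, network-wide rate table, which is exactly what the Bitcoin-style broadcast in the excerpt keeps up to date.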
by Edward Castronova and Joshua A.T. Fairfield, NY Times | Read more:
Image: Getty
Thursday, September 11, 2014
Talk Like a Physicist
- Use “canonical” when you mean “usual” or “standard.” As in, “the canonical example of talking like a physicist is to use the word ‘canonical.’”
- Use “orthogonal” to refer to things that are mutually exclusive or can’t coincide. “We keep playing phone tag — I think our schedules must be orthogonal.”
- “About” becomes “to a first-order approximation”
- Things are not difficult, they are “non-trivial”
- Large discrepancies are “orders of magnitude apart”
- Refer to coordinates and coordinate systems. “I got shafted” becomes “I took one up the z-axis”
- Any actual personal experience becomes “empirical data.” i.e. a burn on your hand is empirical data that the stove is hot.
- You’re not being lazy, you are in your "ground state"
- A semi-educated guess is an "extrapolation"
- You aren’t ignoring details, you are "taking the ideal case"
- A tiny amount is “vanishingly small” or “negligible.” Really small is “infinitesimal”
- You aren’t overweight, you are "thermodynamically efficient"
by Swans on Tea | Read more:
Image: via:
Green Tea-Black Sesame Mousse Cake
[ed. I could never make something like this, but it sure looks enticing. Recipe here.]
via:
Warning: Wild Extrapolation (A Classification System for Science News)
Science news and science writing are increasingly popular. Increasing numbers of people are getting into science, which is great. But science is a huge field, with many different disciplines and areas, all of which can go into quite painstaking detail. Obviously there’s a lot to talk about, which can prove daunting to the newly interested, so good science writing is important.
However, science and science news/reporting/writing are the work of humans, and humans are rarely 100% logical. So, to step into the world of science is to step into years/decades/centuries of disputes, controversies, unfamiliar habits, power-plays, strange politics and countless other things that manifest in science articles and could befuddle the unwary reader. What can we do about this?
One option is to adopt an approach from the world of film. Every film released to the public comes with a classification, to warn potential viewers of the type of content to expect without spoiling the actual thing itself, so the viewer can go in prepared. These classifications now come with explanations, like “contains mild peril”. Wouldn’t it be useful to adopt something similar for science articles, to give newcomers some grasp of what they’re looking at? So here’s a potential classification system for science writing. It’s a bit more complex admittedly, and unlike films, multiple classifications can be applied to a single piece. How like science, to be so uncertain.
by Dean Burnett, The Guardian | Read more:
Image: Barry Welch
Amazon vs Hachette is Nothing: Just Wait for the Audiobook Wars
In my latest Locus column, Audible, Comixology, Amazon, and Doctorow’s First Law, I unpick the technological forces at work in the fight between Amazon and Hachette, one of the "big five" publishers, whose books have not been available through Amazon's normal channels for months now, as the publisher and the bookseller go to war over the terms on which Amazon will sell books in the future.
The publishing world is, by and large, rooting for Hachette, but hasn't paid much attention to the ways in which Hachette made itself especially vulnerable to Amazon in this fight: by insisting that all its books be sold with Amazon's DRM, it has permanently locked all its customers into Amazon's ecosystem, and if Hachette tries to convince them to start buying ebooks elsewhere, it would mean asking their readers to abandon their libraries in the bargain (or maintain two separate, incompatible libraries with different apps, URLs, and even devices to read them).
Worse still: people in publishing who are alarmed about Hachette are still allowing their audiobooks to be sold by Audible, the Amazon division that controls 90% of the audiobook market and will only sell audiobooks in a format that can't be legally played with anything except Amazon-approved technology. Audible has already started putting the screws to its audiobook suppliers -- the publishers and studios that make most of the audiobooks it sells -- even as it has gone into business competing with them.
It's profoundly, heartbreakingly naive to expect that Amazon will be any less ruthless in exploiting the advantage it is being handed over audiobooks than it has been in its exploitation of ebooks.
Take Amazon’s subsidiary Audible, a great favorite among science fiction writers and fans. The company has absolute dominance over the audiobook market, accounting for as much as 90 percent of sales for major audio publishers. Audible has a no-exceptions requirement for DRM, even where publishers and authors object (my own audiobooks are not available through Audible as a result). Audible is also the sole audiobook supplier for iTunes, meaning that authors and publishers who sell audiobooks through iTunes are likewise bound to lock these to Amazon’s platform and put them in Amazon’s perpetual control.
by Cory Doctorow, Boing Boing | Read more:
Image: DRM PNG 900 2, Listentomyvoice, CC-BY-SA
Wednesday, September 10, 2014
Instant Gratification
A half-hour east of Seattle, not far from the headquarters of Microsoft, Amazon, and other icons of the digital revolution, reSTART, a rehab center for Internet addicts, reveals some of the downsides of that revolution. Most of the clients here are trying to quit online gaming, an obsession that has turned their careers, relationships, and health to shambles. For the outsider, the addiction can be incomprehensible. But listening to the patients’ stories, the appeal comes sharply into focus. In a living room overlooking the lawn, 29-year-old Brett Walker talks about his time in World of Warcraft, a popular online role-playing game in which participants become warriors in a steampunk medieval world. For four years, even as his real life collapsed, Walker enjoyed a near-perfect online existence, with unlimited power and status akin to that of a Mafia boss crossed with a rock star. “I could do whatever I wanted, go where I wanted,” Walker tells me with a mixture of pride and self-mockery. “The world was my oyster.”
Walker appreciates the irony. His endless hours as an online superhero left him physically weak, financially destitute, and so socially isolated he could barely hold a face-to-face conversation. There may also have been deeper effects. Studies suggest that heavy online gaming alters brain structures involved in decision making and self-control, much as drug and alcohol use do. Emotional development can be delayed or derailed, leaving the player with a sense of self that is incomplete, fragile, and socially disengaged—more id than superego. Or as Hilarie Cash, reSTART cofounder and an expert in online addiction, tells me, “We end up being controlled by our impulses.”
Which, for gaming addicts, means being even more susceptible to the complex charms of the online world. Gaming companies want to keep players playing as long as possible—the more you play, the more likely you’ll upgrade to the next version. To this end, game designers have created sophisticated data feedback systems that keep players on an upgrade treadmill. As Walker and his peers battle their way through their virtual worlds, the data they generate are captured and used to make subsequent game iterations even more “immersive,” which means players play more, and generate still more data, which inform even more immersive iterations, and so on. World of Warcraft releases periodic patches featuring new weapons and skills that players must have if they want to keep their godlike powers, which they always do. The result is a perpetual-motion machine, driven by companies’ hunger for revenues, but also by players’ insatiable appetite for self-aggrandizement. Until the day he quit, Walker never once declined the chance to “level up,” but instead consumed each new increment of power as soon as it was offered—even as it sapped his power in real life.
On the surface, stories of people like Brett Walker may not seem relevant to those of us who don’t spend our days waging virtual war. But these digital narratives center on a dilemma that every citizen in postindustrial society will eventually confront: how to cope with a consumer culture almost too good at giving us what we want. I don’t just mean the way smartphones and search engines and Netflix and Amazon anticipate our preferences. I mean how the entire edifice of the consumer economy, digital and actual, has reoriented itself around our own agendas, self-images, and inner fantasies. In North America and the United Kingdom, and to a lesser degree in Europe and Japan, it is now entirely normal to demand a personally customized life. We fine-tune our moods with pharmaceuticals and Spotify. We craft our meals around our allergies and ideologies. We can choose a vehicle to express our hipness or hostility. We can move to a neighborhood that matches our social values, find a news outlet that mirrors our politics, and create a social network that “likes” everything we say or post. With each transaction and upgrade, each choice and click, life moves closer to us, and the world becomes our world.
And yet … the world we’re busily refashioning in our own image has some serious problems. Certainly, our march from one level of gratification to the next has imposed huge costs—most recently in a credit binge that nearly sank the global economy. But the issue here isn’t only one of overindulgence or a wayward consumer culture. Even as the economy slowly recovers, many people still feel out of balance and unsteady. It’s as if the quest for constant, seamless self-expression has become so deeply embedded that, according to social scientists like Robert Putnam, it is undermining the essential structures of everyday life. In everything from relationships to politics to business, the emerging norms and expectations of our self-centered culture are making it steadily harder to behave in thoughtful, civic, social ways. We struggle to make lasting commitments. We’re uncomfortable with people or ideas that don’t relate directly and immediately to us. Empathy weakens, and with it, our confidence in the idea, essential to a working democracy, that we have anything in common.
Our unease isn’t new, exactly. In the 1970s, social critics such as Daniel Bell, Christopher Lasch, and Tom Wolfe warned that our growing self-absorption was starving the idealism and aspirations of the postwar era. The “logic of individualism,” argued Lasch in his 1978 polemic, The Culture of Narcissism, had transformed everyday life into a brutal social competition for affirmation that was sapping our days of meaning and joy. Yet even these pessimists had no idea how self-centered mainstream culture would become. Nor could they have imagined the degree to which the selfish reflexes of the individual would become the template for an entire society. Under the escalating drive for quick, efficient “returns,” our whole socioeconomic system is adopting an almost childlike impulsiveness, wholly obsessed with short-term gain and narrow self-interest and increasingly oblivious to long-term consequences.
by Paul Roberts, American Scholar | Read more:
Image: David Herbick
Sky Burial
[ed. If you've read Mary Roach's fascinating (and frequently humorous) book Stiff: The Curious Lives of Human Cadavers you'll have a good idea about the incredible number of things that can be done with a human body after you've donated it to medical (and forensic) science. Not for me.]
Just beyond the gates is where I meet Kate Spradley, a youthful, petite, and unfailingly polite woman of forty. She has short, mousy hair that’s often clipped in place with a barrette, and dresses in yoga-studio t-shirts that explain her slim, almost boyish figure. Kate is so utterly normal that it takes a moment to register the peculiarity of her life’s work: she spends her days handling and cataloguing human remains.
Kate, an associate professor at Texas State University in San Marcos, does most of her work at their Forensic Anthropology Center (FACTS)—the centerpiece of which is the Forensic Anthropology Research Facility (FARF), the largest of America’s five “body farms.” Including Kate, FACTS has three full-time researchers, a rotating crew of anthropology graduate students and undergraduate volunteers, and a steady influx of cadaver donations from both individuals and their next of kin—brought in from Texas hospitals, hospices, medical examiners’ offices, and funeral homes. When I arrive, Kate is helping lead a weeklong forensics workshop for undergrads, spread out across five excavation sites where skeletal remains have been buried to simulate “crime scenes.” Under a camping shelter, out of the intense sun, she stands before a carefully delineated pit that contains one such skeleton: jaws agape, rib cage slightly collapsed, leg bones bent in a half-plié. In the time since it was hidden here, a small animal has built a nest in the hollow of its pelvis.
Over a year ago, back when he was “fully fleshed” (as they say), this donor was placed out in the field under a two-foot-high cage and exposed to the elements, his steady decomposition religiously photographed and recorded for science. Across the property are dozens of cadavers in various stages of rot and mummification, each with its purpose, each with its expanding file of data: the inevitable changes to the body that the rest of us willfully ignore are here obsessively documented. For the past six years, FACTS has been collecting data on human “decomp” while steadily amassing a contemporary skeletal collection (about 150 individuals now) to update our understanding of human anatomy. More specifically, for the forensic sciences, FACTS works to improve methods of determining time since death, as well as the environmental impact on a corpse—particularly in the harsh Texan climate. Texas Rangers consult with them, and law enforcement officers from around the state come to train here each summer, much like this collection of nineteen- and twenty-year-olds.
While her students continue brushing dirt from bone, Kate offers to take me on a walking tour of the cages. Or, as she gently puts it: “I’ll show you some things.”
As we wander down the grassy path in the late spring heat, the first thing I encounter is the smell. “Is that nature or human?” I ask.
“Oh, I can’t smell anything right now—sometimes it depends on what direction the wind is blowing. But probably human.”
The smell of rotting human corpses is unique and uniquely efficient. You need never have experienced the scent before, but the moment you do, you recognize it: the stench of something gone horribly wrong. It reeks of rotten milk and wet leather. (...)
The odor is strong as I walk among the cages, the air redolent with the heavy, sour-wet scent of these bodies letting go of their bile, staining the grasses all around them. I look at the sprawl, each individual in its strange shelter, shriveled and shocked-looking; each with more or less of its flesh and insides; each, in its post-person state, given a new name: a number. They died quietly, in an old-age home; they died painfully, of cancer; they died suddenly, in some violent accident; they died deliberately, a suicide. In spite of how little they had in common in life, they now lie exposed alongside one another, their very own enzymes propelling them toward the same final state. Here, in death, unintentionally, they have formed a community of equals.
by Alex Mar, Oxford American | Read more:
Image: "Passing Through—60 minutes in Foster City, California," by Ajay MalghanApple Hasn’t Solved the Smart Watch Dilemma
[ed. Ugh. I'm with Felix. See also: Apple's Watch is Like a High-Tech Mood Ring]
Does anybody remember OS X 10.0? It was a disaster, and even people who installed it spent 90% of their time in OS 9 instead. The very first MacBook Air? An underpowered exercise in frustration. The original iPad? Heavy and clunky. The original iPod? Was not only heavy and clunky and expensive, it was also tied to the Macintosh, and didn’t work either alone or with a PC.
The best-case scenario for the Apple Watch is that the product we saw announced today will eventually iterate into something really great. Because anybody who’s ever worn a watch will tell you: this thing has serious problems.
For one thing, Apple has been worryingly silent on the subject of battery life, but there’s no indication that this thing will last even 24 hours. A watch’s battery should last for months; even watches which don’t have batteries will last for a couple of days, if you have to wind them manually, or indefinitely, if they’re automatic and all you have to do is wear them.
Watches might be complicated on the inside, but they’re simple on the outside, and they should never come with a charging cable. (To make matters worse, even though the Apple Watch only works if you have an iPhone, the iPhone charging cable will not charge the Apple Watch; you need a different charging cable entirely.) (...)
Behind all the shiny options (sport! gold! different straps!) the watch itself is always pretty much the same: thick, clunky, a computer strapped to your wrist. Which is great, I suppose, if you’re the kind of person who likes to strap a computer to your wrist.
by Felix Salmon, Medium | Read more:
Image: uncredited