Friday, September 12, 2014

The World’s Slowest Motorcycle Racing Is Also the Craziest

Motorcycles are dangerous, even when the rider is skilled and the bike is outfitted with modern safety features. So what happens when you ditch the paved roads for natural terrain and, instead of simply avoiding the boulder in front of you, you decide to ride up and over it? You have trials motorcycle riding, either the pinnacle of two-wheeled badassery or the dangerous product of gearheads with more ambition than brains.

The idea is simple: Strip a motorcycle of every part possible until it’s basically a mountain bike with a small motor, and take it up a massively treacherous hill. Speed isn’t the goal here; the way to win is to keep your feet off the ground and make it to the top. Since all riders have the same amount of power at their disposal, the game is about the exquisite use of throttle, braking, and clutch, along with weight shifting. A good run requires a near perfect performance from the rider. (...)

A trials motorcycling course is the antithesis of a high-speed circuit. Sanctioned runs typically take place on natural terrain cluttered with logs, streams, and rock walls, with no pavement in sight. In North American competitions, riders follow a set course under the scrutiny of a judge (the sport is also called “Observed Trials”). The goal is to stay on the bike at all times; riders pick up a point each time their feet touch the ground. Among those who finish within the time allowance, and without crashing, the rider with the fewest points wins.

The bikes don’t need big engines, so they run on spartan single-cylinders with small displacements, typically between 125- and 250-cc, occasionally as low as 50-cc. They do, however, need to be as light as possible. They’re stripped of anything that would make them even close to street-legal or civilized, all in the name of responsiveness. Cruise control? Nope. Aerodynamic fairings? None. A seat? Please. All told, they rarely break the 200-pound mark, nothing compared to a 452-pound Ducati Diavel, or even a street-legal 320-pound Honda CRF250L dual sport.

Riding a motorcycle slower than you walk is damn difficult, and it’s way tougher than going fast. Like on a bicycle, speed provides stability. At 5 mph, a motorcycle is liable to simply fall over, and knock its rider out of competition. Turning, for example, requires counter-balancing: Against your natural understanding of physics, you push your weight away from the turn, so the bike leans while you stay upright. “It can be frustrating if you’re not ready for it,” LaPlante says. “Your body is such a big portion of the overall weight.”

by Alexander George, Wired |  Read more:
Image: Javier Santos Romero

How I Rebuilt Tinder And Discovered The Shameful Secret Of Attraction

Suppose you’re a straight woman thumbing through Tinder while waiting for the train, avoiding your homework, or bored at work. A picture of a deeply bronzed man pops up in your stream. How do you swipe? More interestingly, if someone asked you to explain why, how would you answer?

Say that it’s this guy:


His location is exotic. He’s doing something that requires a wetsuit. Chances are, he needed a good amount of money to do what he’s doing in the place he’s doing it. But the dark tan, large tattoo, long hair, and name like “Kip” indicate a lifestyle that is probably not that of an investment banker. You can’t really see his face, but surprisingly that doesn’t really matter, because the overwhelming reason hundreds of men and women gave for swiping “no” in a full-fledged Tinder simulation I unleashed on the internet had nothing to do with attractiveness. Instead, it had everything to do with the type of person Kip seemed to be:
“He probably calls himself a ‘humanist’ instead of a feminist and tries to impress people with how much he ‘made friends with the natives’ when he travels. Barf.” —straight/white 
“I love the tattoo, but he seems too skeezy in a way I can’t put my finger on. Scuba is pretentious? Longer greasy hair?” —bi/Hapa/Japanese 
“close call, but i hate his sunglasses and also i am imputing all sorts of things about him. like he probably says namaste to the barista at the coffee shop and has a profile picture of him with a bunch of african children” —bi/white 
“Lol he’s too old and it looks like the sea is his mistress already I can’t compete with that.” —straight/white
It’s possible these respondents are “overthinking” their response to what, on the surface, is a very straightforward question: Am I attracted to this person or not? Indeed, some would argue that there’s no reason to even explain: You can’t argue with your genitals.

But maybe what we call the argument of one’s genitals is, in truth, incredibly — and both consciously and subconsciously — influenced by the cultures in which we grow up as well as our distinct (and equally culturally influenced) ideas of what a “good couple” or “good relationship” would look like. Put differently, we swipe because someone’s “hot,” but we find someone “hot” based on unconscious codes of class, race, education level, religion, and corresponding interests embedded within the photos of their profile.

Essentially, we’re constantly inventing narratives about the people who surround us — where he works, what he loves, whether our family would like him. And more than other dating services, which offer up comprehensive match dossiers, Tinder appears to encourage these narratives, crystallizing the extrapolation process and packaging it into a five-second, low-stakes decision. We swipe, in other words, because of semiotics.

“Semiotics” is, quite simply, the study of signs. The field of semiotics tries to figure out how we come up with symbols — even as simple as the word in front of you — that stand in for a larger concept. Why does the word “lake” mean that massive blue watery thing? Or how does the stop sign, even without the word “stop,” make everyone understand not to go forward?

But signs aren’t always static in their meaning — it’s all about context. (...)

I first noticed this “crystallizing” tendency in Tinder when a friend, let’s call her Katie, started playing it for fun, three beers in, at a bar. She was thumbing through prospective matches’ profiles (usually comprising six Facebook pictures, authenticated Facebook age, and a brief bio line) for the table, yelling out her immediate reaction: too old, too manscaped, too short, too bald, too Jersey, HOT, too douchey, too finance-bro, too “ew,” too hipster, too boring, too CrossFit, TOTALLY HOT. (...)

Katie’s verdicts were often based on obvious, glaring “facts” of the profile: A 5-foot-7 male was “too short.” A 39-year-old guy was decidedly “too old” for Katie’s 33 years. Another was bald; she deemed him “too” much so. But other swipes relied upon a more vague, albeit immediate, calculus. To be “too douchey” is to have a bad goatee, a shiny shirt, an unfortunate facial expression, or a certain type of sunglasses. “Too ew” could be any blend of traits that, to white, straight, middle-class Katie, read as repugnant.

But some judgments are too secret — and shameful — to say out loud, or even admit to ourselves. Katie never said “too not-white,” “too poor,” or “too uneducated.” We cloak those judgments in language that generally circles the issue: “Nothing in common,” “he wouldn’t like me,” “I can’t see us together.” Those statements aren’t necessarily lies, but they’re also not always full truths either — and often rely on overarching assumptions about what differences in race, class, education, and religion dictate not only in a relationship, but any interaction, romantic or otherwise.

After watching Katie and tinkering around on the app myself in a game-like fashion, I wanted to see if, relying on anonymity, I could get at the heart of the subconscious snap judgments behind each swipe. Why do we swipe the way we swipe? And are those assumptions “just human,” or indicative of larger, enduring, and possibly destructive cultural divides?

by Anne Helen Petersen, BuzzFeed |  Read more:
Image: Thinkstock/BuzzFeed

The Digital Wallet Revolution

This week Apple announced two new pieces of hardware, the iPhone 6 and a “smartwatch.” But as flashy as they are, neither item is as groundbreaking as a piece of software that will accompany them: a digital wallet, allowing users to eschew cash and credit cards for a quick swipe of their device at the register.

Apple’s digital wallet, if widely adopted, could usher in a new era of ease and convenience. But the really exciting part is the fast-emerging future that it points toward, in which virtual assets of all sorts — traditional currencies, but also Bitcoin, airline miles, cellphone minutes — are interchangeable, opening up enormous purchasing power for consumers and creating tough challenges for governments around the world. (...)

We don’t typically think of these as currency, because virtual money has traditionally been locked down, in the sense that its use was strictly limited: If you earned points from Amazon, only you could use them, and you could exchange them for dollars only within the Amazon marketplace. Meanwhile, up to now, the only currencies you could use everywhere in an economy were state-issued currencies, like the dollar.

But that distinction is eroding: After all, the value of a currency lies in what you can buy with it, not in the fact that a government says it’s worth something. So if I want to buy a widget, and the only thing I can use to buy it is Widgetcash, then I am willing to trade dollars or euros or anything else for Widgetcash. When I buy something with Widgetcash, it doesn’t go through any bank.

That’s why a digital wallet, loaded with your dollars, credit and loyalty points, is such a revolutionary technology — it makes those transfers and transactions seamless and safe. (...)

The revolution is what comes next: an exchange that connects and trades these different stores of value to find the most cost-efficient one to use, both within your wallet and between wallet users, worldwide. Let’s say you want to buy an audiobook from Best Buy. It costs $16, or 1,000 My Best Buy points, or M.B.B.P.s. Your wallet contains several hundred dollars and enough Best Buy points to cover it. The wallet software automatically determines that, at the current exchange rate between M.B.B.P.s and dollars, it is better to buy using the points.

But then let’s say you only have 50 M.B.B.P.s. The wallet system searches its clients and finds someone — call her Hannah — with enough M.B.B.P.s for the transaction. It buys the audiobook with her points and sends it to you, and sends Hannah dollars from your account.

Following Bitcoin’s protocol, the wallet software broadcasts these transactions to the network, and every wallet in the world updates the M.B.B.P.-to-dollar exchange rate.

The idea is that you can buy anything, with anything. The wallet will find the best deal and execute it. In so doing, it will ignore the historical and cultural differences between dollars, points, coins and virtual property. It’s all bits anyway.
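
To make the exchange logic concrete, here is a minimal sketch, in Python, of the “pick the cheapest store of value” step described above. Everything in it is an assumption for illustration: the Balance class, the cheapest_payment function, and the M.B.B.P.-to-dollar rate are invented, not part of any real wallet API, and the peer-matching step with “Hannah” is only gestured at in the error branch.

```python
from dataclasses import dataclass

@dataclass
class Balance:
    currency: str      # e.g. "USD" or "MBBP" (My Best Buy points)
    amount: float

# Assumed exchange rates, expressed as dollars per unit of each currency.
USD_PER_UNIT = {"USD": 1.0, "MBBP": 0.015}   # hypothetical rate: 1,000 MBBP ~ $15

def cheapest_payment(prices, balances):
    """Return (currency, amount) for the affordable option with the lowest dollar cost."""
    affordable = []
    for currency, amount in prices.items():
        held = next((b.amount for b in balances if b.currency == currency), 0.0)
        if held >= amount:
            affordable.append((USD_PER_UNIT[currency] * amount, currency, amount))
    if not affordable:
        # This is where the article's peer-matching step ("Hannah") would kick in.
        raise ValueError("no single balance covers the price; a peer swap would be needed")
    _, currency, amount = min(affordable)
    return currency, amount

# The audiobook example from the article: $16 or 1,000 M.B.B.P.s.
prices = {"USD": 16.0, "MBBP": 1000.0}
wallet = [Balance("USD", 300.0), Balance("MBBP", 1200.0)]
print(cheapest_payment(prices, wallet))   # -> ('MBBP', 1000.0): points are cheaper at this rate
```

The Bitcoin-style broadcasting of completed trades that the authors describe would sit on top of a step like this, updating every wallet’s exchange-rate table after each transaction.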

by Edward Castronova and Joshua A.T. Fairfield, NY Times |  Read more:
Image: Getty

Thursday, September 11, 2014


Manhattan Chinatown, nyclove on flickr.
via:

Talk Like a Physicist

  • Use “canonical” when you mean “usual” or “standard.” As in, “the canonical example of talking like a physicist is to use the word ‘canonical.’”
  • Use “orthogonal” to refer to things that are mutually-exclusive or can’t coincide. “We keep playing phone tag — I think our schedules must be orthogonal.”
  • “About” becomes “to a first-order approximation.”
  • Things are not difficult, they are “non-trivial.”
  • Large discrepancies are “orders of magnitude apart.”
  • Refer to coordinates and coordinate systems. “I got shafted” becomes “I took one up the z-axis.”
  • Any actual personal experience becomes “empirical data.” i.e. a burn on your hand is empirical data that the stove is hot.
  • You’re not being lazy, you are in your “ground state.”
  • A semi-educated guess is an “extrapolation.”
  • You aren’t ignoring details, you are “taking the ideal case.”
  • A tiny amount is “vanishingly small” or “negligible.” Really small is “infinitesimal.”
  • You aren’t overweight, you are “thermodynamically efficient.”
by Swans on Tea |  Read more:
Image: via:

Green Tea-Black Sesame Mousse Cake
[ed. I could never make something like this, but it sure looks enticing. Recipe here.]
via:

Warning: Wild Extrapolation (A Classification System for Science News)


Science news and science writing are increasingly popular. There are increasing numbers of people getting into science, which is great. But science is a huge field, with many different disciplines and areas, all of which can go into quite painstaking detail. Obviously there’s a lot to talk about, which can prove daunting to the newly interested, so good science writing is important.

However, science and science news/reporting/writing is the work of humans, and humans are rarely 100% logical. So, to step into the world of science is to step into years/decades/centuries of disputes, controversies, unfamiliar habits, power-plays, strange politics and countless other things that manifest in science articles and could befuddle the unwary reader. What can we do about this?

One option is to adopt an approach from the world of film. Every film released to the public comes with a classification, to warn potential viewers of the type of content to expect without spoiling the actual thing itself, so the viewer can go in prepared. These classifications now come with explanations, like “contains mild peril”. Wouldn’t it be useful to adopt something similar for science articles, to give newcomers some grasp of what they’re looking at? So here’s a potential classification system for science writing. It’s a bit more complex admittedly, and unlike films, multiple classifications can be applied to a single piece. How like science, to be so uncertain.

by Dean Burnett, The Guardian |  Read more:
Image: Barry Welch

Amazon vs Hachette is Nothing: Just Wait for the Audiobook Wars

In my latest Locus column, Audible, Comixology, Amazon, and Doctorow’s First Law, I unpick the technological forces at work in the fight between Amazon and Hachette, one of the "big five" publishers, whose books have not been available through normal channels on Amazon for months now, as the publisher and the bookseller go to war over the terms on which Amazon will sell books in the future.

The publishing world is, by and large, rooting for Hachette, but hasn't paid much attention to the ways in which Hachette made itself especially vulnerable to Amazon in this fight: by insisting that all its books be sold with Amazon's DRM, it has permanently locked all its customers into Amazon's ecosystem, and if Hachette tried to convince them to start buying ebooks elsewhere, it would mean asking its readers to abandon their libraries in the bargain (or maintain two separate, incompatible libraries with different apps, URLs, and even devices to read them).

Worse still: people in publishing who are alarmed about Hachette are still allowing their audiobooks to be sold by Audible, the Amazon division that controls 90% of the audiobook market and will only sell audiobooks in a format that can't be legally played with anything except Amazon-approved technology. Audible has already started putting the screws to its audiobook suppliers -- the publishers and studios that make most of the audiobooks it sells -- even as it has gone into business competing with them.

It's profoundly, heartbreakingly naive to expect that Amazon will be any less ruthless in exploiting the advantage it is being handed over audiobooks than it has been in its exploitation of ebooks.
Take Amazon’s subsidiary Audible, a great favorite among science fiction writers and fans. The company has absolute dominance over the audiobook market, accounting for as much as 90 percent of sales for major audio publishers. Audible has a no-exceptions requirement for DRM, even where publishers and authors object (my own audiobooks are not available through Audible as a result). Audible is also the sole audiobook supplier for iTunes, meaning that authors and publishers who sell audiobooks through iTunes are likewise bound to lock these to Amazon’s platform and put them in Amazon’s perpetual control. 
by Cory Doctorow, Boing Boing |  Read more: 
Image: DRM PNG 900 2, Listentomyvoice, CC-BY-SA

Wednesday, September 10, 2014

LCD Soundsystem


[ed. Repost]

Instant Gratification

A half-hour east of Seattle, not far from the headquarters of Microsoft, Amazon, and other icons of the digital revolution, reSTART, a rehab center for Internet addicts, reveals some of the downsides of that revolution. Most of the clients here are trying to quit online gaming, an obsession that has turned their careers, relationships, and health to shambles. For the outsider, the addiction can be incomprehensible. But listening to the patients’ stories, the appeal comes sharply into focus. In a living room overlooking the lawn, 29-year-old Brett Walker talks about his time in World of Warcraft, a popular online role-playing game in which participants become warriors in a steampunk medieval world. For four years, even as his real life collapsed, Walker enjoyed a near-perfect online existence, with unlimited power and status akin to that of a Mafia boss crossed with a rock star. “I could do whatever I wanted, go where I wanted,” Walker tells me with a mixture of pride and self-mockery. “The world was my oyster.”

Walker appreciates the irony. His endless hours as an online superhero left him physically weak, financially destitute, and so socially isolated he could barely hold a face-to-face conversation. There may also have been deeper effects. Studies suggest that heavy online gaming alters brain structures involved in decision making and self-control, much as drug and alcohol use do. Emotional development can be delayed or derailed, leaving the player with a sense of self that is incomplete, fragile, and socially disengaged—more id than superego. Or as Hilarie Cash, reSTART cofounder and an expert in online addiction, tells me, “We end up being controlled by our impulses.”

Which, for gaming addicts, means being even more susceptible to the complex charms of the online world. Gaming companies want to keep players playing as long as possible—the more you play, the more likely you’ll upgrade to the next version. To this end, game designers have created sophisticated data feedback systems that keep players on an upgrade treadmill. As Walker and his peers battle their way through their virtual worlds, the data they generate are captured and used to make subsequent game iterations even more “immersive,” which means players play more, and generate still more data, which inform even more immersive iterations, and so on. World of Warcraft releases periodic patches featuring new weapons and skills that players must have if they want to keep their godlike powers, which they always do. The result is a perpetual-motion machine, driven by companies’ hunger for revenues, but also by players’ insatiable appetite for self-aggrandizement. Until the day he quit, Walker never once declined the chance to “level up,” but instead consumed each new increment of power as soon as it was offered—even as it sapped his power in real life.

On the surface, stories of people like Brett Walker may not seem relevant to those of us who don’t spend our days waging virtual war. But these digital narratives center on a dilemma that every citizen in postindustrial society will eventually confront: how to cope with a consumer culture almost too good at giving us what we want. I don’t just mean the way smartphones and search engines and Netflix and Amazon anticipate our preferences. I mean how the entire edifice of the consumer economy, digital and actual, has reoriented itself around our own agendas, self-images, and inner fantasies. In North America and the United Kingdom, and to a lesser degree in Europe and Japan, it is now entirely normal to demand a personally customized life. We fine-tune our moods with pharmaceuticals and Spotify. We craft our meals around our allergies and ideologies. We can choose a vehicle to express our hipness or hostility. We can move to a neighborhood that matches our social values, find a news outlet that mirrors our politics, and create a social network that “likes” everything we say or post. With each transaction and upgrade, each choice and click, life moves closer to us, and the world becomes our world.

And yet … the world we’re busily refashioning in our own image has some serious problems. Certainly, our march from one level of gratification to the next has imposed huge costs—most recently in a credit binge that nearly sank the global economy. But the issue here isn’t only one of overindulgence or a wayward consumer culture. Even as the economy slowly recovers, many people still feel out of balance and unsteady. It’s as if the quest for constant, seamless self-expression has become so deeply embedded that, according to social scientists like Robert Putnam, it is undermining the essential structures of everyday life. In everything from relationships to politics to business, the emerging norms and expectations of our self-centered culture are making it steadily harder to behave in thoughtful, civic, social ways. We struggle to make lasting commitments. We’re uncomfortable with people or ideas that don’t relate directly and immediately to us. Empathy weakens, and with it, our confidence in the idea, essential to a working democracy, that we have anything in common.

Our unease isn’t new, exactly. In the 1970s, social critics such as Daniel Bell, Christopher Lasch, and Tom Wolfe warned that our growing self-absorption was starving the idealism and aspirations of the postwar era. The “logic of individualism,” argued Lasch in his 1978 polemic, The Culture of Narcissism, had transformed everyday life into a brutal social competition for affirmation that was sapping our days of meaning and joy. Yet even these pessimists had no idea how self-centered mainstream culture would become. Nor could they have imagined the degree to which the selfish reflexes of the individual would become the template for an entire society. Under the escalating drive for quick, efficient “returns,” our whole socioeconomic system is adopting an almost childlike impulsiveness, wholly obsessed with short-term gain and narrow self-interest and increasingly oblivious to long-term consequences.

by Paul Roberts, American Scholar |  Read more:
Image: David Herbick

Sky Burial


[ed. If you've read Mary Roach's fascinating (and frequently humorous) book Stiff: The Curious Lives of Human Cadavers you'll have a good idea about the incredible number of things that can be done with a human body after you've donated it to medical (and forensic) science. Not for me.]

The few thousand acres of Freeman Ranch in San Marcos, Texas, include a working farm; fields studded with black-eyed Susans; and a population of white-tailed deer, Rio Grande turkeys, and brawny Gelbvieh bulls. But there’s more nested here: if, on your way from town, you turn off at the sign onto dirt road, and if your vehicle can handle the jerky, winding drive five miles deeper into the property, you will come across two tiers of chain-link fence. Behind this double barrier, accessed by key card, sixteen acres of land have been secured for a special purpose: at this place, settled in the grasses or tucked under clusters of oak trees, about seventy recently dead humans have been laid out in cages, naked, to decompose.

Just beyond the gates is where I meet Kate Spradley, a youthful, petite, and unfailingly polite woman of forty. She has short, mousy hair that’s often clipped in place with a barrette, and dresses in yoga-studio t-shirts that explain her slim, almost boyish figure. Kate is so utterly normal that it takes a moment to register the peculiarity of her life’s work: she spends her days handling and cataloguing human remains.

Kate, an associate professor at Texas State University in San Marcos, does most of her work at their Forensic Anthropology Center (FACTS)—the centerpiece of which is the Forensic Anthropology Research Facility (FARF), the largest of America’s five “body farms.” Including Kate, FACTS has three full-time researchers, a rotating crew of anthropology graduate students and undergraduate volunteers, and a steady influx of cadaver donations from both individuals and their next of kin—brought in from Texas hospitals, hospices, medical examiners’ offices, and funeral homes. When I arrive, Kate is helping lead a weeklong forensics workshop for undergrads, spread out across five excavation sites where skeletal remains have been buried to simulate “crime scenes.” Under a camping shelter, out of the intense sun, she stands before a carefully delineated pit that contains one such skeleton: jaws agape, rib cage slightly collapsed, leg bones bent in a half-plié. In the time since it was hidden here, a small animal has built a nest in the hollow of its pelvis.

Over a year ago, back when he was “fully fleshed” (as they say), this donor was placed out in the field under a two-foot-high cage and exposed to the elements, his steady decomposition religiously photographed and recorded for science. Across the property are dozens of cadavers in various stages of rot and mummification, each with its purpose, each with its expanding file of data: the inevitable changes to the body that the rest of us willfully ignore are here obsessively documented. For the past six years, FACTS has been collecting data on human “decomp” while steadily amassing a contemporary skeletal collection (about 150 individuals now) to update our understanding of human anatomy. More specifically, for the forensic sciences, FACTS works to improve methods of determining time since death, as well as the environmental impact on a corpse—particularly in the harsh Texan climate. Texas Rangers consult with them, and law enforcement officers from around the state come to train here each summer, much like this collection of nineteen- and twenty-year-olds.

While her students continue brushing dirt from bone, Kate offers to take me on a walking tour of the cages. Or, as she gently puts it: “I’ll show you some things.”

As we wander down the grassy path in the late spring heat, the first thing I encounter is the smell. “Is that nature or human?” I ask.

“Oh, I can’t smell anything right now—sometimes it depends on what direction the wind is blowing. But probably human.”

The smell of rotting human corpses is unique and uniquely efficient. You need never have experienced the scent before, but the moment you do, you recognize it: the stench of something gone horribly wrong. It reeks of rotten milk and wet leather. (...)

The odor is strong as I walk among the cages, the air redolent with the heavy, sour-wet scent of these bodies letting go of their bile, staining the grasses all around them. I look at the sprawl, each individual in its strange shelter, shriveled and shocked-looking; each with more or less of its flesh and insides; each, in its post-person state, given a new name: a number. They died quietly, in an old-age home; they died painfully, of cancer; they died suddenly, in some violent accident; they died deliberately, a suicide. In spite of how little they had in common in life, they now lie exposed alongside one another, their very own enzymes propelling them toward the same final state. Here, in death, unintentionally, they have formed a community of equals.

by Alex Mar, Oxford American |  Read more:
Image: "Passing Through—60 minutes in Foster City, California," by Ajay Malghan

Apple Hasn’t Solved the Smart Watch Dilemma

[ed. Ugh. I'm with Felix. See also: Apple's Watch is Like a High-Tech Mood Ring]

There’s a decent rule of thumb, when it comes to anything Apple: When it introduces something brand new, don’t buy version 1.0. Wait until the second or third version instead; you’ll be much better off.

Does anybody remember OS X 10.0? It was a disaster, and even people who installed it spent 90% of their time in OS 9 instead. The very first MacBook Air? An underpowered exercise in frustration. The original iPad? Heavy and clunky. The original iPod? Not only heavy, clunky, and expensive, it was also tied to the Macintosh, and didn’t work either alone or with a PC.

The best-case scenario for the Apple Watch is that the product we saw announced today will eventually iterate into something really great. Because anybody who’s ever worn a watch will tell you: this thing has serious problems.

For one thing, Apple has been worryingly silent on the subject of battery life, but there’s no indication that this thing will last even 24 hours. A watch’s battery should last for months; even watches which don’t have batteries will last for a couple of days, if you have to wind them manually, or indefinitely, if they’re automatic and all you have to do is wear them.

Watches might be complicated on the inside, but they’re simple on the outside, and they should never come with a charging cable. (To make matters worse, even though the Apple Watch only works if you have an iPhone, the iPhone charging cable will not charge the Apple Watch; you need a different charging cable entirely.) (...)

Behind all the shiny options (sport! gold! different straps!) the watch itself is always pretty much the same: thick, clunky, a computer strapped to your wrist. Which is great, I suppose, if you’re the kind of person who likes to strap a computer to your wrist.

by Felix Salmon, Medium |  Read more:
Image: uncredited

Tuesday, September 9, 2014


A fight at the Ukrainian Parliament transformed into a Caravaggio-like painting… that’s why we love the internet. :-D
via:

Rumors of Tribes


If you went to an American high school of a certain size, the social landscape was probably populated by some teenagers who were known by just one name, the “jocks.”

And from the 1950s to the present day, their sworn enemies were probably known by a greater diversity of names: the “smokers,” the “greasers,” the “scrubs” (kids in vocational classes, also known as “shop kids” or “shop rats”), and “shrubs” (who are also known as “rockers,” “metalheads,” “bangers,” “Hessians,” or “heshers”), as well as an assortment of “burnouts,” “skaters,” “punks,” “emos,” “hippies,” “goths,” “stoners” (who were known at one West Hartford, Conn., school in the early 1980s as “the double door crowd” because they hung out in the school’s entryway) and “taggers,” a relatively recent term for graffiti vandals. (...)

There’s a lot that adults end up speculating about when it comes to high school crowd labels. Why do they change? How much do they vary from place to place? Where do new ones come from? Some labels, like “jocks,” stand the test of time, while others (“emos,” “wiggers”) rise with clothing styles and musical subgenres, and as much as one might like to imagine some high school in a tiny valley that time forgot where “greasers” battle “bebops,” those labels are no more. Obviously, American society changes, and mass media reinforce some names. But experts say that when they go back to schools they’ve studied before, they find that the crowd labels have been refreshed by some inscrutable linguistic tide.

I credit my son’s babysitter for getting me interested in crowd labels and clique names when she told me that the popular kids at her school were the “windswept hair people.” I thought, surely teen social landscapes have interesting names and rich naming practices like “windswept hair people.” But for the most part, as far as I’ve been able to tell, the labels don’t vary much, and if creative names exist, they’re not easy to hear about. Over the decades up until now, studious college-bound kids are usually known as “brains,” “brainiacs,” “nerds,” “geeks,” “eggheads,” or “the intelligentsia.” I collected high school names using Twitter and Survey Monkey and came across “crumbsters,” a label for the popular kids, and the “queer cult,” for the semi-populars, though if awards were given out for creative crowd labels, the kids at one New Jersey high school in the mid-2000s might win for naming their popular crowd in the aftermath of a food fight that happened in ninth grade, when one of the most popular girls shrieked and generally overreacted after she was hit in the face with a flying chicken patty. The popular kids became known as “chicken patties.” Depending on where and when you went to high school, you might have called them “preps,” “socialites,” “Ivy Leaguers,” “soshes,” or simply “the popular kids.”

Those four crowds—jocks, smart kids, popular kids, and deviants—are said by adolescent researchers to be standard in American high schools. Then there’s a grab-bag group: kids into drama and band (“drama fags,” “band fags,” “drama geeks,” “band geeks,” etc.), as well as “gang bangers,” “girly girls,” “cholos,” “Asian Pride” (and other racial and ethnic groups, like “FOBs”—a derogatory term referring to recent immigrants), “Gay Pride,” and depending on the region “plowjocks” (also ag students, or “aggies,” “hicks,” “cowboys,” and “rednecks”) and “Wangsters” (who are “wannabe” gangsters). At a highly selective residential public school from the middle of the U.S., upper-middle-class ethnic students were known as “teen girl squad” and “teen jerk squad,” while the mainly white, rural students were known as “second floor boys and girls,” because of where they lived in a dorm. (...)

The relative stability and blandness of certain crowd labels may have something to do with what kids need labels for (and they do need them—more about that in a bit). Basically, Brown said, once you leave the enclosed classroom of the elementary school, “you have a bigger sea of humanity that needs to be navigated without much oversight or guidance.” New middle-schoolers and high-schoolers now have to deal with far more people than they can have individual relationships with. You need some way to make sense of who might be a friend and who might be an enemy. And communicating about where you think you and others belong only works if your crowd labels are conventional enough for people to understand. I don’t know what a “grater” is, but I know what a “band nerd” is.

You can tell a lot about the social uses of crowd labels from their absence. For instance, people who went to smaller high schools don’t report having crowd labels. It’s not that they were closer-knit, it’s that the school’s population was closer to the number of individual social relationships the human brain can process (called Dunbar’s Number, it ranges between 100 and 130). Also, the first teenage crowd labels in America date to a 1942 study by August Hollingshead (who reported three labels: “elites,” “good kids,” and “the grubby gang”). Not until the 1940s were there a lot of high schools large enough to be all-encompassing social worlds.

by Michael Erard, TMN |  Read more:
Image: uncredited

Stereotyping Japan

[ed. See also: Designing Postwar Japan]

In 1945, The Saturday Evening Post proudly proclaimed that “The G.I. Is Civilizing the Jap” by showing the “savage” and “dirty” natives how to fix cars without breaking them, and how to go to the bathroom. A 1951 follow-up subsequently reported that the Japanese they visited six years prior, with their nightsoil gardens and Shintoism, now had gas stoves and Christianity! This idea, that the American Occupation would teach the Japanese how to act like modern men and women, was quite strong at the end of the Pacific War.

For General Douglas MacArthur, commander of the Occupation administration (SCAP), democratizing Japan was as much a personal manifest destiny for him as America’s presence in Asia was a benevolent historical one. Since basic cultural stereotypes underwrote US policy and media coverage, the primary debate among the participants was about the Japanese people’s perceived educability by American teachers. The story of the Occupation, from 1945 to 1952, is about shared assumptions regarding the limitations of the Japanese psyche: how much racism would be applied, basically. In the end, the conservative viewpoint won out, due mainly to concerns about alleged communist infiltration, and ideas about the Japanese being naturally vulnerable to Bolshevism. (...)

However, as the US advanced on the home islands, and it became clear there would be a postwar occupation, this tone began to decline. Propagandists, bureaucrats, and journalists now focused on the question of the Japanese mind, the molding of “a developing Asian consciousness,” in the words of Secretary of State Dean Acheson. This represented a turn from wartime propaganda, which depicted the Japanese as alien, insectile, and simian.

Now, the official line was that “it is a mistake to think that all Japanese are predominantly the monkey-man type.” The Saturday Evening Post, in an indication of this evolving thinking, posited in September 1944 that the good behavior of Japanese and Japanese-Americans in Hawaii showed that the Japanese in Japan “can, in time, be turned into decent, law-respecting citizens” too and were not “a hopeless immoral race” after all. So, the Japanese could be redeemed. But how?

It was clear that the old certainties were useless. “There was a school of thought that believed it possible to determine the friends of the United States by table etiquette,” wrote SCAP economics officer E. M. Hadley in her memoir, “those with beautiful table manners were friends; those ignorant of such matters were not.” She felt that these individuals gravitated towards those Japanese most implicated in imperialism. This was a clear indictment of people like former Ambassador Joseph Grew, who surrounded himself with cosmopolitan Japanese that ultimately supported the Empire.

The idea of “genuine” Westernized Japanese, which is how redemption would take place, was therefore a tricky one. Here, a more cautious racism emerged than the imperial idea of baptizing the Japanese in Western values. The most common refrain from Grew’s critics, in fact, was not that he was a big business conservative, as later historians like Howard Schonberger would charge, but that he had been bamboozled by Japan’s “wily” fake liberals. Wealthy, squeezing, unscrupulous, and false were words of choice for these men. Duplicity was a perfected Japanese art form, and Americans – like ostriches with their heads in the sand – fell for it.

Grew himself supported this characterization: he did not think anything better could be expected of the “Yamato race.” Though his views were less extreme than the more openly racist feelings expressed by other U.S. officials, they were still grounded in racial determinism: they “dress like us,” but “they don’t think as we do.” In contrast, Grew’s memoir, Ten Years in Japan, peppered with “reputable citizens,” reads as a yearbook of Imperial Japanese high society. These were the people who would resume the westernization of Japan.

Grew knew and respected this old guard: men who the New Dealers from the start regarded as revanchist liabilities. They, Grew’s friend Joseph Ballantine told George F. Kennan in 1947, were “able to raise Japan from a feudal state into a first class power in the course of seventy-five years,” after all. Grew and his more business-minded associates, in what became known as the Japan Lobby after his retirement, were ideologically very close to these nobles, prewar politicians, and zaibatsu families. Despite all of the talk about alien Japanese mentalities and a childish inferiority complex, there was one very strong trans-Pacific cultural connection before the war, and that was through the business communities that Grew and his allies circled through in both countries.

Most other areas of possible cultural exchange were anemic. What followed was only logical. American skeptics of the New Deal at home, like Grew and Ballantine, also skeptical of the “common” Japanese person’s capacity for free thinking, came to share the views of Japan’s ruling class that any substantial reforms would bring anarchy. “[My] experience [has] shown that democracy in Japan would never work,” Grew had concluded just before the war’s end.

by Paul Mutter, Souciant | Read more:
Image: uncredited

Sigmar Polke. Untitled (Quetta, Pakistan). 1974/1978
via:

Tell Me What You Like and I'll Tell You Who You Are

The Facebook like button was first released in 2009. As of September of 2013, a total of 1.13 trillion likes had been registered across the earth, according to OkCupid co-founder Christian Rudder in his new book Dataclysm. Much has been written about how “likes” limit our social interaction or increase our engagement with brands. But these likes have another function: they’re becoming a source of data that will eventually tell social scientists more about who we are than what we share.

According to a research group in the UK, it turns out that what people choose to “like” on Facebook can be used to determine with 95% accuracy whether they are Caucasian or African American, 88% accuracy whether they are gay or straight, and 65% accuracy whether they are a drug user, among other things. So what you post on Facebook may not give as true a signal of your genuine self as what you like on Facebook. Rudder writes:
“This stuff was computed from three years of data collected from people who joined Facebook after decades of being on earth without it. What will be possible when someone’s been using these services since she was a child? That’s the darker side of the longitudinal data I’m otherwise so excited about. Tests like Myers-Briggs and Stanford-Binet have long been used by employers, schools, the military. You sit down, do your best, and they sort you. For the most part, you’ve opted in. But it’s increasingly the case that you’re taking these tests just by living your life.”
Is it possible that in the future your SAT score, personality, and employability might simply be predicted by all the data collected from your digital device use? I asked Rudder whether a person’s like pattern on Facebook could be used as a proxy for an intelligence or IQ score. He told me:
“I think we are still far away from saying with any real certainty how smart any one person is based on Facebook likes. In aggregate, finding out that people who like X, Y, Z, have traits A, B, C, D, I think we’re already there. We’re already tackling life history questions based on Facebook likes. For example, did your parents get divorced before you were 21? They can unlock that with 60% certitude. Given that it’s only a few years’ worth of likes, imagine that it’s in five or 10 years and there’s that much more data to go on, and people are revealing their lives through their smartphones and their laptops.”
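As a rough illustration of how a pile of likes becomes a trait prediction, here is a toy sketch in Python using scikit-learn: each page a user has liked becomes a binary feature, and a simple classifier is fit against a known trait. Every detail here is invented for illustration (the pages, users, and labels), and the actual research worked at a vastly larger scale, with dimensionality reduction over tens of thousands of likes rather than a handful of raw ones.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Each "document" is simply the set of page IDs a user has liked (all invented).
users_likes = [
    "harley_davidson nascar country_music",
    "vogue yoga kale_recipes",
    "nascar hunting country_music",
    "yoga meditation vogue",
]
trait_labels = [1, 0, 1, 0]   # an invented binary trait, for illustration only

vectorizer = CountVectorizer(binary=True)   # a like counts as 1, no like as 0
X = vectorizer.fit_transform(users_likes)
model = LogisticRegression().fit(X, trait_labels)

new_user = vectorizer.transform(["country_music hunting yoga"])
print(model.predict_proba(new_user)[0])     # estimated probability of each trait value
```

The point of the sketch is only the shape of the pipeline: likes in, probabilities out, with no one ever sitting down to "take a test."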
by Jonathan Wai, Quartz |  Read more:
Image: Dado Ruvic