Wednesday, September 10, 2014

LCD Soundsystem


[ed. Repost]

Instant Gratification

A half-hour east of Seattle, not far from the headquarters of Microsoft, Amazon, and other icons of the digital revolution, reSTART, a rehab center for Internet addicts, reveals some of the downsides of that revolution. Most of the clients here are trying to quit online gaming, an obsession that has turned their careers, relationships, and health to shambles. For the outsider, the addiction can be incomprehensible. But listening to the patients’ stories, the appeal comes sharply into focus. In a living room overlooking the lawn, 29-year-old Brett Walker talks about his time in World of Warcraft, a popular online role-playing game in which participants become warriors in a steampunk medieval world. For four years, even as his real life collapsed, Walker enjoyed a near-perfect online existence, with unlimited power and status akin to that of a Mafia boss crossed with a rock star. “I could do whatever I wanted, go where I wanted,” Walker tells me with a mixture of pride and self-mockery. “The world was my oyster.”

Walker appreciates the irony. His endless hours as an online superhero left him physically weak, financially destitute, and so socially isolated he could barely hold a face-to-face conversation. There may also have been deeper effects. Studies suggest that heavy online gaming alters brain structures involved in decision making and self-control, much as drug and alcohol use do. Emotional development can be delayed or derailed, leaving the player with a sense of self that is incomplete, fragile, and socially disengaged—more id than superego. Or as Hilarie Cash, reSTART cofounder and an expert in online addiction, tells me, “We end up being controlled by our impulses.”

Which, for gaming addicts, means being even more susceptible to the complex charms of the online world. Gaming companies want to keep players playing as long as possible—the more you play, the more likely you’ll upgrade to the next version. To this end, game designers have created sophisticated data feedback systems that keep players on an upgrade treadmill. As Walker and his peers battle their way through their virtual worlds, the data they generate are captured and used to make subsequent game iterations even more “immersive,” which means players play more, and generate still more data, which inform even more immersive iterations, and so on. World of Warcraft releases periodic patches featuring new weapons and skills that players must have if they want to keep their godlike powers, which they always do. The result is a perpetual-motion machine, driven by companies’ hunger for revenues, but also by players’ insatiable appetite for self-aggrandizement. Until the day he quit, Walker never once declined the chance to “level up,” but instead consumed each new increment of power as soon as it was offered—even as it sapped his power in real life.

On the surface, stories of people like Brett Walker may not seem relevant to those of us who don’t spend our days waging virtual war. But these digital narratives center on a dilemma that every citizen in postindustrial society will eventually confront: how to cope with a consumer culture almost too good at giving us what we want. I don’t just mean the way smartphones and search engines and Netflix and Amazon anticipate our preferences. I mean how the entire edifice of the consumer economy, digital and actual, has reoriented itself around our own agendas, self-images, and inner fantasies. In North America and the United Kingdom, and to a lesser degree in Europe and Japan, it is now entirely normal to demand a personally customized life. We fine-tune our moods with pharmaceuticals and Spotify. We craft our meals around our allergies and ideologies. We can choose a vehicle to express our hipness or hostility. We can move to a neighborhood that matches our social values, find a news outlet that mirrors our politics, and create a social network that “likes” everything we say or post. With each transaction and upgrade, each choice and click, life moves closer to us, and the world becomes our world.

And yet … the world we’re busily refashioning in our own image has some serious problems. Certainly, our march from one level of gratification to the next has imposed huge costs—most recently in a credit binge that nearly sank the global economy. But the issue here isn’t only one of overindulgence or a wayward consumer culture. Even as the economy slowly recovers, many people still feel out of balance and unsteady. It’s as if the quest for constant, seamless self-expression has become so deeply embedded that, according to social scientists like Robert Putnam, it is undermining the essential structures of everyday life. In everything from relationships to politics to business, the emerging norms and expectations of our self-centered culture are making it steadily harder to behave in thoughtful, civic, social ways. We struggle to make lasting commitments. We’re uncomfortable with people or ideas that don’t relate directly and immediately to us. Empathy weakens, and with it, our confidence in the idea, essential to a working democracy, that we have anything in common.

Our unease isn’t new, exactly. In the 1970s, social critics such as Daniel Bell, Christopher Lasch, and Tom Wolfe warned that our growing self-absorption was starving the idealism and aspirations of the postwar era. The “logic of individualism,” argued Lasch in his 1978 polemic, The Culture of Narcissism, had transformed everyday life into a brutal social competition for affirmation that was sapping our days of meaning and joy. Yet even these pessimists had no idea how self-centered mainstream culture would become. Nor could they have imagined the degree to which the selfish reflexes of the individual would become the template for an entire society. Under the escalating drive for quick, efficient “returns,” our whole socioeconomic system is adopting an almost childlike impulsiveness, wholly obsessed with short-term gain and narrow self-interest and increasingly oblivious to long-term consequences.

by Paul Roberts, American Scholar |  Read more:
Image: David Herbick

Sky Burial


[ed. If you've read Mary Roach's fascinating (and frequently humorous) book Stiff: The Curious Lives of Human Cadavers you'll have a good idea about the incredible number of things that can be done with a human body after you've donated it to medical (and forensic) science. Not for me.]

The few thousand acres of Freeman Ranch in San Marcos, Texas, include a working farm; fields studded with black-eyed Susans; and a population of white-tailed deer, Rio Grande turkeys, and brawny Gelbvieh bulls. But there’s more nested here: if, on your way from town, you turn off at the sign onto a dirt road, and if your vehicle can handle the jerky, winding drive five miles deeper into the property, you will come across two tiers of chain-link fence. Behind this double barrier, accessed by key card, sixteen acres of land have been secured for a special purpose: at this place, settled in the grasses or tucked under clusters of oak trees, about seventy recently dead humans have been laid out in cages, naked, to decompose.

Just beyond the gates is where I meet Kate Spradley, a youthful, petite, and unfailingly polite woman of forty. She has short, mousy hair that’s often clipped in place with a barrette, and dresses in yoga-studio t-shirts that explain her slim, almost boyish figure. Kate is so utterly normal that it takes a moment to register the peculiarity of her life’s work: she spends her days handling and cataloguing human remains.

Kate, an associate professor at Texas State University in San Marcos, does most of her work at their Forensic Anthropology Center (FACTS)—the centerpiece of which is the Forensic Anthropology Research Facility (FARF), the largest of America’s five “body farms.” Including Kate, FACTS has three full-time researchers, a rotating crew of anthropology graduate students and undergraduate volunteers, and a steady influx of cadaver donations from both individuals and their next of kin—brought in from Texas hospitals, hospices, medical examiners’ offices, and funeral homes. When I arrive, Kate is helping lead a weeklong forensics workshop for undergrads, spread out across five excavation sites where skeletal remains have been buried to simulate “crime scenes.” Under a camping shelter, out of the intense sun, she stands before a carefully delineated pit that contains one such skeleton: jaws agape, rib cage slightly collapsed, leg bones bent in a half-plié. In the time since it was hidden here, a small animal has built a nest in the hollow of its pelvis.

Over a year ago, back when he was “fully fleshed” (as they say), this donor was placed out in the field under a two-foot-high cage and exposed to the elements, his steady decomposition religiously photographed and recorded for science. Across the property are dozens of cadavers in various stages of rot and mummification, each with its purpose, each with its expanding file of data: the inevitable changes to the body that the rest of us willfully ignore are here obsessively documented. For the past six years, FACTS has been collecting data on human “decomp” while steadily amassing a contemporary skeletal collection (about 150 individuals now) to update our understanding of human anatomy. More specifically, for the forensic sciences, FACTS works to improve methods of determining time since death, as well as the environmental impact on a corpse—particularly in the harsh Texan climate. Texas Rangers consult with them, and law enforcement officers from around the state come to train here each summer, much like this collection of nineteen- and twenty-year-olds.

While her students continue brushing dirt from bone, Kate offers to take me on a walking tour of the cages. Or, as she gently puts it: “I’ll show you some things.”

As we wander down the grassy path in the late spring heat, the first thing I encounter is the smell. “Is that nature or human?” I ask.

“Oh, I can’t smell anything right now—sometimes it depends on what direction the wind is blowing. But probably human.”

The smell of rotting human corpses is unique and uniquely efficient. You need never have experienced the scent before, but the moment you do, you recognize it: the stench of something gone horribly wrong. It reeks of rotten milk and wet leather. (...)

The odor is strong as I walk among the cages, the air redolent with the heavy, sour-wet scent of these bodies letting go of their bile, staining the grasses all around them. I look at the sprawl, each individual in its strange shelter, shriveled and shocked-looking; each with more or less of its flesh and insides; each, in its post-person state, given a new name: a number. They died quietly, in an old-age home; they died painfully, of cancer; they died suddenly, in some violent accident; they died deliberately, a suicide. In spite of how little they had in common in life, they now lie exposed alongside one another, their very own enzymes propelling them toward the same final state. Here, in death, unintentionally, they have formed a community of equals.

by Alex Mar, Oxford American |  Read more:
Image: "Passing Through—60 minutes in Foster City, California," by Ajay Malghan

Apple Hasn’t Solved the Smart Watch Dilemma

[ed. Ugh. I'm with Felix. See also: Apple's Watch is Like a High-Tech Mood Ring]

There’s a decent rule of thumb when it comes to anything Apple: When it introduces something brand new, don’t buy version 1.0. Wait until the second or third version instead; you’ll be much better off.

Does anybody remember OS X 10.0? It was a disaster, and even people who installed it spent 90% of their time in OS 9 instead. The very first MacBook Air? An underpowered exercise in frustration. The original iPad? Heavy and clunky. The original iPod? Not only heavy, clunky, and expensive, it was also tied to the Macintosh and didn’t work either alone or with a PC.

The best-case scenario for the Apple Watch is that the product we saw announced today will eventually iterate into something really great. Because anybody who’s ever worn a watch will tell you: this thing has serious problems.

For one thing, Apple has been worryingly silent on the subject of battery life, but there’s no indication that this thing will last even 24 hours. A watch’s battery should last for months; even watches which don’t have batteries will last for a couple of days, if you have to wind them manually, or indefinitely, if they’re automatic and all you have to do is wear them.

Watches might be complicated on the inside, but they’re simple on the outside, and they should never come with a charging cable. (To make matters worse, even though the Apple Watch only works if you have an iPhone, the iPhone charging cable will not charge the Apple Watch; you need a different charging cable entirely.) (...)

Behind all the shiny options (sport! gold! different straps!) the watch itself is always pretty much the same: thick, clunky, a computer strapped to your wrist. Which is great, I suppose, if you’re the kind of person who likes to strap a computer to your wrist.

by Felix Salmon, Medium |  Read more:
Image: uncredited

Tuesday, September 9, 2014


A fight at the Ukrainian Parliament transformed into a Caravaggio-like painting… that’s why we love the internet. :-D
via:

Rumors of Tribes


If you went to an American high school of a certain size, the social landscape was probably populated by some teenagers who were known by just one name, the “jocks.”

And from the 1950s to the present day, their sworn enemies were probably known by a greater diversity of names: the “smokers,” the “greasers,” the “scrubs” (kids in vocational classes, also known as “shop kids” or “shop rats”), and “shrubs” (who are also known as “rockers,” “metalheads,” “bangers,” “Hessians,” or “heshers”), as well as an assortment of “burnouts,” “skaters,” “punks,” “emos,” “hippies,” “goths,” “stoners” (who were known at one West Hartford, Conn., school in the early 1980s as “the double door crowd” because they hung out in the school’s entryway) and “taggers,” a relatively recent term for graffiti vandals. (...)

There’s a lot that adults end up speculating about when it comes to high school crowd labels. Why do they change? How much do they vary from place to place? Where do new ones come from? Some labels, like “jocks,” stand the test of time, while others (“emos,” “wiggers”) rise with clothing styles and musical subgenres, and as much as one might like to imagine some high school in a tiny valley that time forgot where “greasers” battle “bebops,” those labels are no more. Obviously, American society changes, and mass media reinforce some names. But experts say that when they go back to schools they’ve studied before, they find that the crowd labels have been refreshed by some inscrutable linguistic tide.

I credit my son’s babysitter for getting me interested in crowd labels and clique names when she told me that the popular kids at her school were the “windswept hair people.” I thought, surely teen social landscapes have interesting names and rich naming practices like “windswept hair people.” But for the most part, as far as I’ve been able to tell, the labels don’t vary much, and if creative names exist, they’re not easy to hear about. Over the decades up until now, studious college-bound kids are usually known as “brains,” “brainiacs,” “nerds,” “geeks,” “eggheads,” or “the intelligentsia.” I collected high school names using Twitter and Survey Monkey and came across “crumbsters,” a label for the popular kids, and the “queer cult,” for the semi-populars, though if awards were given out for creative crowd labels, the kids at one New Jersey high school in the mid-2000s might win for naming their popular crowd in the aftermath of a food fight that happened in ninth grade, when one of the most popular girls shrieked and generally overreacted after she was hit in the face with a flying chicken patty. The popular kids became known as “chicken patties.” Depending on where and when you went to high school, you might have called them “preps,” “socialites,” “Ivy Leaguers,” “soshes,” or simply “the popular kids.”

Those four crowds—jocks, smart kids, popular kids, and deviants—are said by adolescent researchers to be standard in American high schools. Then there’s a grab-bag group: kids into drama and band (“drama fags,” “band fags,” “drama geeks,” “band geeks,” etc.), as well as “gang bangers,” “girly girls,” “cholos,” “Asian Pride” (and other racial and ethnic groups, like “FOBs”—a derogatory term referring to recent immigrants), “Gay Pride,” and depending on the region “plowjocks” (also ag students, or “aggies,” “hicks,” “cowboys,” and “rednecks”) and “Wangsters” (who are “wannabe” gangsters). At a highly selective residential public school from the middle of the U.S., upper-middle-class ethnic students were known as “teen girl squad” and “teen jerk squad,” while the mainly white, rural students were known as “second floor boys and girls,” because of where they lived in a dorm. (...)

The relative stability and blandness of certain crowd labels may have something to do with what kids need labels for (and they do need them—more about that in a bit). Basically, Brown said, once you leave the enclosed classroom of the elementary school, “you have a bigger sea of humanity that needs to be navigated without much oversight or guidance.” New middle-schoolers and high-schoolers now have to deal with far more people than they can have individual relationships with. You need some way to make sense of who might be a friend and who might be an enemy. And communicating about where you think you and others belong only works if your crowd labels are conventional enough for people to understand. I don’t know what a “grater” is, but I know what a “band nerd” is.

You can tell a lot about the social uses of crowd labels from their absence. For instance, people who went to smaller high schools don’t report having crowd labels. It’s not that they were closer-knit; it’s that the school’s population was closer to the number of individual social relationships the human brain can process (called Dunbar’s Number, commonly estimated at about 150). Also, the first teenage crowd labels in America date to a 1942 study by August Hollingshead (who reported three labels: “elites,” “good kids,” and “the grubby gang”). Not until the 1940s were there a lot of high schools large enough to be all-encompassing social worlds.

by Michael Erard, TMN |  Read more:
Image: uncredited

Stereotyping Japan

[ed. See also: Designing Postwar Japan]

In 1945, The Saturday Evening Post proudly proclaimed that “The G.I. Is Civilizing the Jap” by showing the “savage” and “dirty” natives how to fix cars without breaking them, and how to go to the bathroom. A 1951 follow-up reported that the Japanese they had visited six years prior, with their nightsoil gardens and Shintoism, now had gas stoves and Christianity! This idea, that the American Occupation would teach the Japanese how to act like modern men and women, was quite strong at the end of the Pacific War.

For General Douglas MacArthur, commander of the Occupation administration (SCAP), democratizing Japan was as much a personal manifest destiny for him as America’s presence in Asia was a benevolent historical one. Since basic cultural stereotypes underwrote US policy and media coverage, the primary debate among the participants was about the Japanese people’s perceived educability by American teachers. The story of the Occupation, from 1945 to 1952, is about shared assumptions regarding the limitations of the Japanese psyche: how much racism would be applied, basically. In the end, the conservative viewpoint won out, due mainly to concerns about alleged communist infiltration, and ideas about the Japanese being naturally vulnerable to Bolshevism. (...)

However, as the US advanced on the home islands, and it became clear there would be a postwar occupation, this tone began to decline. Propagandists, bureaucrats, and journalists now focused on the question of the Japanese mind, the molding of “a developing Asian consciousness,” in the words of Secretary of State Dean Acheson. This represented a turn from wartime propaganda, which depicted the Japanese as alien, insectile, and simian.

Now, the official line was that “it is a mistake to think that all Japanese are predominantly the monkey-man type.” The Saturday Evening Post, in an indication of this evolving thinking, posited in September 1944 that the good behavior of Japanese and Japanese-Americans in Hawaii showed that the Japanese in Japan “can, in time, be turned into decent, law-respecting citizens” too and were not “a hopeless immoral race” after all. So, the Japanese could be redeemed. But how?

It was clear that the old certainties were useless. “There was a school of thought that believed it possible to determine the friends of the United States by table etiquette,” wrote SCAP economics officer E. M. Hadley in her memoir, “those with beautiful table manners were friends; those ignorant of such matters were not.” She felt that these individuals gravitated towards those Japanese most implicated in imperialism. This was a clear indictment of people like former Ambassador Joseph Grew, who surrounded himself with cosmopolitan Japanese that ultimately supported the Empire.

The idea of “genuine” Westernized Japanese, which is how redemption would take place, was therefore a tricky one. Here, a more cautious racism emerged than the imperial idea of baptizing the Japanese in Western values. The most common refrain from Grew’s critics, in fact, was not that he was a big business conservative, as later historians like Howard Schonberger would charge, but that he had been bamboozled by Japan’s “wily” fake liberals. Wealthy, squeezing, unscrupulous, and false were words of choice for these men. Duplicity was a perfected Japanese art form, and Americans – like ostriches with their heads in the sand – fell for it.

Grew himself supported this characterization: he did not think anything better could be expected of the “Yamato race.” Though his views were less extreme than the more openly racist feelings expressed by other U.S. officials, they were still grounded in racial determinism: they “dress like us,” but “they don’t think as we do.” In contrast, the roster of “reputable citizens” who peppered Grew’s memoir, Ten Years in Japan, reads like a yearbook of Imperial Japanese high society. These were the people who would resume the westernization of Japan.

Grew knew and respected this old guard: men whom the New Dealers from the start regarded as revanchist liabilities. They, Grew’s friend Joseph Ballantine told George F. Kennan in 1947, were “able to raise Japan from a feudal state into a first class power in the course of seventy-five years,” after all. Grew and his more business-minded associates, in what became known as the Japan Lobby after his retirement, were ideologically very close to these nobles, prewar politicians, and zaibatsu families. Despite all the talk of alien Japanese mentalities and a childish inferiority complex, there was one very strong trans-Pacific cultural connection before the war: the business communities in which Grew and his allies circulated in both countries.

Most other areas of possible cultural exchange were anemic. What followed was only logical. American skeptics of the New Deal at home, like Grew and Ballantine, also skeptical of the “common” Japanese person’s capacity for free thinking, came to share the views of Japan’s ruling class that any substantial reforms would bring anarchy. “[My] experience [has] shown that democracy in Japan would never work,” Grew had concluded just before the war’s end.

by Paul Mutter, Souciant | Read more:
Image: uncredited

Sigmar Polke. Untitled (Quetta, Pakistan). 1974/1978
via:

Tell Me What You Like and I'll Tell You Who You Are

The Facebook like button was first released in 2009. As of September 2013, a total of 1.13 trillion likes had been registered across the earth, according to OkCupid co-founder Christian Rudder in his new book Dataclysm. Much has been written about how “likes” limit our social interaction or increase our engagement with brands. But these likes have another function: they’re becoming a source of data that will eventually tell social scientists more about who we are than what we share.

According to a research group in the UK, what people choose to “like” on Facebook can be used to determine with 95% accuracy whether they are Caucasian or African American, with 88% accuracy whether they are gay or straight, and with 65% accuracy whether they are a drug user, among other things. So what you post on Facebook may not give as true a signal of your genuine self as what you like on Facebook. Rudder writes:
“This stuff was computed from three years of data collected from people who joined Facebook after decades of being on earth without it. What will be possible when someone’s been using these services since she was a child? That’s the darker side of the longitudinal data I’m otherwise so excited about. Tests like Myers-Briggs and Stanford-Binet have long been used by employers, schools, the military. You sit down, do your best, and they sort you. For the most part, you’ve opted in. But it’s increasingly the case that you’re taking these tests just by living your life.”
Is it possible that in the future your SAT score, personality, and employability might simply be predicted by all the data collected from your digital device use? I asked Rudder whether a person’s like pattern on Facebook could be used as a proxy for an intelligence or IQ score. He told me:
“I think we are still far away from saying with any real certainty how smart any one person is based on Facebook likes. In aggregate, finding out that people who like X, Y, Z, have traits A, B, C, D, I think we’re already there. We’re already tackling life history questions based on Facebook likes. For example, did your parents get divorced before they were 21, they can unlock that with 60% certitude. Given that it’s only a few years’ worth of likes, imagine that it’s in five or 10 years and there’s that much more data to go on, and people are revealing their lives through their smartphones and their laptops.”
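For readers curious about the mechanics, the general technique behind this kind of prediction is simple: represent each user as a sparse vector of likes and fit a standard classifier against a labeled trait. The sketch below is a toy illustration on synthetic data, not the UK group's or Rudder's actual pipeline; every number and variable in it is invented for the example.

# Toy illustration only: synthetic "likes" data and a generic classifier,
# not the actual model used in the UK study or in Dataclysm.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_users, n_pages = 2000, 500

# Rows are users, columns are pages; 1 means the user "liked" that page.
likes = (rng.random((n_users, n_pages)) < 0.05).astype(float)

# A hidden binary trait nudges which pages a user tends to like.
trait = rng.integers(0, 2, n_users)
signal_pages = rng.choice(n_pages, 40, replace=False)
extra = (rng.random((int(trait.sum()), signal_pages.size)) < 0.10).astype(float)
likes[np.ix_(trait == 1, signal_pages)] = np.maximum(
    likes[np.ix_(trait == 1, signal_pages)], extra)

# Fit on one slice of users, evaluate on a held-out slice.
X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"held-out AUC: {auc:.2f}")  # noticeably better than the 0.5 of pure chance

Even with only a handful of informative pages, the held-out score climbs well above chance, which is the basic reason accumulated likes become so revealing at Facebook's scale.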
by Jonathan Wai, Quartz |  Read more:
Image: Dado Ruvic

The Click Clique

It was a lovely April evening in downtown Dallas, the sky blank and blue. The Kate Spade cocktail party was scheduled to start at six o’clock, and as the minutes ticked past, two hundred young women in all their polymorphic plumage—stilettos, CĆ©line bags, bangles, blowouts, and iPhones, always iPhones—began to gather on an Astroturf lawn across the street from the Joule Hotel. Passersby, leaving their offices for home or happy hours, might have thought the gathering was just another party full of beautiful people, not all that unusual in Dallas.

Except these weren’t just beautiful people. These were fashion bloggers, selfie stars whose facility with heated hair tools and knack for posing long ago upended a field once strictly dominated by runway shows and magazine glossies. In attendance, for example, was Aimee Song (known as @songofstyle, with 1.58 million followers), a Los Angeles blogger famous for her girly grunge aesthetic and lips-parted-eyes-staring-dead-into-the-camera expression; her Instagram of a pair of $580 Isabel Marant sandals (basically Birkenstocks with pink bows), which she’d bought earlier that afternoon, had garnered more than 27,000 likes. There was also Julie Sariñana (@SincerelyJules, 1.4 million), another L.A.-based blogger, whose photo outside the Joule in a white slip dress and Vince espadrille platform sandals would later be used to advertise the shoe, which had sold out at all department stores, on eBay. There was Andreas Wijk (@andreaswijk, 129k), the orange-colored Justin Bieber of Sweden, and Wendy Nguyen (@wendyslookbook, 510k), subject of the viral YouTube video “25 Ways to Wear a Scarf in 4.5 Minutes!” And then there was Dallas’s own Jane Aldridge (@seaofshoes, 132k), quietly slinking about in leather pants and a red flannel shirt, champagne in hand.

The influence wielded by this flock of pout-prone lips and dewy eyelashes was nothing short of staggering. These partygoers reached more than 13.5 million followers on Instagram combined. Many made more than $20,000 a month—some more than $80,000—just from posting links to sites that sold the short-shorts and Chanel shoes that they wore in their photos. Factoring in the revenue from banner ads on their websites, sponsored posts, and store appearances, a number of top bloggers raked in more than $1 million a year. And now they were waiting—having flown in from Los Angeles and New York and more than eighteen countries, some as far away as Australia and China—to meet the person who had made much of this money-making possible: a redheaded 26-year-old from Highland Park named Amber Venz.

Amber and her boyfriend, Baxter Box, had revolutionized the fashion world a few years earlier when, almost single-handedly, they figured out how to do the near impossible: easily monetize the content of fashion blogs. In 2011, with only a modest family investment, they’d built rewardStyle, a fashion technology company that collects commissions from retailers on behalf of bloggers and more-traditional publishers (think the websites of some major magazines) whose pictures induce readers to buy baubles online. In three years the company had grown to include 87 employees in Dallas and London, a network of 4,000 retailers, and more than 14,000 “publishers,” who drove $155 million in retail sales in 2013 alone (rewardStyle declined to release information about its revenue). As rewardStyle’s top 200 earners, the bloggers on the lawn had been invited to the company’s second annual conference, hosted at the Joule. Because rewardStyle only makes money when its publishers do, the goal of the next three days was to teach the women how to make even more money by giving them strategies for effective website design (NewYorker.com was used as a model) and for search engine optimization (using, as an example, the key words “Valentino Rockstud pumps”). The cocktail party was a networking event to kick the invitation-only conference off. (...)

Here's a theory about the rise of fashion blogging: in 2008 and 2009, during the dark days of the recession, magazines laid off employees left and right. Ad pages shrank, and, perhaps coincidentally, the brands that continued to advertise continued to be written about. Yet aspiring fashionistas, many of them unemployed millennials living with their parents, had plenty more to say. Blogger software was free and easy, so those young women turned to the Internet and started doing what magazines weren’t—mixing high and low brands and taking pictures that were rough and unexpected. Some bloggers developed loyal followings, and soon icons like Karl Lagerfeld, the white-ponytailed Werner Herzog of fashion, were greeting bloggers like Tavi Gevinson, a then fourteen-year-old from the Chicago suburbs, after their shows. In 2009 Dolce and Gabbana famously upset the runway’s feudal hierarchies when it sat Bryanboy, a Filipino blogger, just two seats away from Vogue’s Anna Wintour in the front row of the Milan spring-summer show.

by Francesca Mari, Texas Monthly |  Read more:
Image: Jonathan Zizzo

Monday, September 8, 2014

Outbreak Of Argyle. Evelyn Schless, Vogue, 1960s
via:

How to Get Into an Ivy League College—Guaranteed

The academic transcript looked like a rap sheet. The 16-year-old had dropped out of boarding schools in England and California because of behavioral problems and had only two semesters left at a small school in Utah. Somehow, he had to raise his grade-point average above a C before applying to college. His confidence was shot, and though his parents didn’t openly discuss it, he knew they were crushed at the thought that he might not get into a reputable college. What the boy didn’t know was that back home in Hong Kong, where his dad is chief executive officer of a big publicly traded investment company, the family was calling in a miracle worker.

Through a friend, his father reached out to Steven Ma, founder of ThinkTank Learning, a chain of San Francisco Bay Area tutoring centers that operate out of strip malls. Like many in the field, Ma helps kids apply to college. Unlike his competitors, Ma guarantees that his students will get into a top school or their parents get their money back—provided the applicant achieves a certain GPA and other metrics. He also offers a standard college consulting package that doesn’t come with a guarantee; for a lower price, Ma’s centers provide after-school tutoring, test prep, college counseling, and extra class work in English, math, science, and history.

Ma, a former hedge fund analyst, makes bets on student admissions the way a trader plays the commodities markets. Using 12 variables from a student’s profile—from grades and test scores to extracurricular activities and immigration status—Ma’s software crunches the odds of admission to a range of top-shelf colleges. His proprietary algorithm assigns varying weights to different parameters, derived from his analysis of the successes and failures of thousands of students he’s coached over the years. Ma’s algorithm, for example, predicts that a U.S.-born high school senior with a 3.8 GPA, an SAT score of 2,000 (out of 2,400), moderate leadership credentials, and 800 hours of extracurricular activities, has a 20.4 percent chance of admission to New York University and a 28.1 percent shot at the University of Southern California. Those odds determine the fee ThinkTank charges that student for its guaranteed consulting package: $25,931 to apply to NYU and $18,826 for USC. “Of course we set limits on who we’ll guarantee,” says Ma. “We don’t want to make this a casino game.”
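As an aside, the two examples Ma gives are mutually consistent in a simple way: $25,931 × 0.204 and $18,826 × 0.281 both work out to roughly $5,290, so the guaranteed fee looks, at least for these two schools, inversely proportional to the predicted chance of admission. The snippet below is a back-of-the-envelope sketch of that pricing rule inferred from those two figures; it is not ThinkTank's proprietary formula, and the constant is simply back-solved from the article's numbers.

# Back-of-the-envelope pricing rule inferred from the article's two examples
# (NYU: 20.4% chance -> $25,931; USC: 28.1% chance -> $18,826).
# Assumption: fee ~= K / p. This is NOT ThinkTank's actual algorithm.

K = (25_931 * 0.204 + 18_826 * 0.281) / 2  # ~5,290, implied by both examples

def guaranteed_fee(admission_probability: float) -> int:
    """Fee (USD) for the guaranteed package at a given predicted probability."""
    if not 0.0 < admission_probability <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return round(K / admission_probability)

print(guaranteed_fee(0.204))  # ~25,931 (the NYU example)
print(guaranteed_fee(0.281))  # ~18,826 (the USC example)

Priced this way, the guarantee works like an insurance premium: the longer the predicted odds, the more the family pays up front.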

Some 10,000 students—sixth graders to junior-college grads—use ThinkTank’s services now, generating annual revenue of more than $18 million. Nearly all are Asian immigrants like Ma, 36, who moved to Northern California from Taiwan when he was 11. He reels them in at free seminars, held in Holiday Inn ballrooms on Saturday afternoons. The standing-room-only events, advertised in Bay Area Chinese media, include a raffle of free SAT prep classes and a pep talk for the college-obsessed. Ma reassures the bewildered, multigenerational audiences that top-ranked American universities aren’t nearly as capricious as they seem, once you know their formula. ThinkTank boasts that 85 percent of its applicants get into a top-40 college, as ranked by U.S. News & World Report. “Our model knows more about how to get into many colleges than their own admissions officers know,” he says.

Ma also writes “custom contracts,” like the one he struck with the Hong Kong CEO for his wayward son in Utah. The father agreed to participate in this article, and authorized Ma to release their signed agreement, on the condition no family member was named. His son, he explains, doesn’t know how much he paid Ma; the dad worries the truth might hurt the boy’s self-esteem. He feels guilty that he didn’t spend more time with his son growing up. He was too busy running his business; hiring Ma, he says, was his compensation to his son. “They were desperate,” says Ma.

After signing an agreement in May 2012, the family wired Ma $700,000 over the next five months—before the boy had even applied to college. The contract set out incentives that would pay Ma as much as $1.1 million if the son got into the No. 1 school in U.S. News’ 2012 rankings. (Harvard and Princeton were tied at the time.) Ma would get nothing, however, if the boy achieved a 3.0 GPA and a 1600 SAT score and still wasn’t accepted at a top-100 college. For admission to a school ranked 81 to 100, Ma would get to keep $300,000; schools ranked 51 to 80 would let Ma hang on to $400,000; and for a top-50 admission, Ma’s payoff started at $600,000, climbing $10,000 for every rung up the ladder to No. 1.
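Read literally, the contract is a step function over the U.S. News rank of whichever school admits the boy. Here is one toy reading of that schedule; the tier boundaries and the per-rung arithmetic are my interpretation of the article's description, not the contract's actual text, and on this reading the No. 1 payout comes to $1.09 million, in line with the roughly $1.1 million ceiling cited above.

# One possible reading of the custom contract's payout schedule, per the
# article's description. Tier boundaries are interpretive, not contractual.

def payout(rank):
    """Ma's payout (USD) given the U.S. News rank of the admitting school.

    rank=None means no top-100 admission despite the student hitting the
    agreed 3.0 GPA / 1600 SAT benchmarks, i.e. a full refund.
    """
    if rank is None:
        return 0
    if 81 <= rank <= 100:
        return 300_000
    if 51 <= rank <= 80:
        return 400_000
    if 1 <= rank <= 50:
        # starts at $600,000 for a rank-50 school, +$10,000 per rung upward
        return 600_000 + (50 - rank) * 10_000
    raise ValueError("rank must be between 1 and 100, or None")

print(payout(None))  # 0
print(payout(90))    # 300000
print(payout(50))    # 600000
print(payout(1))     # 1090000 -- roughly the $1.1 million the article cites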

by Peter Waldman, Bloomberg Businessweek |  Read more:
Image: Damien Maloney for Bloomberg Businessweek

Home Depot Hacked


Home Depot has confirmed that it suffered a massive security breach that could impact “any customer that has used their payment card” at retail stores in the US and Canada since April. HomeDepot.com does not appear to have been affected.

The company said it is still investigating, but it doesn’t believe that PINs were compromised.

Security researcher Brian Krebs originally reported the issue after seeing a batch of credit cards go up for sale on the black market. According to Krebs, Home Depot was hit with the same malware that Target fell victim to last year.

In a statement, Home Depot reassured customers that they won’t be responsible for any fraudulent charges that stem from the theft. The company is also offering identity protection services for affected customers.

by Josh Ong, TNW |  Read more:
Image: Home Depot

Here's Everything We Expect Apple To Announce During Its Big Event This Week

Apple's first product launch event of the year is on Sept. 9.

It's going to be stacked.

Apple hasn't released a new product in 2014, save for some minor refreshes to a few of its MacBook models. Instead, it's packing all its announcements into the fall. And it all starts Tuesday.

Here's a quick breakdown of what to expect. (...)

While almost everything about the new iPhone has leaked, Apple's first new product under CEO Tim Cook largely remains a mystery.

Apple is expected to unveil its first wearable computing device, which the press has been calling the iWatch, on Tuesday. According to Brian X. Chen of The New York Times, the iWatch will come in two sizes and be packed with health- and fitness-monitoring sensors. It'll also have a flexible or curved display. Mark Gurman of 9to5Mac reported Saturday that developers will also be able to make apps for the device.

There's a lot of pressure on Apple to prove it can launch a successful new product category. A lot of Apple critics think the company can no longer innovate without Steve Jobs. This will be a big test for Cook. If the iWatch is a dud, then those critics will be right. If the iWatch is a success, Cook will prove that his Apple is just as innovative as Apple was under Jobs. Several companies have tried to make smartwatches over the last year or so, but they're all duds.

Although we'll learn a lot about the iWatch on Tuesday, Apple isn't expected to start selling the device until early 2015, according to John Paczkowski of Re/code.

by Steve Kovach, Business Insider |  Read more:
Image: Behance/Todd Hamilton

How Are American Families Doing? A Guided Tour of Our Financial Well-Being

How are we doing?

That is the question that reverberates in every report of the latest economic data. It’s the one that nags Americans as they head to the voting booth. It’s the question that sets our national mood. A new report provides the most exhaustive look at how Americans’ personal finances are faring — and sheds light on why the soaring stock market and occasionally giddy headlines have rarely translated into mass contentment with the economy.

Every three years, the Federal Reserve’s Survey of Consumer Finances interviews thousands of American families (6,026 for the newly published 2013 edition) about their income, savings, investments and debts. It is some of the richest information available about Americans’ financial lives, particularly in the 2010 to 2013 period of halting, inconsistent recovery from the Great Recession.

So how are we doing?

No recovery in incomes for most groups

The most basic measure of financial well-being is how much money people make and how much that money can buy. Many measures, such as per capita personal income, have risen in recent years, even after adjusting for inflation.


But this survey gives us a richer view of how incomes of people in different groups were affected. It is rather depressing.

Incomes rose nicely in the 2010 to 2013 time frame for the top 10 percent of earners (who had a median income of $230,000 last year). They rose slightly, by 0.7 percent, for the 80th to 90th percentile of earners (median of $122,000). But real incomes fell for every other group of earners.

Separate people by age or education, and the same basic pattern applies. Those with a college degree have done fine, but anything less than that and incomes have fallen. Both young adult households (those headed by someone under 35) and those households headed by someone over 75 have seen steep income declines in that same period.

This is the simplest yet most important fact to understand about the current economic recovery: It has not resulted in higher incomes for anyone other than those who were already doing well. And very large groups of Americans have experienced falling incomes.

by Neil Irwin, NY Times |  Read more:
Image: NY Times