Sunday, September 8, 2013


miron mihei, what i did this summer
via:

Logo, Bullshit & Co., Inc.


[ed. For a variety of reasons a lot of people seem pretty exercised about the new Yahoo logo.]

Anybody can make a logo. No doubt. It’s not complicated. Just try a couple of fonts and colors, choose the one you like, then change the font a little so it becomes special. Make it look nice. Blog about it, showing those magic construction lines. You can do it. All it needs is a little time, a computer, someone that knows how to use Illustrator, and taste, maybe. Everybody has taste, right? So let’s do it!

So thought Yahoo’s CEO Marissa Mayer, and she went and did it. How did it turn out?

The Weekend

For the last month Yahoo has showcased its logo in a different font each day. The motivation was unclear: did the company want feedback during the decision process, or was it just trying to get attention from the design community? The new logo Yahoo unveiled at the end of the month was neither better nor worse than the daily variations, and it appears to have been chosen before the showcase even began. After unveiling that empty secret, Marissa Mayer wrote a blog post about the process:
We hadn’t updated our logo in 18 years. Our brand, as represented by the logo, has been valued at as much as ~$10 billion dollars. So, while it was time for a change, it’s not something we could do lightly.
Whether Yahoo needs a change in brand identity is hardly something we can decide from the outside, not knowing exactly what the overall brand strategy is. It seems legit, because the Yahoo brand currently feels dead. Yahoo is still a massive online property, but it is as boring as it is big. Changing brand identity when you change strategy makes sense. So, even though it’s misleading to claim that the Yahoo logo hasn’t changed in 18 years, deciding when it’s time for a change is the CEO’s call.
On a personal level, I love brands, logos, color, design, and, most of all, Adobe Illustrator. I think it’s one of the most incredible software packages ever made. I’m not a pro, but I know enough to be dangerous :)
There is nothing wrong with loving brands or branding as such (I do), logos in general (some do), colors (who doesn’t?) or a particular software program (okay, that’s a little weird). And it’s okay if you are not a pro at everything. But if, as the CEO, you work on a $10 Billion Dollar core brand identity, and you hack it out in a weekend, you are not being professional.
So, one weekend this summer, I rolled up my sleeves and dove into the trenches with our logo design team: Bob Stohrer, Marc DeBartolomeis, Russ Khaydarov, and our intern Max Ma. We spent the majority of Saturday and Sunday designing the logo from start to finish, and we had a ton of fun weighing every minute detail.
Let us assume that at Yahoo the logo design team (including the intern) comprises the best designers in the field. It is conceivable that, with some luck, this dream team can design a logo “from start to finish” over a weekend. It sure is fun “weighing every minute detail” with a team of outstanding professionals. And what is more efficient than working directly with the CEO on the brand identity? A dream setup. Also, it’s cheap. A weekend for a logo, instead of paying a branding agency millions and waiting months for something that can be done in a couple of days? That’s smart business!

Is it?

by Oliver Reichenstein, iA |  Read more:
Images via: and via:

Saturday, September 7, 2013

We Post Nothing About Our Daughter Online

[ed. Not sure what's most remarkable here: the paranoia, the OCD or the "helicoptering". In any case, it's a good example of the degree to which some people think they can control their digital lives and technology (...want to make some headway? Start by getting off Facebook!) Anyway, read the comments for a good laugh.]

I vividly remember the Facebook post. It was my friend’s 5-year-old daughter “Kate,” (a pseudonym) standing outside of her house in a bright yellow bikini, the street address clearly visible behind her on the front door. A caption read “Leaving for our annual Labor Day weekend at the beach,” and beneath it were more than 50 likes and comments from friends—including many “friends” that Kate’s mom barely knew.

The picture had been uploaded to a Facebook album, and there were 114 shots just of Kate: freshly cleaned and swaddled on the day of her birth … giving her Labradoodle a kiss … playing on a swing set. But there were also photos of her in a bathtub and an awkward moment posing in her mother’s lacy pink bra.

I completely understood her parents’ desire to capture Kate’s everyday moments, because early childhood is so ephemeral. But I also knew how those posts would affect Kate as an adult, and I understood the broader impact of creating a generation of kids born into original digital sin.

Last week, Facebook updated its privacy policy again. It reads in part: “We are able to suggest that your friend tag you in a picture by scanning and comparing your friend’s pictures to information we’ve put together from your profile pictures and the other photos in which you’ve been tagged.” Essentially, this means that with each photo upload, Kate’s parents are, unwittingly, helping Facebook to merge her digital and real worlds. Algorithms will analyze the people around Kate, the references made to them in posts, and over time will determine Kate’s most likely inner circle. (...)
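To make the inner-circle idea above concrete, here is a toy sketch in Python. It is not Facebook’s actual system; the photo data, the names, and the simple co-tagging heuristic are all assumptions for illustration.

```python
# Toy illustration (not Facebook's actual algorithm) of the co-tagging analysis
# described above: rank the people who most often appear in photos with a
# given person to guess her likely inner circle. All data here is made up.
from collections import Counter

# Hypothetical tag data: each set is the group of people tagged in one photo.
photos = [
    {"Kate", "Mom", "Dad"},
    {"Kate", "Mom", "Aunt Sue"},
    {"Kate", "Dad"},
    {"Kate", "Mom", "Dad", "Grandma"},
]

def likely_inner_circle(person, tagged_photos, top_n=3):
    """Count how often every other person is co-tagged with `person`."""
    counts = Counter()
    for tags in tagged_photos:
        if person in tags:
            counts.update(tags - {person})
    return counts.most_common(top_n)

print(likely_inner_circle("Kate", photos))
# e.g. [('Mom', 3), ('Dad', 3), ('Aunt Sue', 1)]
```

With millions of uploads instead of four, the same counting idea, combined with signals like comments and mentions, sketches a fairly detailed social graph over time.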

There’s a more insidious problem, though, which will haunt Kate well into adulthood. Myriad applications, websites, and wearable technologies are relying on face recognition today, and ubiquitous bio-identification is only just getting started. In 2011, a group of hackers built an app that let you scan faces and immediately display their names and basic biographical details, right there on your mobile phone. Already developers have made a working facial recognition API for Google Glass. While Google has forbidden official facial recognition apps, it can’t prevent unofficial apps from launching. There’s huge value in gaining real-time access to detailed information about the people with whom we interact.

The easiest way to opt out is to not create that digital content in the first place, especially for kids. Kate’s parents haven’t just uploaded one or two photos of her: They’ve created a trove of data that will enable algorithms to learn about her over time. Any hopes Kate may have had for true anonymity ended with that ballet class YouTube channel.

Knowing what we do about how digital content and data are being cataloged, my husband and I made an important choice before our daughter was born. We decided that we would never post any photos or other personally identifying information about her online. Instead, we created a digital trust fund.

The process started in earnest as we were selecting her name. We’d narrowed the list down to a few alternatives and ran each (and their variants) through domain and keyword searches to see what was available. Next, we crawled through Google to see what content had been posted with those name combinations, and we also looked to see if a Gmail address was open.

We turned to KnowEm.com, a website I often rely on to search for usernames, even though the site is primarily intended as a brand registration service. We certainly had a front-runner for her name, but we would have chosen something different if the KnowEm results produced limited availability or if we found negative content associated with our selection.
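For readers curious what such an availability sweep looks like in practice, here is a minimal sketch. It is not the author’s actual tooling; the candidate name, the profile-URL patterns, and the “404 means available” heuristic are assumptions, and services like KnowEm.com run far more thorough checks.

```python
# Minimal sketch (assumed, not the author's actual process) of the kind of
# name-availability sweep described above: a DNS lookup as a rough proxy for
# domain registration, and an HTTP status check on public profile URLs as a
# rough proxy for username availability.
import socket
import requests  # pip install requests

CANDIDATE = "examplebabyname"  # hypothetical name candidate

def domain_probably_taken(name, tld=".com"):
    """If the domain resolves in DNS, treat it as taken.
    (A registered but unparked domain may still fail to resolve.)"""
    try:
        socket.gethostbyname(name + tld)
        return True
    except socket.gaierror:
        return False

def username_probably_taken(profile_url):
    """Treat an HTTP 404 on the public profile page as 'still available'."""
    try:
        resp = requests.get(profile_url, timeout=10)
        return resp.status_code != 404
    except requests.RequestException:
        return True  # be conservative if the check itself fails

if __name__ == "__main__":
    print(f"{CANDIDATE}.com taken:", domain_probably_taken(CANDIDATE))
    for url in (f"https://twitter.com/{CANDIDATE}",
                f"https://github.com/{CANDIDATE}"):
        print(url, "taken:", username_probably_taken(url))
```

Heuristics like these only approximate what a dedicated lookup service reports, which is presumably why the author leaned on KnowEm.com rather than rolling her own checks.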

With her name decided, we spent several hours registering her URL and a vast array of social media sites. All of that tied back to a single email account, which would act as a primary access key. We listed my permanent email address as a secondary—just as you’d fill out financial paperwork for a minor at a bank. We built a password management system for her to store all of her login information.

On the day of her birth, our daughter already had accounts at Facebook, Twitter, Instagram, and even Github. And to this day, we’ve never posted any content.

by Amy Webb, Slate |  Read more:
Image: Hemera/Thinkstock

Eggplants in a Basket, Japanese (woodblock print) late Edo Period; Ukiyo-e; surimono
via:

Red Hat Paris. Mitchell Funk
via:

They’re Taking Over!


It’s become fashionable to keep jellyfish in aquariums. Behind glass they can be hypnotically beautiful and immensely relaxing to watch. Unless we are enjoying them in this way, we usually give little thought to the creatures until we are stung by one. Jellyfish stings are often not much more than a painful interlude in a seaside holiday—unless you happen to live in northern Australia. There, you might be stung by the most venomous creature on Earth: the box jellyfish, Chironex fleckeri.

Box jellyfish have bells (the disc-shaped “head”) around a foot across, behind which trail up to 550 feet of tentacles. It’s the tentacles that contain the stinging cells, and if just six yards of tentacle contact your skin, you have, on average, four minutes to live—though you might die in just two. Seventy-six fatalities have been recorded in Australia since 1884, and many more may have gone misdiagnosed or unreported.

In 2000 a somewhat less venomous species of box jellyfish, which lives further south, threatened the Sydney Olympics. It began swarming at the exact location scheduled for the aquatic leg of the triathlon events. The Olympic Committee considered many options, including literally sweeping the course free of the menace, but all were deemed impractical. Then, around a week before the opening ceremony, the jellyfish vanished as mysteriously as they had appeared.

Most jellyfish are little more than gelatinous bags containing digestive organs and gonads, drifting at the whim of the current. But box jellyfish are different. They are active hunters of medium-sized fish and crustaceans, and can move at up to twenty-one feet per minute. They are also the only jellyfish with eyes that are quite sophisticated, containing retinas, corneas, and lenses. And they have brains, which are capable of learning, memory, and guiding complex behaviors. (...)

The box jellies and Irukandjis are merely the most exotic of a group of organisms that have existed for as long as complex life itself. In Stung! On Jellyfish Blooms and the Future of the Ocean, biologist Lisa-ann Gershwin argues that after half a billion years of quiescence, they’re on the move:
If I offered evidence that jellyfish are displacing penguins in Antarctica—not someday, but now, today—what would you think? If I suggested that jellyfish could crash the world’s fisheries, outcompete the tuna and swordfish, and starve the whales to extinction, would you believe me?
Jellyfish are among the oldest animal fossils ever found. Prior to around 550 million years ago, when a great diversity of marine life sprang into existence, jellyfish may have had the open oceans pretty much to themselves. Today they must share the briny deep with myriad creatures, and with machines. It’s not just the wildlife they’re worrying. In November 2009 a net full of gigantic jellyfish, the largest of which weighed over 450 pounds, capsized a Japanese trawler, throwing the three-man crew into the ocean. But even mightier vessels have been vanquished by jellyfish. (...)

From the Arctic to the equator and on to the Antarctic, jellyfish plagues (or blooms, as they’re technically known) are on the increase. Even sober scientists are now talking of the jellification of the oceans. And the term is more than a mere turn of phrase. Off southern Africa, jellyfish have become so abundant that they have formed a sort of curtain of death, “a stingy-slimy killing field,” as Gershwin puts it, that covers over 30,000 square miles. The curtain is formed of jelly extruded by the creatures, and it includes stinging cells. The region once supported a fabulously rich fishery yielding a million tons of fish annually, mainly anchovies. In 2006 the total fish biomass was estimated at just 3.9 million tons, while the jellyfish biomass was 13 million tons. So great is their density that jellyfish are now blocking vacuum pumps used by local diamond miners to suck up sediments from the sea floor.

by Tim Flannery, NY Review of Books |  Read more:
Image: David Hall

All LinkedIn with Nowhere to Go

In a jobs economy that has become something of a grim joke, nothing seems quite so bleak as the digital job seeker’s all-but-obligatory LinkedIn account. In the decade since the site launched publicly with a mission “to connect the world’s professionals to make them more productive and successful,” the glorified résumé-distribution service has become an essential stop for the professionally dissatisfied masses. The networking site burrows its way into users’ inboxes with updates spinning the gossamer dream of successful and frictionless advancement up the career ladder. Just add one crucial contact who’s only a few degrees removed from you (users are the perpetual Kevin Bacons in this party game), or update your skill set in a more market-friendly fashion, and one of the site’s 187 million or so users will pluck you from a stalled career and offer professional redemption. LinkedIn promises to harness everything that’s great about a digital economy that so far has done more to limit than expand the professional prospects of its user-citizens.

In reality, though, the job seeker tends to experience the insular world of LinkedIn connectivity as an irksome ritual of digital badgering. Instead of facing the prospect of interfacing professionally with a nine-figure user base with a renewed spring in their step, harried victims of economic redundancy are more likely to greet their latest LinkedIn updates with a muttered variation of, “Oh shit, I’d better send out some more résumés.” At which point, they’ll typically mark the noisome email nudge as “read” and relegate it to the trash folder.

Which is why it’s always been a little tough to figure out what LinkedIn is for. The site’s initial appeal was as a sort of self-updating Rolodex—a way to keep track of ex-coworkers and friends-of-friends you met at networking happy hours. There’s the appearance of openness—you can “connect” with anyone!—but when users try to add a professional contact from whom they’re more than one degree removed, a warning pops up. “Connecting to someone on LinkedIn implies that you know them well,” the site chides, as though you’re a stalker in the making. It asks you to indicate how you know this person. Former coworker? Former classmate? Fine. “LinkedIn lets you invite colleagues, classmates, friends and business partners without entering their email addresses,” the site says. “However, recipients can indicate that they don’t know you. If they do, you’ll be asked to enter an email address with each future invitation.”

You can try to lie your way through this firewall by indicating you’ve worked with someone when you haven’t—the equivalent of name-dropping someone you’ve only read about in management magazines. But odds are, you’ll be found out. I’d been confused, for instance, about numerous LinkedIn requests from publicists saying we’d “worked together” at a particular magazine. But when I clicked through to their profiles, I realized why they’d confidently asserted this professional alliance into being: the way to get to the next rung is to pretend you’re already there. If you don’t already know the person you’re trying to meet, you’re pretty much out of luck.

This frenetic networking-by-vague-association has bred a mordant skepticism among some users of the site. Scott Monty, head of social media for the Ford Motor Company, includes a disclaimer in the first line of his LinkedIn bio that, in any other context, would be a hilarious redundancy: “Note: I make connections only with people whom I have met.” It’s an Escher staircase masquerading as a career ladder.

On one level, of course, this world of aspirational business affiliation is nothing new. LinkedIn merely digitizes the core, and frequently cruel, paradox of networking events and conferences. You show up at such gatherings because you want to know more important people in your line of work—but the only people mingling are those who, like you, don’t seem to know anyone important. You just end up talking to the sad sacks you already know. From this crushing realization, the paradoxes multiply on up through the social food chain: those who are at the top of the field are at this event only to entice paying attendees, soak up the speaking fees, and slip out the back door after politely declining the modest swag bag. They’re not standing around on garish hotel ballroom carpet with a plastic cup of cheap chardonnay in one hand and a stack of business cards in the other.

by Ann Friedman, The Baffler |  Read more:
Image: J.D. King

Gut Bacteria From Thin Humans Can Slim Mice Down

[ed. The microbiome -- exciting new frontier in medical diagnosis and treatment.]

The trillions of bacteria that live in the gut — helping digest foods, making some vitamins, making amino acids — may help determine if a person is fat or thin.


The evidence is from a novel experiment involving mice and humans that is part of a growing fascination with gut bacteria and their role in health and diseases like irritable bowel syndrome and Crohn’s disease. In this case, the focus was on obesity. Researchers found pairs of human twins in which one was obese and the other lean. They transferred gut bacteria from these twins into mice and watched what happened. The mice with bacteria from fat twins grew fat; those that got bacteria from lean twins stayed lean.

The study, published online Thursday by the journal Science, is “pretty striking,” said Dr. Jeffrey S. Flier, an obesity researcher and the dean of the Harvard Medical School, who was not involved with the study. “It’s a very powerful set of experiments.”

Michael Fischbach of the University of California, San Francisco, who also was not involved with the study, called it “the clearest evidence to date that gut bacteria can help cause obesity.”

“I’m very excited about this,” he added, saying the next step will be to try using gut bacteria to treat obesity by transplanting feces from thin people.

“I have little doubt that that will be the next thing that happens,” Dr. Fischbach said.

But Dr. Flier said it was far too soon for that.

“This is not a study that says humans will have a different body weight” if they get a fecal transplant, he said. “This is a scientific advance,” he added, but many questions remain.

Dr. Jeffrey I. Gordon of Washington University in St. Louis, the senior investigator for the study, also urged caution. He wants to figure out which bacteria are responsible for the effect so that, eventually, people can be given pure mixtures of bacteria instead of feces. Or, even better, learn what the bacteria produce that induces thinness and give that as a treatment.

While gut bacteria are a new hot topic in medicine, he added that human biology is complex and that obesity in particular has many contributors, including genetics and diet.

In fact, the part of the study that most surprised other experts was an experiment indicating that, with the right diet, it might be possible to change the bacteria in a fat person’s gut so that they promote leanness rather than obesity. The investigators discovered that given a chance, and in the presence of a low-fat diet, bacteria from a lean twin will take over the gut of a mouse that already had bacteria from a fat twin. The fat mouse then loses weight. But the opposite does not happen. No matter what the diet, bacteria from a fat mouse do not take over in a mouse that is thin.

by Gina Kolata, NY Times |  Read more:
Image: Dan Gill

Friday, September 6, 2013


Billy Gibbons
via:

Underwater photography by Zena Holloway
via:

Can the White Girl Twerk?

There’s a famous scene in the 2004 Wayans brothers’ comedy White Chicks: The opening bars of piano introduce Vanessa Carlton’s “A Thousand Miles” and the car full of white girls squeals in delight before launching into the cloyingly earnest lyrics. Later a black man sings the song and, that’s it, that’s the whole joke.

Like the late aughts’ “hipster,” “white girl” is a label applied either dismissively or self-consciously. The tastes, habits, and concerns of the white girl, like those of the hipster, are often punch lines used as self-evident definitions for the label. Like a hipster’s, the white girl’s class status goes without saying—there is no Twitter account for PoorWhiteGirlProblems.

Historically, white girlhood stood for the preservation of whiteness. Not just reproductively but as future missionaries, schoolteachers, moral custodians of the dark frontier—Columbia leading the way. Today the symbolic potency of white femininity is shifting.

Only outprivileged by white men, the white girl’s assumed universality lets us project onto “white girl” our attitudes about race, gender, class, and the behavior appropriate within those parameters. The girlhood implied by the label is central to understanding how it regulates not only white girls’ behavior but everyone else’s too.

The straight American “white girl” serves as the normative gender performance, the femininity from which all femininity deviates, through which all women of color are otherized. As the default, heteronormative white femininity must provide the ultimate foil to patriarchal masculinity. The “white girl” is vulnerable, trivial, and self-involved. Above all she is mainstream, either by consumer habits or design. Any resemblance to real-life white girls doesn’t matter; all exceptions are exempt from consideration. For every witchy, androgynous Rooney Mara, there’s a Taylor Swift, a Zooey Deschanel, and a Miley Cyrus. At least, there used to be a Miley Cyrus.

Her loyalty to the white girlhood she was born into via Hannah Montana is under scrutiny. No longer confined to a Disney contract, she dresses in cropped shirts, leather bras, and bondage-inspired Versace. She’s taken cues from Rihanna and hip-hop culture at large and added gold chains, even a grill. Sixteen-year-old Miley had never heard a Jay Z song (despite the name-check in her hit single “Party in the USA”). Twenty-year-old Miley tweets screengrabs of her iPhone, boasting songs from Gucci Mane, French Montana, and Juicy J. She’s recorded with the latter two.

It would be unfair to demand Miley remain faithful to her teenage aesthetic when no self-aware person does. And it would take a dull palette to assume she couldn’t sincerely recognize the appeal of rap music and gold accessories. Her sincerity, however, is irrelevant. Charges of cultural appropriation and the rampant slut shaming she now faces draw a narrow lens to her actions. In truth, Miley exemplifies the white impulse to shake the stigma its mainstream status affords while simultaneously exercising the power of whiteness to define blackness.

She ties a bandana across her forehead like Tupac, or struggle-twerks—her ever-present tongue lolling out in challenge as she looks back at us. Each time it’s a statement declaring this is cool because it’s atypical, and it’s atypical because according to her, it’s black. Miley’s look exists because racial drag carries cachet in cultures that commodify difference.

For all its black performers, the rap industry has been run by the white establishment and caters to the white consumer. The commercial success of gangsta rap wouldn’t be possible without North America’s largest demographic buying in. The commercial demand for sexually aggressive and violent rap is appreciably shaped by white teens in the suburbs looking to live out their fantasies via imagined black bodies. And in guiding the market, white consumers dictate the available imagery of blackness.

by Ayesha Saddiqui, TNI |  Read more:

The Robin Hood Tax



the pussycat stage by monstermagnet (Eren Kürkcüoğlu)
via:

Studio Drift: Dandelight
via:

Little Dragon