Saturday, March 16, 2013

De Nimes: The Long Journey of Blue Jeans

Before we had low-rise, straight-leg, skinny, selvage, stretchy, resin-coated, lotion-infused, or mom jeans, there was simply jean—the fabric. The name likely originated from gênes, referring to Genoa, Italy, where sailors wore a twill blend of cotton, linen, and wool that came in a variety of stripes and colors.

Today’s jeans are made from heavier, all-cotton denim woven in a combination of indigo-dyed vertical yarn and natural horizontal yarn, resulting in the fabric’s white-speckled surface and pale underside. And although the original name for denim came from Nîmes, France—as in, de Nîmes—the fabric was most likely first produced in England.

Once the United States emancipated itself from British rule, the former colonists stopped importing European denim and began producing it themselves from all-American cotton, picked by slaves in the South and spun, dyed, and woven in the North. The Industrial Revolution was largely fueled by the textile trade, which almost singlehandedly upheld slavery. When the cotton gin mechanized processing in 1793, prices, already subsidized by slave labor, dropped dramatically. Cheap goods drove demand, and a vicious cycle ensued. In the period between the invention of the cotton gin and the Civil War, America’s slave population shot from 700,000 to a staggering 4 million.

After the Civil War, companies like Carhartt, Eloesser-Heynemann, and OshKosh slung cotton coveralls to miners, railroad men, and factory workers. A Bavarian immigrant named Levi Strauss set up shop in San Francisco selling fabric and work-wear. Jacob Davis, an entrepreneurial Reno tailor, bought Strauss’s denim to make workingman’s pants, and added metal rivets to prevent the seams from ripping open. Davis sent two samples of his riveted pants to Strauss, and they patented the innovation together. Soon after, Davis joined Strauss in San Francisco to oversee production in a new factory. In 1890, Strauss assigned the ID number of 501 to their riveted denim “waist overalls.” The Levi’s 501 blue jean—which would become the best-selling garment in human history—was born.

Initially, jeans were proletarian western work-wear, but wealthy easterners inevitably ventured out in search of rugged cowboy authenticity. In 1928, a Vogue writer returned East from a Wyoming dude ranch with a snapshot of herself, “impossibly attired in blue jeans… and a smile that couldn’t be found on all Manhattan Island.” In June 1935, the magazine ran an article titled “Dude Dressing,” possibly one of the first fashion pieces to instruct readers in the art of DIY denim distressing: “What she does is to hurry down to the ranch store and ask for a pair of blue jeans, which she secretly floats the ensuing night in a bathtub of water—the oftener a pair of jeans is laundered, the higher its value, especially if it shrinks to the ‘high-water’ mark. Another innovation—and a most recent one, if I may judge—also goes on in the dead of night, and undoubtedly behind locked doors—an intentional rip here and there in the back of the jeans.” (...)

And now, in the midst of the Great Recession, we have come full circle, with the fairly recent demand for nostalgic “heritage” jeans that recall the hardscrabble industrialism of the Great Depression: work shirts and overalls faded to shades of cornflower blue, and rough-hewn, no-nonsense, deep-dyed dungarees. Like their precursors from the 20s and 30s, these jeans seem imbued with a sad nostalgia for a bygone country (but maybe this time with a better fit). We’ve entered the Dorothea Lange era of fashion—clothed in flecked wool cardigans, formidable flannel shirts, and sturdy work boots, Depression-era from head to toe.

by Jenni Avins, Vice |  Read more:
Photo courtesy of Advertising Archives

Kerry Skarbakka - The Struggle to Right Oneself (2011)
via:

Reposts


[ed. Slow news day so far, so here are some reposts from March and April, 2011. Check out the archives.]

Sex, Hastily, Then Beignets
It's Not a Secret
Tako Poke (Octopus Salad)
Shoot
The Sacred Child
Fancy Meatloaf
Good Thing
Sniffing Out a Menace
Better Handy Than Handsome

Does the Pope Matter?

The next pope should be increasingly irrelevant, like the last two. The farther he floats up, away from the real religious life of Catholics, the more he will confirm his historical status as a monarch in a time when monarchs are no longer believable. Some people think it a new or even shocking thing that so many Catholics pay no attention to papal fulminations—against, for instance, female contraceptives, male vasectomies, condoms to prevent the spread of AIDS, women’s equality, gay rights, divorce, masturbation, and artificial insemination (because it involves masturbation). But it is the idea of truth descending through a narrow conduit, straight from God to the pope, that is a historical invention.

When Cardinal Ratzinger was asked, before he became Pope Benedict XVI, if he was disturbed that many Catholics ignored papal teaching, he said he was not, since “truth is not determined by a majority vote.” But that is precisely how the major doctrines like those on the Trinity, the Incarnation, the Resurrection were fixed in creeds: at councils like that of Nicaea, by the votes of hundreds of bishops, themselves chosen by the people, before popes had any monopoly on authority. Belief then rose up from the People of God, and was not pronounced by a single oracle. John Henry Newman, in On Consulting the Faithful in Matters of Doctrine (1859), argued that there had been periods when the body of believers had been truer to the faith than had the Church hierarchy. He was silenced for saying it, but his historical arguments were not refuted.

Catholics have had many bad popes whose teachings or acts they could or should ignore or defy. Orcagna painted one of them in hell; Dante assigned three to his Inferno; Lord Acton assured Prime Minister William Gladstone that Pius IX’s condemnation of democracy was not as bad as the papal massacres of Huguenots, which showed that “people could be very good Catholics and yet do without Rome”; and John Henry Newman hoped Pius IX would die during the first Vatican Council, before he could do more harm. Acton’s famous saying, “Power tends to corrupt, and absolute power corrupts absolutely,” was written to describe Renaissance popes.

With the election of a new pope, the press will repeat old myths—that Christ made Peter the first pope, and that there has been an “apostolic succession” of popes from his time. Scholars, including great Catholic ones like Raymond Brown and Joseph Fitzmyer, have long known that Peter was no pope. He was not even a priest or a bishop—offices that did not exist in the first century. And there is no apostolic succession, just the twists and tangles of interrupted, multiple, and contested office holders. It is a rope of sand. At the beginning of the fifteenth century, for instance, there were three popes, none of whom would resign. A new council had to be called to start all over. It appointed Martin V, on condition that he call frequent councils—a condition he evaded after he was in power.

by Garry Wills, NY Review of Books |  Read more:
Image: Peter Paul Rubens: Saint Peter

Friday, March 15, 2013


bouquet garni
via:

Food Truck Economics

Bobby Hossain’s day starts early. Along with his family, he runs a food truck called Phat Thai that serves his mother’s Thai recipes “with a modern twist.” Although he won’t be serving customers for nearly four hours, he wakes up by 7:30am. He is working a double shift in the truck (lunch and dinner), so his brother is on prep duty. Bobby buys any last-minute supplies they need (ice, more bean sprouts) from Restaurant Depot while his brother cuts vegetables and slices meat at the kitchen space they use in a friend’s restaurant. His brother then drives the truck to their parents’ house. They load up and Bobby is on the road at 9:30.

From 11am to 2pm they work at Mission Dispatch, a location in San Francisco’s Mission District that hosts food trucks. It brings in a dependable lunch crowd. Bobby’s mother cooks, his employee Frank takes orders, and Bobby hands out completed orders while helping the other two. After three hours, Phat Thai has served around 200 dishes.

Once the lunch crowd dies down, they return to the commissary, a space where they can clean dishes and dispose of garbage. Bobby checks whether he needs to get more supplies for tomorrow, preps, and then drives the truck to North Beach. From 5pm to 8pm they sell Thai dishes alongside other food trucks at a “market” organized by Off the Grid. On busy days, they won’t have a chance to eat lunch.

Although it’s only half as busy as lunch, Phat Thai sells dinner to a dense crowd of families and professionals returning from work. By the time they serve their last customer, clean up, and park the truck, it’s approaching midnight. It’s one of Bobby’s busiest days. He and his brother only do double shifts twice a week. Since dinner is less lucrative than lunch, on other days they finish selling food by 3pm and he can get home by 5pm.

Compared to the original American food trucks (a.k.a. roach coaches) that frequented construction sites and baseball stadiums, food trucks like Phat Thai are a different breed. Instead of cheap, greasy fare, they sell $10 dishes featuring organic ingredients and fusions of different regional cuisines. Since their emergence as a social media sensation in 2008, they have asserted themselves as a force in the food scene, employing celebrated chefs and inspiring countless food reviews.

As services that strip all the overhead costs of a restaurant down to the minimum requirements for selling food to customers, food trucks are also an irresistible metaphor for lean startups: the Silicon Valley practice of quickly rolling out a minimum viable product, allowing customers to try it, and engaging with them to improve your offering. Just as the falling cost of creating a website or app lowered the barriers to entry in the tech industry, food trucks allow aspiring restaurateurs to quickly put their creations in front of customers with minimal financial barriers.

But given that starting a restaurant is essentially a respectable way to throw money down a hole, are food trucks just mini money pits? What does it take to start and run a food truck? Why is a sandwich in a paper tray $10? Will food trucks disrupt the restaurant industry or is there a bubble?

by Alex Mayyasi, Priceonomics Blog | Read more:
Photo: uncredited

Frank's Photography on Flickr, Nanjing Road
via:

Paul Krugman Is Brilliant, but Is He Meta-Rational?

Nobel laureate, Princeton economics professor, and New York Times columnist Paul Krugman is a brilliant man. I am not so brilliant. So when Krugman makes strident claims about macroeconomics, a complex subject on which he has significantly more expertise than I do, should I just accept them? How should we evaluate the claims of people much smarter than ourselves?

A starting point for thinking about this question is the work of another Nobelist, Robert Aumann. In 1976, Aumann showed that under certain strong assumptions, disagreement on questions of fact is irrational. Suppose that Krugman and I have read all the same papers about macroeconomics, and we have access to all the same macroeconomic data. Suppose further that we agree that Krugman is smarter than I am. All it should take, according to Aumann, for our beliefs to converge is for us to exchange our views. If we have common “priors” and we are mutually aware of each other’s views, then if we do not agree ex post, at least one of us is being irrational.
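For readers who want the formal version of the claim being paraphrased here, Aumann’s 1976 result (“Agreeing to Disagree”) runs roughly as follows; this is my own compressed restatement in standard notation, not anything taken from Dourado’s article:

\[
\begin{aligned}
&\text{Let } (\Omega, P) \text{ be a state space with a common prior } P, \text{ and let } \mathcal{I}_1, \mathcal{I}_2 \text{ be the two agents' information partitions.}\\
&\text{If at state } \omega \text{ it is common knowledge that } P(A \mid \mathcal{I}_1)(\omega) = q_1 \text{ and } P(A \mid \mathcal{I}_2)(\omega) = q_2, \text{ then } q_1 = q_2.
\end{aligned}
\]

In plain terms: once both posteriors are themselves common knowledge, two honest Bayesians who started from the same prior cannot end up assigning different probabilities to the same event, however different their private evidence may be.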

It seems natural to conclude, given these facts, that if Krugman and I disagree, the fault lies with me. After all, he is much smarter than I am, so shouldn’t I converge much more to his view than he does to mine?

Not necessarily. One problem is that if I change my belief to match Krugman’s, I would still disagree with a lot of really smart people, including many people as smart as or possibly even smarter than Krugman. These people have read the same macroeconomics literature that Krugman and I have, and they have access to the same data. So the fact that they all disagree with each other on some margin suggests that very few of them behave according to the theory of disagreement. There must be some systematic problem with the beliefs of macroeconomists.

In their paper on disagreement, Tyler Cowen and Robin Hanson grapple with the problem of self-deception. Self-favoring priors, they note, can help to serve other functions besides arriving at the truth. People who “irrationally” believe in themselves are often more successful than those who do not. Because pursuit of the truth is often irrelevant in evolutionary competition, humans have an evolved tendency to hold self-favoring priors and self-deceive about the existence of these priors in ourselves, even though we frequently observe them in others.

Self-deception is in some ways a more serious problem than mere lack of intelligence. It is embarrassing to be caught in a logical contradiction, as a stupid person might be, because it is often impossible to deny. But when accused of disagreeing due to a self-favoring prior, such as having an inflated opinion of one’s own judgment, people can and do simply deny the accusation.

How can we best cope with the problem of self-deception? Cowen and Hanson argue that we should be on the lookout for people who are “meta-rational,” honest truth-seekers who choose opinions as if they understand the problem of disagreement and self-deception. According to the theory of disagreement, meta-rational people will not have disagreements among themselves caused by faith in their own superior knowledge or reasoning ability. The fact that disagreement remains widespread suggests that most people are not meta-rational, or—what seems less likely—that meta-rational people cannot distinguish one another.

by Eli Dourado, The Umlaut | Read more:
Photo: David Shankbone

The Problem with Tumblr and Photography

A little over ten years ago, when I started blogging about photography, most photoblogs were presenting a single photographer’s work, one photograph at a time, usually one per day. They were maintained by the photographers themselves. The scene was very small, and there was maybe a slightly naive earnestness about how it was done, which made following those blogs an appealing experience.

In the years since, for better or for worse, many of the ideas driving those early photoblogs have fallen by the wayside, with new formats and platforms replacing each other in a bewildering fashion. Many more photographers have come to embrace the web, in particular the social-networking bits.

Before looking at this in more detail, it might be worthwhile to point out that the internet seems made for photography. Photographs offer an immediacy that survives even under the most adverse, aka attention-deficit-disorder-plagued, circumstances.

Interestingly enough, Tumblr is nothing but a variant of the very early photoblogs on steroids. The basic format is the same: present usually one photograph (or another short snippet of information, like a video, an animated GIF, or a text) at a time. Following other Tumblrs then adds the steroids. While in the past one needed to visit one photoblog after another, Tumblr now offers a seemingly incessant stream of work, all in one place. What’s more, showcasing other people’s photographs appears to have overtaken showcasing one’s own.

This all sounds pretty great, except that there’s a multitude of problems, some of them well-known, others not so much. For starters, a large number of photographers are massively concerned about copyright. If everybody were to ask photographers for permission to showcase their work, Tumblr would grind to a halt in less time than it takes to say the word “copyright.”

This isn’t to say that concerns about copyright are invalid. But photographers worried about it might want to ask themselves what damage is done to their work (and income) if someone showcases their pictures to a possibly larger, possibly different audience, for noncommercial reasons. If a photographer is very concerned, a simple solution would be to not put photographs online. The moment they’re on the web, the medium’s properties kick in; the nature of the internet makes copyright violations incredibly simple. Then again, it also makes it easier to detect and go after those violations (as retailer DKNY just found out).

As far as I’m concerned, the bigger issue is the sloppy attribution of photographs, especially on Tumblr. Often, I find photographs where the source is not given at all, or where it is given in such a way that tracking down the photographer involves considerable work. This translates to a non-fair-use copyright violation, and unfortunately, many Tumblr users — often photographers themselves — are woefully uninformed or unconcerned about this.

by Jörg Colberg, Hyperallergic |  Read more:
Photo: Alec Soth

Thursday, March 14, 2013

The Planet Trillaphon


[ed. After posting a review this morning, I realized that I had never read DFW's The Planet Trillaphon as it Stands in Relation to the Bad Thing. Wonders of the internet, here it is (pdf).]

Gene Kloss (1903-1996) - Rain Cloud at Evening
via:

Elliott Erwitt, New York, 1946
via:

The Curse of “You May Also Like”


Of all the startups that launched last year, Fuzz is certainly one of the most intriguing and the most overlooked. Describing itself as a “people-powered radio” that is completely “robot-free,” Fuzz bucks the trend toward ever greater reliance on algorithms in discovering new music. Fuzz celebrates the role played by human DJs—regular users who are invited to upload their own music to the site in order to create and share their own “radio stations.”

The idea—or, perhaps, hope—behind Fuzz is that human curators can still deliver something that algorithms cannot; it aspires to be the opposite of Pandora, in which the algorithms do all the heavy lifting. As its founder, Jeff Yasuda, told Bloomberg News last September, “there’s a big need for a curated type of experience and just getting back to the belief that the most compelling recommendations come from a human being.”

But while Fuzz's launch attracted little attention, the growing role of algorithms in all stages of artistic production is becoming impossible to ignore. Most recently, this role was highlighted by Andrew Leonard, the technology critic for Salon, in an intriguing article about House of Cards, Netflix's first foray into original programming. The series' origin myth is by now well-known: Having studied its user logs, Netflix discovered that a remake of the British series of the same name could be a huge hit, especially if it also featured Kevin Spacey and was directed by David Fincher.

“Can the auteur survive in an age when computer algorithms are the ultimate focus group?” asked Leonard. He wondered how the massive amounts of data that Netflix has gathered while users were streaming the first season of the series—how many times did they click the pause button?—would affect future episodes.

Many other industries are facing similar questions. For example, Amazon, through its Kindle e-reader, collects vast troves of information about the reading habits of its users: what books they finish and what books they don't; what sections they tend to skip and which they read most diligently; how often they look up certain words in the dictionary and underline passages. (Amazon is hardly alone here: Other e-book players are just as guilty.)

Based on all these data, Amazon can predict the ingredients that will make you keep clicking to the very end of the book. Perhaps Amazon could even give you alternate endings—just to make you happier. As a recent paper on the future of entertainment puts it, ours is a world where “stories can become adaptive algorithms, creating a more engaging and interactive future.”

Just as Netflix has figured out that, given all its data, it would be stupid not to enter the filmmaking business, so has Amazon discovered that it would be stupid not to enter the publishing business. Amazon's knowledge, however, goes deeper than Netflix's: Since it also runs a site where we buy books, it knows everything there is to know about our buying behavior and the prices that we are willing to pay. Today Amazon runs half a dozen publishing imprints and plans to add more.

by Evgeny Morozov, Slate | Read more:
Photo by Beck Diefenbach/Reuters

Boredom vs. Depression

David Foster Wallace walked into great literature, as Trotsky said of Céline, the way other people walk into their homes. From the publication of his undergraduate fiction thesis, The Broom of the System (1987), to the unfinished manuscript he left after his suicide in 2008, The Pale King, Wallace’s life was an object of interest for even the most inert cultural bystanders. His cockiness, insecurity, ambition, anthropological precision and meticulous avoidance of the ordinary sentence – all of this won Wallace the double-edged honour of being regularly proclaimed “the voice of his generation”. For Americans who came of age in the 1990s and worried whether their times would produce a writer of the same cultural heft as the giants of the post-war decades, Wallace’s battleship of a book, Infinite Jest (1996), and his flotilla of stories and essays arrived just in time. Now, in lock step with the worthies he once called “The Great Male Narcissists” – John Updike, Norman Mailer, Philip Roth – Wallace has a biography, a hallowed archive, and a swooning field of “Wallace studies”. (...)

Wallace came into his own as a writer at Amherst College in Massachusetts in the 1980s, where he arrived as an interloper from Illinois among well-heeled preppy peers. “Midwestern boys might teach or read or make ironic fun of novels”, writes Max in one of his bizarre asides about the heartland, “but they did not go to college to learn how to write them.” Fiction on campus, Wallace would claim, was the province of “foppish aesthetes”, who “went around in berets stroking their chins”. Max’s portrait of these years is of a student getting the top marks, lest anyone mistake him for not being the cleverest boy in the room. (When, years later, the film Good Will Hunting came out, Wallace not only seemed to identify with Matt Damon’s character, but actually tried to follow the blurry equations on the chalkboard.) Max describes a regime that reserved forty-five minutes for dental hygiene, afternoon bong hits, six-hour bouts with the books, and evening whisky shots on the library steps. We get good glimpses of Wallace’s table-talk: “Does anyone want to see Friedrich Hayek get hit on by a girl from Wilton, Connecticut?”. Wallace’s fanbase in future years would consist of fellow liberal arts graduates who saw his work as an opportunity to exercise their education while savouring the pop-cultural references in his prose. But it was Wallace’s style itself, at once laid back and hilariously precise, that seduced a generation. Take this classic passage where Wallace pre-emptively mourns the etiquette of the old-school telephone call:

“A traditional aural-only conversation – utilizing a hand-held phone whose earpiece contained only 6 little pinholes but whose mouthpiece (rather significantly, it later seemed) contained (6²) or 36 little pinholes – let you enter into a kind of highway-hypnotic semi-attentive fugue: while conversing, you could look around the room, doodle, fine-groom, peel tiny bits of dead skin away from your cuticles, compose phone-pad haiku, stir things on the stove; you could even carry on a whole separate additional sign-language-and-exaggerated-facial-expression type of conversation with people right there in the room with you, all while seeming to be right there attending closely to the voice on the phone. And yet – this was the retrospectively marvelous part – even if you were dividing your attention between the phone call and all sorts of other little fugue-like activities, you were somehow never haunted by the suspicion that the person on the other end’s attention might be similarly divided. During a traditional call, e.g., as you let’s say performed a close tactile blemish-scan of your chin, you were in no way oppressed by the thought that your phonemate was perhaps also devoting a good percentage of her attention to a close tactile blemish-scan.”

This is a snippet of a much larger passage, but it’s cherishable, not only for the way it mimics the fleetingness of our attention spans, but also for the truth it delivers about our socially repugnant self-centredness. The huge interference and distracting pleasures that we conspire to build between us would become one of Wallace’s great subjects.

Wallace’s struggle with depression is one of the main points of orientation for Max’s biography, and it’s the most valuable contribution of the book. Twice during college, Wallace was forced to leave school and return home to Champaign, Illinois, where he tried to ride out his illness on a drug called Tofranil. It’s no exaggeration to say depression was one of Wallace’s reasons for writing fiction in the first place. In “The Planet Trillaphon as it Stands in Relation to the Bad Thing”, his first published story in the Amherst Review, Wallace enters the mind of a Brown undergraduate who goes on anti-depressants after trying to kill himself. The power of the story lies in Wallace’s ability to convey what the “Bad Thing” feels like from the inside. The story begins, as Max notes, with a seemingly loose Salingeresque introduction:

“I’ve been on antidepressants for, what, about a year now, and I suppose I feel as if I’m pretty qualified to tell what they’re like. They’re fine, really, but they’re fine in the same way that, say, living on another planet that was warm and comfortable and had food and fresh water would be fine: it would be fine, but it wouldn’t be good old Earth, obviously. I haven’t been on Earth now for almost a year, because I wasn’t doing very well on Earth. I’ve been doing somewhat better here where I am now, on the planet Trillaphon, which I suppose is good news for everyone involved.”

The repetitions and played-up quaintness here give the sense of a consciousness that has been lulled into congeniality. But as the story unfolds, and the imprecisions come into focus, the narrator comes to see that depression is not “just sort of really intense sadness, like what you feel when your very good dog dies, or when Bambi’s mother gets killed in Bambi”. Rather, it’s a kind of auto-immune deficiency of the self:

“All this business about people committing suicide when they’re ‘severely depressed;’ we say, ‘Holy cow, we must do something to stop them from killing themselves!’ That’s wrong. Because all these people have, you see, by this time already killed themselves, where it really counts. By the time these people swallow entire medicine cabinets or take naps in the garage or whatever, they’ve already been killing themselves for ever so long. When they ‘commit suicide,’ they’re just being orderly.”

by Thomas Meaney, TLS | Read more: 
Photo: David Foster Wallace, 1996 © Garry Hannabarger/Corbis
h/t 3 Quarks Daily