Tuesday, February 14, 2017

E Unibus Pluram: Television and U.S. Fiction

Act Natural

Fiction writers as a species tend to be oglers. They tend to lurk and to stare. The minute fiction writers stop moving, they start lurking, and stare. They are born watchers. They are viewers. They are the ones on the subway about whose nonchalant stare there is something creepy, somehow. Almost predatory. This is because human situations are writers' food. Fiction writers watch other humans sort of the way gapers slow down for car wrecks: they covet a vision of themselves as witnesses.

But fiction writers as a species also tend to be terribly self-conscious. Even by U.S. standards. Devoting lots of productive time to studying closely how people come across to them, fiction writers also spend lots of less productive time wondering nervously how they come across to other people. How they appear, how they seem, whether their shirttail might be hanging out their fly, whether there's maybe lipstick on their teeth, whether the people they're ogling can maybe size them up as somehow creepy, lurkers and starers.

The result is that a surprising majority of fiction writers, born watchers, tend to dislike being objects of people's attention. Being watched. The exceptions to this rule - Mailer, McInerney, Janowitz - create the misleading impression that lots of belles-lettres types like people's attention. Most don't. The few who like attention just naturally get more attention. The rest of us get less, and ogle.

Most of the fiction writers I know are Americans under forty. I don't know whether fiction writers under forty watch more television than other American species. Statisticians report that television is watched over six hours a day in the average American household. I don't know any fiction writers who live in average American households. I suspect Louise Erdrich might. Actually I have never seen an average American household. Except on TV.

So right away you can see a couple of things that look potentially great, for U.S. fiction writers, about U.S. television. First, television does a lot of our predatory human research for us. American human beings are a slippery and protean bunch, in real life, as hard to get any kind of univocal handle on as a literary territory that's gone from Darwinianly naturalistic to cybernetically post-postmodern in eighty years. But television comes equipped with just such a syncretic handle. If we want to know what American normality is - what Americans want to regard as normal - we can trust television. For television's whole raison is reflecting what people want to see. It's a mirror. Not the Stendhalian mirror reflecting the blue sky and mud puddle. More like the overlit bathroom mirror before which the teenager monitors his biceps and determines his better profile. This kind of window on nervous American self-perception is just invaluable, fictionwise. And writers can have faith in television. There is a lot of money at stake, after all; and television retains the best demographers applied social science has to offer, and these researchers can determine precisely what Americans in 1990 are, want, see: what we as Audience want to see ourselves as. Television, from the surface on down, is about desire. Fictionally speaking, desire is the sugar in human food.

The second great thing is that television looks to be an absolute godsend for a human subspecies that loves to watch people but hates to be watched itself. For the television screen affords access only one way. A psychic ball-check valve. We can see Them; They can't see Us. We can relax, unobserved, as we ogle. I happen to believe this is why television also appeals so much to lonely people. To voluntary shut-ins. Every lonely human I know watches way more than the average U.S. six hours a day. The lonely, like the fictional, love one-way watching. For lonely people are usually lonely not because of hideous deformity or odor or obnoxiousness - in fact there exist today social and support groups for persons with precisely these features. Lonely people tend rather to be lonely because they decline to bear the emotional costs associated with being around other humans. They are allergic to people. People affect them too strongly. Let's call the average U.S. lonely person Joe Briefcase. Joe Briefcase just loathes the strain of the self-consciousness which so oddly seems to appear only when other real human beings are around, staring, their human sense-antennae abristle. Joe B. fears how he might appear to watchers. He sits out the stressful U.S. game of appearance poker.

But lonely people, home, alone, still crave sights and scenes. Hence television. Joe can stare at Them, on the screen; They remain blind to Joe. It's almost like voyeurism. I happen to know lonely people who regard television as a veritable deus ex machina for voyeurs. And a lot of the criticism, the really rabid criticism less leveled than sprayed at networks, advertisers, and audiences alike, has to do with the charge that television has turned us into a nation of sweaty, slack-jawed voyeurs. This charge turns out to be untrue, but for weird reasons.

What classic voyeurism is is espial: watching people who don't know you're there as they go about the mundane but erotically charged little businesses of private life. It's interesting that so much classic voyeurism involves media of framed glass - windows, telescopes, etc. Maybe the framed glass is why the analogy to television is so tempting. But TV-watching is a different animal from Peeping Tourism. Because the people we're watching through TV's framed-glass screen are not really ignorant of the fact that somebody is watching them. In fact a whole lot of somebodies. In fact the people on television know that it is in virtue of this truly huge crowd of ogling somebodies that they are on the screen, engaging in broad non-mundane gestures, at all. Television does not afford true espial because television is performance, spectacle, which by definition requires watchers. We're not voyeurs here at all. We're just viewers. We are the Audience, megametrically many, though most often we watch alone. E unibus pluram. (...)

Not that realities about actors and phosphenes and furniture are unknown to us. We simply choose to ignore them. For six hours a day. They are part of the belief we suspend. But we're asked to hoist such a heavy load aloft. Illusions of voyeurism and privileged access require real complicity from viewers. How can we be made so willingly to acquiesce for hours daily to the illusion that the people on the TV don't know they're being looked at, to the fantasy that we're transcending privacy and feeding on unself-conscious human activity? There might be lots of reasons why these unrealities are so swallowable, but a big one is that the performers behind the two layers of glass are - varying degrees of Thespian talent aside - absolute geniuses at seeming unwatched. Now, seeming unwatched in front of a TV camera is a genuine art. Take a look at how civilians act when a TV camera is pointed at them: they simply spaz out, or else go all rigor mortis. Even PR people and politicians are, camera-wise, civilians. And we love to laugh at how stiff and false non-professionals appear, on television. How unnatural. But if you've ever once been the object of that terrible blank round glass stare, you know all too well how self-conscious it makes you. A harried guy with earphones and a clipboard tells you to "act natural" as your face begins to leap around on your skull, struggling for a seemingly unwatched expression that feels impossible because "seeming unwatched" is, like the "act natural" which fathered it, oxymoronic. Try driving a golf ball as someone asks you whether you in- or exhale on your backswing, or getting promised lavish rewards if you can avoid thinking of a rhinoceros for ten seconds, and you'll get some idea of the truly heroic contortions of body and mind that must be required for Don Johnson to act unwatched as he's watched by a lens that's an overwhelming emblem of what Emerson, years before TV, called "the gaze of millions."

Only a certain very rare species of person, for Emerson, is "fit to stand the gaze of millions." It is not your normal, hard-working, quietly desperate species of American. The man who can stand the megagaze is a walking imago, a certain type of transcendent freak who, for Emerson, "carries the holiday in his eye."(2) The Emersonian holiday television actors' eyes carry is the potent illusion of a vacation from self-consciousness. Not worrying about how you come across. A total unallergy to gazes. It is contemporarily heroic. It is frightening and strong. It is also, of course, an act, a counterfeit impression - for you have to be just abnormally self-conscious and self-controlling to appear unwatched before lenses. The self-conscious appearance of unself-consciousness is the grand illusion behind TV's mirror-hall of illusions; and for us, the Audience, it is both medicine and poison.

For we gaze at these rare, highly trained, seemingly unwatched people for six hours daily. And we love these people. In terms of attributing to them true supernatural assets and desiring to emulate them, we sort of worship them. In a real Joe Briefcase-type world that shifts ever more starkly from some community of relationships to networks of strangers connected by self-interest and contest and image, the people we espy on TV offer us familiarity, community. Intimate friendship. But we split what we see. The characters are our "close friends"; but the performers are beyond strangers, they're images, demigods, and they move in a different sphere, hang out with and marry only each other, seem even as actors accessible to Audience only via the mediation of tabloids, talk show, EM signal. And yet both actors and characters, so terribly removed and filtered, seem so natural, when we watch.

Given how much we watch and what watching means, it's inevitable - but toxic - for those of us fictionists or Joe Briefcases who wish to be voyeurs to get the idea that these persons behind the glass, persons who are often the most colorful, attractive, animated, alive people in our daily experience, are also people who are oblivious to the fact that they are watched. It's toxic for allergic people because it sets up an alienating cycle, and also for writers because it replaces fiction research with a weird kind of fiction consumption. We self-conscious Americans' oversensitivity to real humans fixes us before the television and its ball-check valve in an attitude of rapt, relaxed reception. We watch various actors play various characters, etc. For 360 minutes per diem, we receive unconscious reinforcement of the deep thesis that the most significant feature of truly alive persons is watchableness, and that genuine human worth is not just identical with but rooted in the phenomenon of watching. And that the single biggest part of real watchableness is seeming to be unaware that there's any watching going on. Acting natural.

by David Foster Wallace, The Free Library |  Read more:
Image: Naldz Graphics, TV MAN

Stanford Students Recreate 5,000-year-old Chinese Beer Recipe

On a recent afternoon, a small group of students gathered around a large table in one of the rooms at the Stanford Archaeology Center.

A collection of plastic-covered glass beakers and water bottles filled with yellow, foamy liquid stood in front of them on the table, at the end of which sat Li Liu, a professor in Chinese archaeology at Stanford.

White mold-like layers floated on top of the liquids. As the students removed the plastic covers, they crinkled their noses at the smell and sour taste of the odd-looking concoctions, which were the results of their final project for Liu’s course Archaeology of Food: Production, Consumption and Ritual.

The mixtures were homemade beer the students had brewed using the ancient techniques of early human civilizations. One of the experiments imitated a 5,000-year-old beer recipe that Liu and her team revealed in research published last spring.

“Archaeology is not just about reading books and analyzing artifacts,” said Liu, the Sir Robert Ho Tung Professor in Chinese Archaeology. “Trying to imitate ancient behavior and make things with the ancient method helps students really put themselves into the past and understand why people did what they did.”
The ancient recipe

Liu, together with doctoral candidate Jiajing Wang and a group of other experts, discovered the 5,000-year-old beer recipe by studying the residue on the inner walls of pottery vessels found at an excavated site in northern China. The research, which was published in Proceedings of the National Academy of Sciences, provided the earliest evidence of beer production in China so far.

The ancient Chinese made beer mainly with cereal grains, including millet and barley, as well as with Job’s tears, a type of grass in Asia, according to the research. Traces of yam and lily root parts also appeared in the concoction.

Liu said she was particularly surprised to find barley – which is used to make beer today – in the recipe, because the earliest previous evidence of barley seeds in China dates to only 4,000 years ago. The finding suggests that barley, which was first domesticated in western Asia, may have reached China earlier than previously thought, and hints at why it spread there.

“Our results suggest the purpose of barley’s introduction in China could have been related to making alcohol rather than as a staple food,” Liu said.

The ancient Chinese beer looked more like porridge and likely tasted sweeter and fruitier than the clear, bitter beers of today. The ingredients used for fermentation were not filtered out, and straws were commonly used for drinking, Liu said.

Recreating the recipe

At the end of Liu’s class, each student tried to imitate the ancient Chinese beer using either wheat, millet or barley seeds.

The students first covered their grain with water and let it sprout, in a process called malting. After the grain sprouted, the students crushed the seeds and put them in water again. The container with the mixture was then placed in the oven and heated to 65 degrees Celsius (149 F) for an hour, in a process called mashing. Afterward, the students sealed the container with plastic and let it stand at room temperature for about a week to ferment.
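
For readers who want to keep the timings straight, here is a minimal sketch, not from the article, that lays out the students' protocol as a simple step list in Python. Only the mashing temperature and time (65 degrees Celsius for an hour) and the roughly one-week fermentation come from the description above; the malting and crushing durations are assumptions added purely for illustration.

```python
# A rough sketch of the class brewing protocol described above. Only the
# mashing step (65 °C for an hour) and the ~1-week fermentation are given in
# the article; the other durations are assumed placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Step:
    name: str
    description: str
    temperature_c: Optional[float]  # None means ambient / room temperature
    duration_hours: float

PROTOCOL = [
    Step("malting", "cover the grain with water and let it sprout", None, 72.0),       # assumed: a few days
    Step("crushing", "crush the sprouted seeds and return them to water", None, 0.5),  # assumed: brief step
    Step("mashing", "heat the mixture in an oven", 65.0, 1.0),                         # from the article
    Step("fermenting", "seal the container and leave it at room temperature", None, 7 * 24.0),  # about a week
]

def total_days(protocol):
    """Rough elapsed time for the whole experiment, in days."""
    return sum(step.duration_hours for step in protocol) / 24.0

if __name__ == "__main__":
    for step in PROTOCOL:
        temp = f"{step.temperature_c:.0f} °C" if step.temperature_c is not None else "room temp"
        print(f"{step.name:<11} {temp:>9}  {step.duration_hours:>6.1f} h  {step.description}")
    print(f"total elapsed time: roughly {total_days(PROTOCOL):.1f} days")
```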

Alongside that experiment, the students tried to replicate making beer with a vegetable root called manioc. That type of beer-making, which is indigenous to many cultures in South America where the brew is referred to as “chicha,” involves chewing and spitting manioc, then boiling and fermenting the mixture.

Madeleine Ota, an undergraduate student who took Liu’s course, said she knew nothing about the process of making beer before taking the class and was skeptical that her experiments would work. The mastication part of the experiment was especially foreign to her, she said.

“It was a strange process,” Ota said. “People looked at me weird when they saw the ‘spit beer’ I was making for class. I remember thinking, ‘How could this possibly turn into something alcoholic?’ But it was really rewarding to see that both experiments actually yielded results.”

Ota used red wheat for brewing her ancient Chinese beer. Despite the mold, the mixture had a pleasant fruity smell and a citrus taste, similar to a cider, Ota said. Her manioc beer, however, smelled like funky cheese, and Ota had no desire to check how it tasted.

The results of the students’ experiments are going to be used in further research on ancient alcohol-making that Liu and Wang are working on.

“The beer that students made and analyzed will be incorporated into our final research findings,” Wang said. “In that way, the class gives students an opportunity to not only experience what the daily work of some archaeologists looks like but also contribute to our ongoing research.”

Getting a glimpse of the ancient world


For decades, archaeologists have sought to understand the origins of agriculture and what may have prompted humans to shift from hunting and gathering to settling and farming, a period historians call the Neolithic Revolution.

Studying the evolution of alcohol and food production provides a window into understanding ancient human behavior, said Liu, who has been teaching Archaeology of Food for several years after coming to Stanford in 2010.

But it can be difficult to figure out precisely how ancient people made alcohol and food just by examining artifacts, because organic molecules break down easily over time. That's why experiential archaeology is so important, Liu said.

by Alex Shashkevich, Stanford News |  Read more:
Image: Stanford News

Considerations on Cost Disease


[ed. This and other analyses of Cost Disease here:]
via: Slate Star Codex

Michael Franti & Spearhead

Air Pollution Masks – Fashion's Next Statement?

The intersection of fashion and practicality is not always the most compelling. But given that air pollution is the world's largest single environmental health risk, it seems inevitable that the two will come to influence each other.

Yesterday saw the launch of M90, an “urban breathing mask” created by the Swedish company Airinum and sold in more than 50 countries. Face masks are already a common sight in Asian countries, although the cheap washable cotton rectangles rarely perform well in tests. Surgical masks, the type usually worn by doctors, have tended to fare better – but are still largely ineffectual.

The market for pricier, more attractive masks has been growing steadily in the past few years. Sales have not been spectacular, but Freka, a British brand, had the monopoly for a while. And rightly so, given that it tapped into the trend for minimal sportswear, almost Céline-like in design, seeking to become more of a background accessory than anything stand-out.

Which sets Airinum apart. While the design is typically Scandinavian, these face masks are neon camo.

They aren’t the first luxe masks to have forayed into fashion. In the last few years, these have regularly appeared on the catwalk at Beijing fashion week, arguably being awarded the same gravitas as an It bag. (...)

Masks covered in the Burberry check (although they are not Burberry products) remain a common sight in Asian cities, in a bid to marry style with sensibility, even if they don’t work well. As to whether they’ll take off, affordability remains an issue. But if the aim is to market them in Europe and the US, where athleisure is king and vanity is key, perhaps this is the answer.

As for the fashion appraisal, trad camo is having a moment, particularly in menswear. But neon camo, nothing short of an eyesore, is uncharted territory. It's also oxymoronic. But that's the point: if the aim is to raise awareness of the problem, then it's unlikely you'll miss one of these on the street.

by Morwenna Ferrier, The Guardian | Read more:
Image: Airinum

Monday, February 13, 2017

Is Apple Over?

I started personal computing on an Apple II circa 1977. It was a big step up from the Heathkit and Radio Shack DIY projects I tinkered with in grade school. When IBM introduced the IBM-PC circa 1981, I semi-defected, and in 1984 I became bi-computeral (you know why).

My company functioned in a computer multiverse for some time. Macs were for art, music and publishing; PCs were for business; DEC minicomputers were for science, math and engineering. The minicomputers went away by 2000, and then we were just Mac and PC. In 2006, shortly after Macs became Intel inside and Parallels Desktop (a utility that enabled users to run Windows programs on a Mac) debuted, we became a 100% Apple shop, and we never looked back.

For more than a decade, if Apple manufactured it, we purchased it – in bulk. There was no reason to hyper-evaluate the new specifications; we just sent a purchase order to Tekserve (now T2 Computing) for as many of the new Apple devices as we needed (and maybe a few we didn't need). There are so many Apple devices in our offices that someone once said, "It looks like Steve Jobs threw up in here."

That was then.

What malevolent force could entice me to seriously consider a PC? What wickedness could tempt me to contemplate transitioning back to Windows? What could possibly lure me to the dark side? Only Apple itself has such power.

My iPhone 7 Plus Chronicle

On September 7, 2016, I stood on line for an hour to pick up my brand-new iPhone 7 Plus. I had made an appointment to be one of the first to get one because I was still a blind-faith follower of the cult of Apple. There was going to be an issue with the headphone jack (well documented in my first treatise of dissent, "Apple iPhone 7: Are You F#$king Kidding Me"). But being one of the faithful means putting aside common sense.

The moment I started to transfer information from iCloud, I was in trouble. Some apps worked, others were greyed out, and certain features were hit or miss.

Two factory resets and four hours later, I called Apple Care. After 30 minutes on hold, I was told that my iPhone must be defective and needed to be replaced.

What?

"OK, I'll just go to the Genius Bar and have it replaced."

"No, sorry," said the Apple Care person, "we don't have any extra iPhones at the stores; you'll have to send it back to us."

"But because of the 'new phone every year' plan you sold me last time, you took my iPhone 6 Plus back. What will I do for a phone for the five to seven days you're telling me it will take for me to get the replacement?"

(Note: Because I review technology as part of my job, I had plenty of other smartphones, but if this happened to most people, they'd be offline for a week.)

It took two tries for Apple to send me a new phone. The first replacement was lost in shipping, and the second is the one I'm carrying now. I was without an iPhone for about two weeks. To make matters worse, Apple charged my credit card $950 for each phone, so although I had no iPhones, Apple put $2,850 of charges on my credit card, saying it would refund the difference when the missing phone and the bad phone were returned (which it ultimately did).

How could Apple not have replacement phones available for the inevitable number of defective phones it might sell? Here's a better question: Did Apple sell too many defective phones for its supply of replacements?

With the number of iPhones Apple sells, some are bound to be defective – but this was not an isolated incident.

My MacBook Pro Chronicle

I wrote my second treatise of dissent, "Apple MacBook Pro 2016: WTF?," about the all-singing, all-dancing 15-inch MacBook Pro before I received my unit. Here are two videos you may enjoy about unboxing my second MacBook Pro and its battery life. Second? Yes, second. I'm writing this article on my third 15-inch MacBook Pro because the first two were defective.

by Shelly Palmer, Ad Age |  Read more:
Image: Apple

Tell Me A Story

‘Data-Driven’ Campaigns Are Killing the Democratic Party

There’s a Southern proverb often attributed to Sam Rayburn: “There’s no education in the second kick of a mule.” One month into the Trump presidency, and it’s still unclear whether the Democratic Party will learn anything from a fourth kick.

For four straight election cycles, Democrats have ignored research from the fields of cognitive linguistics and psychology showing that the most effective way to communicate with other humans is by telling emotional stories. Instead, the Democratic Party's affiliates and allied organizations in Washington have increasingly mandated "data-driven" campaigns rather than ones that are message-driven and data-informed. And over four straight cycles, Democrats have suffered historic losses.

After the 2008 election, Democrats learned all the wrong lessons from President Obama’s victory, ascribing his success to his having better data. He did have better data, and it helped, but I believe he won because he was the better candidate and had a better message, presented through better storytelling.

I’m not a Luddite. I did my graduate work in political science at MIT, and as a longtime Democratic strategist, I appreciate the role that data can play in winning campaigns. But I also know that data isn’t a replacement for a message; it’s a tool to focus and direct one.

We Democrats have allowed microtargeting to become microthinking. Each cycle, we speak to fewer and fewer people and have less and less to say. We all know the results: the loss of 63 seats and control of the House, the loss of 11 seats and control of the Senate, the loss of 13 governorships, the loss of over 900 state legislative seats and control of 27 state legislative chambers.

Yet despite losses on top of losses, we have continued to double down on data-driven campaigns at the expense of narrative framing and emotional storytelling.

Consider the lot of Bill Clinton. It has been widely reported that in 2016, Bill Clinton urged Hillary Clinton’s campaign to message on the economy to white working-class voters as well as to the “Rising American Electorate” (young voters, communities of color and single white women), but couldn’t get anyone to listen to him in Brooklyn. They had an algorithm that answered all questions. Theirs was a data-driven campaign. The campaign considered Bill to be old school—a storyteller, not data driven.

I feel his pain. And unless Democrats start to change things quickly, we’ll be feeling pain in elections yet to come.

Though the problem for Democrats is urgent, the challenge is not new. Before the clamor for a “data-driven” approach, the “best practices” embraced by much of the Democratic Party apparatus encouraged campaigns that were predominantly driven by issue bullet points. In 2000, for example, the Gore presidential campaign had no shortage of position papers, but it would be challenging (at best) to say what the campaign’s message was. In contrast, in Obama’s 2008 campaign, “Hope and Change” was not only a slogan, but a message frame through which all issues were presented.

Years ago, my political mentor taught me the problem with this approach, using a memorable metaphor: issues are to a campaign message what ornaments are to a Christmas tree, he said. Ornaments make the tree more festive, but without the tree, you don’t have a Christmas tree, no matter how many ornaments you have or how beautiful they are. Issues can advance the campaign’s story, but without a narrative frame, your campaign doesn’t have a message, no matter how many issue ads or position papers it puts forward.

Storytelling has been the most effective form of communication throughout the entirety of human history. And that is unlikely to change, given that experts in neurophysiology affirm that the neural pathway for stories is central to the way the human brain functions (“The human mind is a story processor, not a logic processor,” as social psychologist Jonathan Haidt has written).

The scientific evidence of the effectiveness of storytelling is extensive. Consider the 2004 book, Don’t Think of an Elephant, in which Berkeley linguistics professor George Lakoff applied the analytic techniques from his field to politics, explaining that “all of what we know is physically embodied in our brains,” which process language through frames: “mental structures that shape the way we see the world.”

Convincing a voter—challenging an existing frame—is no small task. “When you hear a word, its frame (or collection of frames) is activated in your brain,” writes Lakoff. As a result, “if a strongly held frame doesn’t fit the facts, the facts will be ignored and the frame will be kept.” How then to persuade voters? How can we get them to change the way they see the world? Tell a story.

Further evidence was put forward in 2007’s The Political Brain, by Emory University psychologist Drew Westen. “The political brain is an emotional brain,” Westen wrote, and the choice between electoral campaigns that run on an issue-by-issue debate versus those that embrace storytelling is stark: “You can slog it out for those few millimeters of cerebral turf that process facts, figures and policy statements. Or you can take your campaign to the broader neural electorate collecting delegates throughout the brain and targeting different emotional states with messages designed to maximize their appeal.”

For Democrats, a useful metaphor to frame our storytelling is that while conservatives believe we are each in our own small boat and it is up to each of us to make it on our own, progressive morality holds that we are all on a large boat and unless we maintain that boat properly, we will all sink together. That metaphor could serve as our narrative frame, and addressing issues within this frame—rather than as separate, unrelated bullet points—would allow us to present emotional stories using language that speaks to voters’ values.

by Dave Gold, Politico |  Read more:
Image: uncredited

Saturday, February 11, 2017



Carmen Cartiness Johnson, I can see China (2015)
via:

A Resort for the Apocalypse

Rising S Bunkers, one of several companies that specialize in high-end shelters—its Presidential model includes a gym, a workshop, a rec room, a greenhouse, and a car depot—says sales of its $500,000-plus units increased 700 percent last year. (This compares with a more modest 150 percent increase across other Rising S units.) Bunker companies won't disclose customers' names, but Gary Lynch, Rising S's CEO, told me his clients include Hollywood actors and "highly recognizable sports stars." Other luxury shelters are marketed to businesspeople, from bankers to Bill Gates, who is rumored to have bunkers beneath his houses in Washington State and California.

Whereas Cold War shelters, by design, were near the home and easy to get to, a handful of bunker companies are building entire survival communities in remote locations. Some of them share literal foundations with Cold War buildings: One project, Vivos XPoint, involves refurbishing 575 munitions-storage bunkers in South Dakota; Vivos Europa One, in Germany, is a Soviet armory turned luxury community with a subterranean swimming pool.

By contrast, Trident Lakes, a 700-acre, $330 million development in Ector, Texas, an hour and a half north of Dallas, is being built from scratch. Marketed as a “5-star playground, equipped with defcon 1 preparedness,” it is the project of a group of investors who incorporated as Vintuary Holdings. According to James O’Connor, the CEO, Trident Lakes “is designed for enjoyment like any other resort.” (This pitch is rather different from its Cold War–era counterparts: A 1963 bunker advertisement from the Kelsey-Hayes company shows a family tucked under its home, with just rocking chairs for comfort.)

In some regards, the plans for Trident Lakes do resemble those for a resort. Amenities will include a hotel, an athletic center, a golf course, and polo fields. The community is slated to have 600 condominiums, ranging in price from $500,000 to $1.5 million, each with a waterfront view (to which end, three lakes and 10 beaches will be carved out of farmland). Other features are more unusual: 90 percent of each unit will be underground, armed security personnel will guard a wall surrounding the community, and there will be helipads for coming and going.

by Ben Rowan, The Atlantic |  Read more:
Image: Chris Philpot

Luke Pelletier
via:

The Glorious Exit of Jeffrey Loria, the Worst Owner in Sports

Over the past 18 years, as Jeffrey Loria sprayed the stench of his naked greed across baseball like the skunk he is, as he destroyed the sport in one city and bilked another out of billions of dollars, as he tore asunder a championship team and micromanaged countless others and behaved like the lamest sort of wannabe George Steinbrenner possible, all blowhard, zero substance, the aggrieved could take solace in one thing and one alone: Some day, the game would rid itself of him.

Mercy was not the exclusive domain of the Ninth Circuit on Thursday. Early in the day, simultaneous feelings of joy and fury accompanied the Forbes report that Loria had agreed to sell the Miami Marlins to an unnamed buyer for $1.6 billion. Joy because the owner who played arsonist to his own franchise was relinquishing his Zippo. And fury because one of the worst owners in sports had turned a $158.5 million investment into something worth 10 times as much, an example other owners with similarly feeble consciences may be tempted to copy.

Whatever frustration percolated over a rich man getting even richer paled compared to the ding-dong-the-witch-is-dead giddiness expressed by Marlins players and executives past and present in texts and calls to one another. Presuming the deal goes through – plenty of pitfalls remain, a source familiar with the agreement confirmed to Yahoo Sports, and Loria would like to bask in the glow of the All-Star Game at Marlins Park in July, so the timing of any sale remains unclear – it will bring to an end an ownership reign that stained the sport for more than a decade.

To understand the treachery of Loria and David Samson, the team president and son of Loria’s ex-wife, one need only understand a single number: $1.2 billion. That’s how much a $91 million note from J.P. Morgan to help finance the team’s new stadium, which opened in 2012, is going to cost Miami-area taxpayers. That’s 13 times the original loan. In all, $409 million worth of loans will balloon to $2.4 billion.

And here’s the thing: That’s not even the worst part. For years, the Marlins cried poor to local politicians, saying they needed a stadium to make money. Never would they open up their financials, of course, because they would have shown the Marlins had cleared nearly $50 million in profits the two years before Miami-Dade County approved the stadium funding. Ultimately, the government cowed, and the Marlins got perhaps the most sweetheart of sweetheart stadium deals, which is saying something. They covered only a quarter of construction costs. They keep all of the stadium revenues: tickets, parking, concessions. They pay $2.3 million annually in rent – money that goes to pay off a county loan.

by Jeff Passan, Yahoo Sports | Read more:
Image: via:

Why Whole Foods is Now Struggling

Organic food has never been so popular among American consumers. Ironically, that’s bad news for the brand that made organic a household name — namely, the Austin-based Whole Foods.

On Wednesday, Whole Foods reported what is arguably its worst performance in a decade, announcing its sixth consecutive quarter of falling same-store sales and cutting its outlook for the year. The company is closing nine stores, the most it has ever closed at one time. A mere 16 months ago, Whole Foods predicted it would grow its 470 U.S. locations to more than 1,200.

The problem is one that chief executive John Mackey probably didn't predict when he first opened Whole Foods as a neighborhood natural foods store 36 years ago: Organics, then a fringe interest, have become so thoroughly mainstream that organic chains now have to face conventional big-box competitors. Mass-market retailers were responsible for 53.3 percent of organic food sales in 2015, according to the Organic Trade Association; natural retailers clocked in just north of 37 percent.

And Whole Foods is hardly the only store feeling the squeeze: Sprouts and Fresh Market, the second- and third-largest publicly traded organic stores, have also seen falling stock prices.

“Whole Foods created this space and had it all to themselves for years,” said Brian Yarbrough, an analyst at Edward Jones. “But in the past five years, a lot of people started piling in. And now there's a lot of competition.”

In many ways, the story of Whole Foods's decline is also the story of how the organic movement took over the United States. Between 2005 and 2015, sales of organic food increased 209 percent, according to the Organic Trade Association. Last year, organic sales topped $43.3 billion.

The driving force behind this growth, most analysts agree, is none other than millennials: Consumers aged 18 to 34 are the largest buyers of organics, and they’re the most likely to consider themselves “knowledgeable” about their food. As they came of age, mainstream grocery chains have been forced to adapt, too.

Walmart ramped up its organics selection in 2006. Kroger introduced its Simple Truth brand in 2012 — the store’s chief executive, Mike Ellis, later said it was the store’s “most successful brand launch ever.” Earlier this week, Aldi announced plans for a $1.6 billion U.S. expansion, with much of that growth aimed at offering “a wider range of organic and gluten-free products.”

By volume, the largest organic retailer in the United States is believed to be Costco, which in 2015 sold $4 billion of organic produce and packaged foods. Like Walmart, Kroger and Aldi, Costco sells organic produce for considerably less than do natural food stores, farmers markets or Whole Foods. In fact, lowering prices has been one of Whole Foods's primary strategies for dealing with competitors.

Apart from shuttering stores and stalling expansion plans, the company is continuing to focus on 365 by Whole Foods, a two-year-old division aimed at launching stores for “value-conscious” consumers. It’s also been dropping prices at its regular locations and mailing out national discount circulars, something it had not previously done. Speaking to investors Wednesday, Mackey indicated that he did not want to see “too big of a gap” between the prices at Whole Foods and those at stores like Costco and Kroger.

But some organic advocates are concerned that lowering the prices of organic foods — an apparent prerequisite for mainstream popularity — can only happen at the expense of the movement’s early principles. This fear is not entirely new: Michael Pollan fretted about it in the pages of the New York Times when Walmart began selling organic Rice Krispie treats 11 years ago. But with results like Whole Foods's, it is becoming more urgent, said Ronnie Cummins, the co-founder of the Organic Consumers Association.

by Caitlin Dewey, WP | Read more:
Image: Ty Wright/Bloomberg News

Friday, February 10, 2017


Tom Gauld
via:

Politics 101

Emergency Preparedness Amongst the Liberal Elite

ALEXANDRA: We need go bags, Michael.

MICHAEL: You’re overreacting.

ALEXANDRA: That’s what you said last October when my panic attacks started. I’m just saying we need to be prepared, Michael. What if they repeal health care and we can’t afford your Xanax? What if Civil War erupts? What if they destroy the internet and we have to wait to subscribe to print? What if you forget to put a post-it over your laptop camera one day and they see you have a periodic table hanging in your office?

MICHAEL: We need go bags.

ALEXANDRA: WE NEED GO BAGS!

MICHAEL: What do we put in them? Is there an Amazon list? Is Prime still next day in the event of an apocalypse? Oh, sweet merciful Jon Stewart, tomorrow is Sunday.

ALEXANDRA: We’ll need to travel light. Bare essentials, Michael.

MICHAEL: Is the generator gassed up? How many gallons of gas does it take to charge a tablet?

ALEXANDRA: In this apocalyptic scenario, we’re assuming there’s no internet, remember?

MICHAEL: How many gallons to charge a Kindle?

ALEXANDRA: ESSENTIALS, MICHAEL. That means cellphones, passports, pussy hats. Should we take supplies to make protest signs?

MICHAEL: Can’t we just make a multi-purpose one in advance and take that?

ALEXANDRA: But what if they attack another group? It won’t be just the Muslims forever.

MICHAEL: Then we’ll make it say, “All lives matter.”

ALEXANDRA: JESUS FUCKING CHRIST, MICHAEL. Have you been living under an ecologically-responsible yurt? We can’t say that shit. People are going to think you aren’t woke.

MICHAEL: I still don’t understand that word. Is it a grammar thing? I haven’t slept since November. So did I woke last year?

ALEXANDRA: I don’t think that’s how it works. But I DO know you never, ever say “All Lives Matter.”

MICHAEL: There are way too many rules when total chaos reigns.

ALEXANDRA: We’re all making sacrifices, Michael. I desperately miss watching Empire.

MICHAEL: Are you allowed to boycott a Black TV show? I thought we were just boycotting Fox News. Now it’s the whole damn network?

ALEXANDRA: I think so. Honestly, I can’t keep up with what we’re boycotting. Thank heavens our Prius is self-driving because my Facebook feed keeps vacillating over which ride company is fleecing the immigrants.

MICHAEL: I know. I can’t even enjoy my heteronormative porn. I don’t even know who I am anymore. For a moment last night, I almost didn’t care that my fantasy league is in pieces.

ALEXANDRA: Wasn’t the Super Bowl last weekend? Isn’t football over?

MICHAEL: The season is over?!? WHAT THE FUCK IS HAPPENING TO ME!?!?! Next you’ll tell me I can’t even enjoy a beer and steak.

ALEXANDRA: Methane gas, Michael. DiCaprio says no more red meat. And I haven’t found an American-made beer yet that didn’t support you-know-who. So we’re out. But that reminds me, we should throw in some boxes of wine. I’d normally prefer a white with the non-GMO, organic, fair-trade-certified kale chips and box of Lara bars I’ve packed, but refrigeration might be an issue. Red it is!

by Elly Lonon, McSweeney's |  Read more:
Image: uncredited via:

photo: markk

Tell Me Everything You Don't Remember

[ed. When my mom began her decline with Alzheimer's, this is how I imagined it must have felt.]

Short-term memory dominates all tasks—in cooking, for instance: I put the water to boil in a pot on the stove and remember that the water will boil while I chop the onions. I will put the sauté pan on the stove to heat up the oil for the onions, and I will then put the onions, which I will remember I have chopped, into the oil, which I remember I have heated for the onions. I will then add tomatoes. While the onions and tomatoes cook, I will put pasta in the water, which I remember I have boiled. I will know that in ten minutes I will put the cooked pasta into the tomato and onion stir, and thus have a simple tomato pasta meal.

If short-term memory is damaged as mine was, it works more like this: I put the water on to boil. I heat up the oil in the sauté pan. I chop the onions and then wonder for what it was that I chopped the onions. What might it be? I wash my hands, because I might as well—my hands are covered in onion juice, and my eyes are tearing. I return to the stove, where the oil is now scorching hot. I wonder what on earth it was I was cooking, why the sauté pan was left this way. I turn off the heat under the oil. I sigh and go upstairs. I forget everything I just did like a trail of dust in wind. Two hours later, after a nap, I return to the kitchen to a pile of chopped onions on the chopping block. The pan is cool but scorched. And I again wonder why. But mostly, my eyes turn to an empty stockpot on the stove, the burner turned on high. There is nothing in the stockpot, not even water. This happened over and over again in the months following my stroke. So I stopped cooking for a year.

Short-term memory is like an administrative assistant for the brain, keeping information on hand and organizing tasks—it will figuratively jot down a number, a name, an address, your appointments, or anything else for as long as you need to complete your transaction. It stores information on a temporary basis, on Post-it notes, before deciding whether or not to discard the memory/Post-it or move it into a file cabinet for long-term memory storage. Everything in long-term memory finds its way there through short-term memory, from the PIN for your ATM card to the words to the “Happy Birthday” song to the weather on your wedding day. In fact, you are exercising short-term memory now, by keeping track of what you read at the beginning of this sentence so that you can make sense of it at the end.

When short-term memory is damaged, it cannot track sentences. It must read the paragraph over and over again, because by the end of the sentence or paragraph, it will not remember the beginning. And because it does not remember the beginning, it cannot make meaning out of the entirety.

I look at a restaurant menu. I read each item, and when I get to the end of the list, I cannot remember what was at the beginning. I reread the menu. I get to the bottom of it. My brain gets tired, short-circuits, and all I see is random words. I cannot connect my appetite to the words. I cannot remember what food tastes like. I cannot connect the ingredients, “hand-cut green noodles with chanterelle mushroom ragù and gremolata,” into a whole. I cannot put together noodles and mushrooms and chopped herbs in my brain. I cannot connect those flavors into a picture, and I cannot connect them to my appetite, because I have no memory. I only know I am hungry, because I am light-headed and listless.

I put down the menu. I ask for a hamburger if I am dining alone. I ask my companion to order for me, if I am not dining alone. I always request hamburgers, because nearly every restaurant offers hamburgers, and because I cannot parse a menu and hold all the possibilities in my head in order to make a decision.

I am surprised, every time, when the hamburger arrives at the table, because I do not remember having ordered it. I chew it mechanically. There are no images flashing through my head reminding me of the first time I ate a hamburger, or all the barbecues I’ve attended, or the time after marching in the Rose Parade that I ate Burger King because Burger King gave out free burgers to participants at the end of the route. No. There is just blank space. There is chewing. Swallowing. The end of hunger.

When short-term memory is damaged, it will not retain new names. I do not remember someone who popped her head into my hospital room a few minutes ago. I do not remember the receptionist in the doctor’s waiting room. I do not remember who visited me in the hospital the day prior. I do not remember who gave me the flowers in my room. I have to write all these things down in my notebook, so I can refer back to it later.

If short-term memory is damaged, it may not be able to move things into long-term memory, because it takes time, even if not much. It can take about a minute for the memory to be retained. But with age or injury, our brains have less time to successfully move new information to long-term memory. As a result, it is difficult to recall the details of recent events. I see a book at the bookstore, and I buy it because it looks interesting. I go home and see two copies of that book on my bookshelf because I have bought that book over and over again.

I do not remember so many things that happened. I do not remember who was in my workshop the semester I returned to school before I was fully healed, returning because all I wanted to do was finish my degree. I do not remember the woman who befriended me in the wake of my stroke, who then months later wrote me a breakup card because supporting me, she said, was “too much.” I find the breakup card years later and look at the date in befuddlement. I do not remember printing my MFA thesis onto special paper and then assembling it and turning it in. I do not remember the names of any doctors at the hospital. I do not remember room numbers. In addition to not cooking, I do not even go grocery shopping for a whole year, because I forget what it is I have to buy, and if I write a list down, I forget where I put the list.

In the wake of my stroke, I remembered the names of people I'd known for years, even if I couldn't remember the names of doctors I had just met. I recognized my best friend, Mr. Paddington, and my husband, Adam, and all my girlfriends, and greeted them. But when they'd leave the room and return, I would greet them once again, as if they hadn't been in that same room just fifteen minutes prior. I knew who they were, but I had lost track of time. My short-term memory was unable to move things into long-term storage.

by Christine Hyung-Oak Lee, Longreads | Read more:
Image: Perrin

Thursday, February 9, 2017


Dan McPharlin
via:

What the Closure of FRUiTS Magazine Means for Japanese Street Style

After two decades spent documenting the street style of Japanese teens across an impressive 233 issues, FRUiTS Magazine announced this weekend that it had printed its last copy. While the much discussed death of print media was perhaps a predictable cause of its demise, the actual reason for the magazine's closure that its founder, editor and chief photographer Shoichi Aoki gave was altogether more surprising. In an interview with Japanese site Fashionsnap, he said, simply, that "there are no more cool kids left to photograph."

Founded in 1997, FRUiTS was the definitive publication that championed Harajuku's colourfully dressed youth and provided a reliable and authentic record of emerging street trends and genuinely interesting dressers, in contrast to the contrived street style circus of fashion week peacocks. Recognised by its legions of readers as the magazine that introduced them to the saccharine, brave and frenetic world of Japanese fashion, FRUiTS will be sorely missed by the deserved cult following it has amassed. While fans of the magazine took to social media to decry its loss, some cited the magazine's insistence on photographing the same kids over and over again as the reason for its failure - commenting on a news article on Japanese culture site Spoon Tamago, Alan Yamamoto said that FRUiTS had become a "popularity contest", and that despite the magazine's closure "there's still so much fashion on the streets [in Tokyo] it's unbelievable." Misha Janette, founder of Tokyo Fashion Diaries, called the magazine's loss a "death blow to the already waning Japanese street scene", and called for more to be done to protect and nurture the area's style heritage.

The closure of a publication like FRUiTS is a bitter pill to swallow for many longtime fans of Japanese street style, but in many ways its demise has been a long time coming. However tempting it is to wax lyrical about Harajuku's unique cachet of cool, it's impossible to deny that the district's golden age has long passed. Historically regarded as an underground hotbed of street style that most foreign magazines would die to document, the area's magnetism has been numbed by overexposure and gentrification for quite a while now. The Hokoten pedestrian paradise that provided a nurturing crucible for Harajuku fashion was shut down in the late nineties, and although the area's girls and boys diffused over the surrounding square mile for the next two decades, things have changed. The reality in 2017 is that if you come to Tokyo hoping to see tribes of teens in the eye-grabbing garms and kawaii Decora that made FRUiTS famous, you're likely to be disappointed.

The Camden-esque transformation of Harajuku into a souvenir-saturated tourist trap means that instead of the trendsetting locals Gwen Stefani sang about in the early 00s, you're more likely to see white people dressed in tired lolita getups haunting the streets. There are still the odd shops like Dog and 6%DokiDoki hidden close to Takeshita Street that cater to FRUiTS-worthy kids, but they are few and far between, and most of Harajuku now feels decidedly un-Japanese. Although the area has tourism to thank for its success in part, this has ultimately been the cause of its overexposure. (...)

Despite the increasing dominance of the high street, and the sentiment Aoki's statement suggests, Japan isn't facing an all-out style apocalypse; compared to most fashion capitals, Tokyo is still spoilt when it comes to achingly stylish dressers. In the past, kids paraded down Omotesando intending to get their picture taken, and managing to get into a magazine like FRUiTS was considered a great honour. If that was the case, then where are they now? These days, time spent on self-promotion is better invested online, and the lives of Harajuku's next generation play out on Instagram.

by Ashley Clarke, i-D | Read more:
Image: FRUiTS