Saturday, August 13, 2016

The Long Thanatopsis

In 30 years, assisted suicides won’t be met by the furor that followed the death of Brittany Maynard, a 29-year-old woman with terminal brain cancer who ended her own life with a fatal dose of barbiturates on Nov. 1, 2014.

In 30 years, the media won’t bother with stories like that of Gillian Bennett, an 85-year-old Canadian woman who took her own life in August 2014 using alcohol and barbiturates because she had decided not to live any longer with dementia. Upon her death, a website, deadatnoon.com, went live with an essay justifying her choice and arguing for assisted suicide. In 30 years, an explanatory website will be as unnecessary as a newspaper story, because this form of dying will be just another one of the ways that we die.

This shift will be part of the lasting legacy of the 76 million baby boomers who make up about 25 percent of the American population and who will be aging and dying in the next 20 years. A century from now, the historians of the future will not credit them as much for their boisterous 1960s counterculture as for the gray twinkle and fade in the early 21st century that forever altered the way America dies.

For a middle-aged Gen X-er like me to play in the thanatopsis sandbox like this is bittersweet. (“Thanatopsis” is a meditation on death, from William Cullen Bryant’s 1817 poem.) Of course, it’s untoward to point to anyone’s death, no matter how inescapable it is. The main source of the bitterness, though, is acknowledging that the cultural hegemony of the baby boomers will always overshadow me. Better than I know the contours of my historical experience, I know theirs: born into postwar prosperity, the hedonism and idealism, its psychological aftermath, and the nostalgias (The Big Chill, A Prairie Home Companion). I envied their generational mindset, the self-identity of a group that was formed in the same historical crucible, so as a senior in college in 1989, I pitched an article on “my generation” to a national magazine. Very kindly but firmly the editor, a baby boomer, refused it on the grounds that generational forms of thinking were now outrĂ©, even as he admitted getting his start in journalism by publishing a “my generation” piece in 1974. This epitomizes to me how sorry a creature the Gen X-er could be: weaned on someone else’s cultural themes, always too late to the party. (Fortunately Richard Linklater and Douglas Coupland were more persistent than I was.)

All this is blunted by knowing that when (and, let’s be honest, if) I become an elderly Gen X-er, many of the sharp edges of old age will have been blasted smooth by the massive demographic cohort that has preceded me. That’s the sweet part of the bittersweetness.

It’s impossible for me to predict everything that will occur, but it seems clear that every kitchen gadget will be available in ergonomic designs for weaker, arthritic hands. Every building will have been fitted with hearing loops in each room, so my hearing aids will work better. The bathrooms will all be ADA-compliant, fitted with wide doors and handles, and all street crossings will have curb cuts. No homes, offices, or shops will have raised thresholds at doorways, so my robot health aides will be able to glide over them, along with my solar-powered wheelchair. If I can afford it, my transitional housing will be designed to maximize my psychological and emotional wellbeing. The clinics, rehab centers, hospitals, and nursing homes will seem like brisk hotels, perhaps even like resorts, not like institutions. Safe, effective, and cheap therapies and drugs to improve the workings of my brain and body will be easily accessible and widely accepted. We will be able to take, and perhaps self-administer, human growth hormone. Medical marijuana will be federally recognized.

By then, the good death will be just another lifestyle choice. Philosophers of inequality will argue that dying well should not be enjoyed only by the upper income tiers, and the policy question of the day will be whether or not dying well is a public good. Should prisoners receive funding from the state in order to pursue dying with dignity? Will people living in homeless shelters be able to receive the psychiatric clearance that’s needed for state-sanctioned death? State laws about burials and funerals will also change, such as the requirement (in some states) that only licensed, registered funeral directors may make arrangements and preparations for burial or cremation; embalming bodies will become increasingly rare, and funerary practices with low environmental impact, such as “green cremation,” will be niche at first, even luxury, then will become more widely available. Already there are do-it-yourself funerary books, magazines, and night courses. Soon there will be coffee shop meet-ups and death parties. (Things are moving so fast that I discover this already exists, too.)

by Michael Erard, TMN | Read more:
Image: Colin Chillag, Grandma-Grandpa, 2012

[ed. Bad breakups, Episode #37...]
via:

Friday, August 12, 2016

This Company Has Built a Profile on Every American Adult

Forget telephoto lenses and fake mustaches: The most important tools for America’s 35,000 private investigators are database subscription services. For more than a decade, professional snoops have been able to search troves of public and nonpublic records—known addresses, DMV records, photographs of a person’s car—and condense them into comprehensive reports costing as little as $10. Now they can combine that information with the kinds of things marketers know about you, such as which politicians you donate to, what you spend on groceries, and whether it’s weird that you ate in last night, to create a portrait of your life and predict your behavior.

IDI, a year-old company in the so-called data-fusion business, is the first to centralize and weaponize all that information for its customers. The Boca Raton, Fla., company’s database service, idiCORE, combines public records with purchasing, demographic, and behavioral data. Chief Executive Officer Derek Dubner says the system isn’t waiting for requests from clients—it’s already built a profile on every American adult, including young people who wouldn’t be swept up in conventional databases, which only index transactions. “We have data on that 21-year-old who’s living at home with mom and dad,” he says.

Dubner declined to provide a demo of idiCORE or furnish the company’s report on me. But he says these personal profiles include all known addresses, phone numbers, and e-mail addresses; every piece of property ever bought or sold, plus related mortgages; past and present vehicles owned; criminal citations, from speeding tickets on up; voter registration; hunting permits; and names and phone numbers of neighbors. The reports also include photos of cars taken by private companies using automated license plate readers—billions of snapshots tagged with GPS coordinates and time stamps to help PIs surveil people or bust alibis.

IDI also runs two coupon websites, allamericansavings.com and samplesandsavings.com, that collect purchasing and behavioral data. When I signed up for the latter, I was asked for my e-mail address, birthday, and home address, information that could easily link me with my idiCORE profile. The site also asked if I suffered from arthritis, asthma, diabetes, or depression, ostensibly to help tailor its discounts.

Users and industry analysts say the addition of purchasing and behavioral data to conventional data fusion outmatches rival systems in terms of capabilities—and creepiness. “The cloud never forgets, and imperfect pictures of you composed from your data profile are carefully filled in over time,” says Roger Kay, president of Endpoint Technologies Associates, a consulting firm. “We’re like bugs in amber, completely trapped in the web of our own data.” (...)

Besides pitching its databases to big-name PIs (Kroll, Control Risks), law firms, debt collectors, and government agencies, IDI says it’s also targeting consumer marketers. The 200-employee company had revenue of about $40 million in its most recent quarter and says 2,800 users signed up for idiCORE in the first month after its May release. It declined to provide more recent figures. The company’s data sets are growing, too. In December, Frost helped underwrite IDI’s $100 million acquisition of marketing profiler Fluent, which says it has 120 million profiles of U.S. consumers. In June, IDI bought ad platform Q Interactive for a reported $21 million in stock.

IDI may need Frost’s deep pockets for a while. The PI industry’s three favorite databases are owned by TransUnion and media giants Reed Elsevier and Thomson Reuters. “There’s no shortage,” says Chuck McLaughlin, chairman of the board of the World Association of Detectives, which has about 1,000 members. “The longer you’re in business, the more data you have, the better results.” He uses TLO and Tracers Information Specialists.

by David Gauvey Herbert and Olga Kharif, Bloomberg | Read more:
Image: Mustafa Hacalaki/Getty Images

Erasing the Pop-Culture Scholar, One Click at a Time

Academics in the humanities — but particularly those who specialize in film, television, and comics — have come to view the pop-culture thinkpiece with dread. Invariably, some new essay on, say, taste and television is published to great fanfare, at least from other writers of pop-culture thinkpieces. They proceed to treat as "new" or "innovative" some idea or trend that we in academe have been writing about for years.

Decades of scholarship are erased by a single, viral essay that is presumed to be the first observation of some "new" phenomenon. Mainstream journalists don’t realize that the subjects they’re writing about, the patterns and shifts they’re noting for the first time, likely have numerous journal articles and possibly even full monographs devoted to them.

If it were just a question of crediting the work of scholars, most of us would lick our wounds and slink away. But it’s not just that. What pains us more than the absent citation is the unsupported claim, the anachronistic parallel, the apocryphal anecdote.

In other words, these thinkpieces almost always get it wrong. The writers, like many a college student, simply haven’t done the reading.

In the college classroom, students’ initial evaluation of art is often based (understandably) on ignorance. They misread, misinterpret, and misunderstand because they simply don’t know what they don’t know. For example, the first time students see Jean-Luc Godard’s 1960 film, Breathless, they often assume his jump cuts are sloppy editing mistakes, rather than a conscious strategy on the part of the director to subvert the polished style of the 1950s French "Cinema of Quality." In the classroom that is called a "teachable moment." Mistakes and misunderstandings offer professors platforms for engaging students in productive but also corrective discussions.

The internet is not a classroom — however much we like to think it is. When writers for major news magazines misread, misunderstand, and mistake their objects of study, they are not students, and the situation is rarely a teachable moment. That’s because readers have been conditioned to expect that their news sources present them with accurate information.

But that is often not the case in the current online publishing landscape where speed, not accuracy, is valued, and clicks are king. The new culture of immediacy — based on anecdotal knowledge, individual experience, and the occasional nod toward what can be found in a quick Google search — is the lifeblood of this cultural moment.

We didn’t write this to knock anyone’s hustle; to the contrary, this essay is a request for reciprocity. We just want mainstream journalists to be aware: The thoughts and ideas that the news media spotlight as "original" aren’t actually all that original. Someone likely wrote something about that idea/era/film/TV show/music before, and it’s up to you to find out what’s been said and assimilate that knowledge with your initial argument. That write-up you’re planning on antiheroes, reality-television history, or the networks’ exploitation of black audiences? It has a scholarly antecedent just waiting to expand your knowledge of the subject.

In a recent article for The Chronicle, Noah Berlatsky offers a spirited defense of ignorance on the internet. Though Berlatsky is himself a dedicated pop-culture scholar, having published a monograph on Wonder Woman with a respected university press, he warns his fellow pop-culture scholars about getting too territorial: "The enshrinement of hard-won expertise — the insistence that value consists in being able to tell right from wrong — is exactly the mind-set that makes work in the humanities so easy to denigrate."

Instead, Berlatsky asks that we celebrate the fact that there are so many people on the internet excitedly writing about art, whether or not what they are writing is factually or historically sound. "Art’s value isn’t in objective expertise," he writes, "but in its ability to confound subjectivity and objectivity, to scramble the barriers between how one person thinks, how that other person thinks, and how everybody thinks. In art, a misinterpretation may be wrong, but it is always an opportunity."

In other words, if you truly love art and want more people to love it too, then it is necessary to welcome critics of all skill levels into the tent. Popular culture is the culture of the populace, after all. Therefore, everyone’s opinions on popular culture have value, right?

Not necessarily.

by Amanda Ann Klein and Kristen Warner, Chronicle of Higher Education | Read more:
Image: James Yang

Thursday, August 11, 2016

An Oral History of Tin Cup: One of Golf's Most Iconic Movies Ever Made

Few movies get golf right. Fewer still add to the game's lexicon. (We all know what it means to "pull a Tin Cup" or to "let the big dog eat.") Twenty years after the film's release, the stars of Tin Cup -- Costner! Russo! Cheech! -- take us back to '96 and the making of the most authentic golf movie ever. And yeah, that 18th hole meltdown? It still hurts. "Another ball, Romeo..."

During the final round of the 1993 Masters, Chip Beck etched his name in the annals of golf infamy with his second shot on the par-5 15th hole. Beck trailed Bernhard Langer by three strokes with three holes to play, but rather than go for the green in two, he laid up, inciting the outrage of forehead-slapping second-guessers watching at home. Ron Shelton, the director of such brilliantly offbeat sports movies as Bull Durham and White Men Can't Jump, was one of those armchair critics. When Beck made his fateful decision, Shelton immediately called his golfing buddy, screenwriter John Norville. The two men had kicked around ideas for a golf movie over the course of several years and even more adult beverages, but they could never find a way into the story. Beck gave them what they were looking for. What if the hero of the movie was the anti-Beck, a guy constitutionally incapable of laying up, a guy who went for it all the time, even when—especially when—he shouldn't? That was the moment Tin Cup was born.

Released on August 16, 1996, Shelton and Norville's long-gestating labor of love may be the most thrilling (and accurate) movie ever made about golf. The romantic comedy stars Kevin Costner as Roy "Tin Cup" McAvoy, a washed-up pro drinking his days away at a Texas driving range; Don Johnson as David Simms, his smarmy, play-it-safe college rival who's become a Tour star; Rene Russo as Dr. Molly Griswold, the daffy shrink who comes between them; and Cheech Marin as Romeo, Roy's loyal sidekick and caddie. There are cameos by dozens of Tour pros, too, among them Fred Couples and Johnny Miller. But the film is most famous for its excruciating climax, when Roy self-destructs on the 72nd hole of the U.S. Open. To commemorate Tin Cup's 20th anniversary, GOLF tracked down the cast and crew for a no-holes-barred look back at an evergreen fairway classic.

Ron Shelton (director, co-writer): Our original idea involved a golf hustler at a driving range in West Texas, a guy with a bit of Lee Trevino's background. But we didn't nail it down until the '93 Masters. I was watching at home in Ojai, and John was at home in Oregon. When Chip Beck laid up, we immediately called each other and said, That's the key to our guy: He won't lay up!

John Norville (co-writer): Ron's thinking all along was that we shouldn't write a golf story for golfers. We needed to write a golf story for women who don't play golf—or even get golf. The question became, Who's our character? Well, Ron's family is from West Texas, and there's this great tradition of Texas players who have a whiskey bottle and a revolver in their bag. It's the kind of place where a guy can get lost.

Shelton: We had to overcome the perception of golf as a rich man's sport, because I don't think it is a rich man's sport, it's a blue-collar sport. One of the glorious parts of the game is that golfers will wait in line at five a.m. at a public course to shoot 103. The Chip Beck thing was just the light bulb that went off for us. Our hero's strength and his fatal flaw is that he's more afraid of winning than losing. He'd rather be the big fish in the littlest of ponds than risk winning on the big stage.

Gary Foster (producer): Norville invited me up to Ojai one day to play golf with Ron. We called him "Ballwash Ron" because if he hit a drive off the fairway and over by the ball washer, he'd still find a way to make par. This must have been 1994. Afterward, over drinks, we decided Ron would direct, John would write, and I would produce. Then they went to Warner Bros., because that's where Ron had a deal. It just so happened that Kevin Costner had a deal there, too.

Shelton: When we started writing it, we didn't have an actor in mind for Roy, but about 20 pages into it John and I looked at each other and said, "It's Costner." So I called Kevin, who I'd worked with on Bull Durham, and he said, "I'm taking some time off." I said, "Just read it before you say no." So he did. A few days later, we met for breakfast, and he said, "Damn it. You're right. I gotta do this."

Kevin Costner (Roy McAvoy): "Champagne Johnny" Norville and I had gone fishing together, and I knew he was working on something about golf with Ron. But I didn't think about it too much because I didn't really play golf—maybe once a year with my father-in-law. On the first tee, I tended to hit three or four balls, all to the right, and I wasn't too f---ing impressive. Plus, I wasn't working at the time. I'd just done Waterworld and had gone through a divorce, and my heart was pretty much on the ground. But I knew working with Ron again would be the best therapy, because he basically hands you something you can't fail with.

Shelton: Once we had Kevin, we had to start thinking about the other roles, like Molly. In all sports movies, the woman's role is critical. You want to get the golfer and the person who thinks it's the stupidest sport ever. You want both audiences. There were actresses on our list who refused to audition, but Rene Russo was like, "Sure!" And she was perfect. Not just attractive and smart—she's very appealing when she gets flustered, and I thought I could make something out of that. You could easily believe that she got involved with the wrong guy, and can't figure out how to get uninvolved with him in order to get with the right guy, who happens to be kind of a mess.

Rene Russo (Dr. Molly Griswold): I didn't know anything about golf. And I remember being really intimidated because it was such a good role. And I went in, and there was Mr. Charm, Kevin Costner. I was so nervous, because it was the first film that I majorly wanted. We read together, and Kevin's so good I just fell into it.

Costner: Ron casts broads, and I say "broads" as a term of endearment—a girl who can hang with guys and make everybody feel like they have a chance, even when they don't.

Shelton: For the part of Roy's caddie, Romeo, I must have auditioned every Latino actor there was—even stars from Mexico City. Cheech Marin was the first to walk in the door, and after dozens of other actors I just couldn't get him out of my mind. The character he plays is sort of the moral center of this wacky universe. He's the truth teller, he's got the heart. And the thought of Cheech being the moral center of a universe appealed to me.

Cheech Marin (Romeo): It wasn't like, Oh, we gotta get Cheech for this! I auditioned and months went by. I'd given up hope. This was a big, fat, A-list movie, and it was my chance to run with the big dogs. I mean, I'm half joking. Cheech and Chong was bigger than a lot of movie stars, but I wanted to compete in that race.

Shelton: For the part of David Simms, we needed someone with swagger and who could swing a golf club. Alec Baldwin was going to do it, but his wife at the time, Kim Basinger, was expecting. So he called me up and said, "I'm sorry, I can't do it." Then someone suggested Don Johnson. He could really play, which was crucial because we were about to start shooting.

Marin: We were waiting on the set, and every day there was scuttlebutt about who they were going to get for the Simms part. Then all of a sudden it's Don Johnson, and it was like, Oh, this is going to be f---ing perfect! Don and I were friends going back to when we were both young actors in Hollywood, way before Miami Vice.

Don Johnson (David Simms): I think it really came down to who could go toe to toe with Kevin and be believable, and who could play golf. I'm not sure that I qualify in either one of those categories, but I'm an actor, and I can pretend really well. At that time, my game was a lot better than it is now. I was an 8 or 9 handicap. I played some Pro-Ams with guys like Payne Stewart. (...)

Shelton: We needed to get some professional golfers in the movie to give it a flavor of authenticity. So we started calling, and their agents wanted $50,000 for an appearance, like it was a corporate outing. We were like, "No, we're offering them $600." And they all said no way. Then McCord had a great idea.

McCord: I called the players' wives and said, "How'd you like to have dinner with Kevin Costner and Don Johnson? The catch is, your husband is going to have to be on a movie set for a day." We rented a big room in Tucson and let Kevin and Don loose on the girls. I told them, "Be Hollywood, and bulls--- with these women; make them tell their husbands they have to do this movie." In the end, we got 35 players, four U.S. Open winners—and they got SAG minimum!

Corey Pavin: I was the reigning U.S. Open champion when we shot the film, and I still get a residual check every six or eight months, for $1.80 or something.

by Chris Nashawaty, Golf | Read more:
Image: Warner Bros.

Western Industrial, Charles Sheeler
via:

Wednesday, August 10, 2016

Hillbilly Elegy: A Memoir of a Family and a Culture in Crisis

[ed. See also: In ‘Hillbilly Elegy,’ a Tough Love Analysis of the Poor Who Back Trump]

I wrote last week about the new nonfiction book Hillbilly Elegy: A Memoir of a Family and a Culture in Crisis by J.D. Vance, the Yale Law School graduate who grew up in the poverty and chaos of an Appalachian clan. The book is an American classic, an extraordinary testimony to the brokenness of the white working class, but also its strengths. It’s one of the best books I’ve ever read. With the possible exception of Yuval Levin’s The Fractured Republic, for Americans who care about politics and the future of our country, Hillbilly Elegy is the most important book of 2016. You cannot understand what’s happening now without first reading J.D. Vance. His book does for poor white people what Ta-Nehisi Coates’s book did for poor black people: give them voice and presence in the public square.

This interview I just did with Vance in two parts (the final question I asked after Trump’s convention speech) shows why.

RD: A friend who moved to West Virginia a couple of years ago tells me that she’s never seen poverty and hopelessness like what’s common there. And she says you can drive through the poorest parts of the state, and see nothing but TRUMP signs. Reading “Hillbilly Elegy” tells me why. Explain it to people who haven’t yet read your book.

J.D. VANCE: The simple answer is that these people–my people–are really struggling, and there hasn’t been a single political candidate who speaks to those struggles in a long time.  Donald Trump at least tries.

What many don’t understand is how truly desperate these places are, and we’re not talking about small enclaves or a few towns–we’re talking about multiple states where a significant chunk of the white working class struggles to get by.  Heroin addiction is rampant.  In my medium-sized Ohio county last year, deaths from drug addiction outnumbered deaths from natural causes.  The average kid will live in multiple homes over the course of her life, experience a constant cycle of growing close to a “stepdad” only to see him walk out on the family, know multiple drug users personally, maybe live in a foster home for a bit (or at least in the home of an unofficial foster like an aunt or grandparent), watch friends and family get arrested, and on and on.  And on top of that is the economic struggle, from the factories shuttering their doors to the Main Streets with nothing but cash-for-gold stores and pawn shops.

The two political parties have offered essentially nothing to these people for a few decades.  From the Left, they get some smug condescension, an exasperation that the white working class votes against its economic interests because of social issues, à la Thomas Frank (more on that below).  Maybe they get a few handouts, but many don’t want handouts to begin with.

From the Right, they’ve gotten the basic Republican policy platform of tax cuts, free trade, deregulation, and paeans to the noble businessman and economic growth.  Whatever the merits of better tax policy and growth (and I believe there are many), the simple fact is that these policies have done little to address a very real social crisis.  More importantly, these policies are culturally tone deaf: nobody from southern Ohio wants to hear about the nobility of the factory owner who just fired their brother.

Trump’s candidacy is music to their ears.  He criticizes the factories shipping jobs overseas.  His apocalyptic tone matches their lived experiences on the ground.  He seems to love to annoy the elites, which is something a lot of people wish they could do but can’t because they lack a platform.

The last point I’ll make about Trump is this: these people, his voters, are proud.  A big chunk of the white working class has deep roots in Appalachia, and the Scots-Irish honor culture is alive and well.  We were taught to raise our fists to anyone who insulted our mother.  I probably got in a half dozen fights when I was six years old.  Unsurprisingly, southern, rural whites enlist in the military at a disproportionate rate.  Can you imagine the humiliation these people feel at the successive failures of Bush/Obama foreign policy?  My military service is the thing I’m most proud of, but when I think of everything happening in the Middle East, I can’t help but tell myself: I wish we would have achieved some sort of lasting victory.  No one touched that subject before Trump, especially not in the Republican Party.

I’m not a hillbilly, nor do I descend from hillbilly stock, strictly speaking. But I do come from poor rural white people in the South. I have spent most of my life and career living among professional-class urbanites, most of them on the East Coast, and the barely banked contempt they — the professional-class whites, I mean — have for poor white people is visceral, and obvious to me. Yet it is invisible to them. Why is that? And what does it have to do with our politics today? 

I know exactly what you mean.  My grandma (Mamaw) recognized this instinctively.  She said that most people were probably prejudiced, but they had to be secretive about it.  “We”–meaning hillbillies–“are the only group of people you don’t have to be ashamed to look down upon.”  During my final year at Yale Law, I took a small class with a professor I really admired (and still do).  I was the only veteran in the class, and when this came up somehow in conversation, a young woman looked at me and said, “I can’t believe you were in the Marines.  You just seem so nice.  I thought that people in the military had to act a certain way.”  It was incredibly insulting, and it was my first real introduction to the idea that this institution that was so important among my neighbors was looked down upon in such a personal way. To this lady, to be in the military meant that you had to be some sort of barbarian.  I bit my tongue, but it’s one of those comments I’ll never forget.

The “why” is really difficult, but I have a few thoughts.  The first is that humans appear to have some need to look down on someone; there’s just a basic tribalistic impulse in all of us.  And if you’re an elite white professional, working class whites are an easy target: you don’t have to feel guilty for being a racist or a xenophobe.  By looking down on the hillbilly, you can get that high of self-righteousness and superiority without violating any of the moral norms of your own tribe.  So your own prejudice is never revealed for what it is.

A lot of it is pure disconnect–many elites just don’t know a member of the white working class. A professor once told me that Yale Law shouldn’t accept students who attended state universities for their undergraduate studies.  (A bit of background: Yale Law takes well over half of its student body from very elite private schools.)  “We don’t do remedial education here,” he said.  Keep in mind that this guy was very progressive and cared a lot about income inequality and opportunity.  But he just didn’t realize that for a kid like me, Ohio State was my only chance–the one opportunity I had to do well in a good school.  If you removed that path from my life, there was nothing else to give me a shot at Yale.  When I explained that to him, he was actually really receptive.  He may have even changed his mind.

What does it mean for our politics?  To me, this condescension is a big part of Trump’s appeal.  He’s the one politician who actively fights elite sensibilities, whether they’re good or bad.  I remember when Hillary Clinton casually talked about putting coal miners out of work, or when Obama years ago discussed working class whites clinging to their guns and religion.  Each time someone talks like this, I’m reminded of Mamaw’s feeling that hillbillies are the one group you don’t have to be ashamed to look down upon.  The people back home carry that condescension like a badge of honor, but it also hurts, and they’ve been looking for someone for a while who will declare war on the condescenders.  If nothing else, Trump does that.

This is where, to me, there’s a lot of ignorance around “Teflon Don.”  No one seems to understand why conventional blunders do nothing to Trump.  But in a lot of ways, what elites see as blunders people back home see as someone who–finally–conducts themselves in a relatable way.  He shoots from the hip; he’s not constantly afraid of offending someone; he’ll get angry about politics; he’ll call someone a liar or a fraud.  This is how a lot of people in the white working class actually talk about politics, and even many elites recognize how refreshing and entertaining it can be!  So it’s not really a blunder as much as it is a rich, privileged Wharton grad connecting to people back home through style and tone.  Viewed like this, all the talk about “political correctness” isn’t about any specific substantive point, as much as it is a way of expanding the scope of acceptable behavior.  People don’t want to believe they have to speak like Obama or Clinton to participate meaningfully in politics, because most of us don’t speak like Obama or Clinton.

On the other hand, as Hillbilly Elegy says so well, that reflexive reverse-snobbery of the hillbillies and those like them is a real thing too, and something that undermines their prospects in life. Is there any way for it to be overcome, other than getting out of the bubble, as you did?

I’m not sure we can overcome it entirely. Nearly everyone in my family who has achieved some financial success for themselves, from Mamaw to me, has been told that they’ve become “too big for their britches.”  I don’t think this value is all bad.  It forces us to stay grounded, reminds us that money and education are no substitute for common sense and humility.  But, it does create a lot of pressure not to make a better life for yourself, and let’s face it: when you grow up in a dying steel town with very few middle class job prospects, making a better life for yourself is often a binary proposition: if you don’t get a good job, you may be stuck on welfare for the rest of your life.

I’m a big believer in the power to change social norms.  To take an obvious recent example, I see the decline of smoking as not just an economic or regulatory matter, but something our culture really flipped on.  So there’s value in all of us–whether we have a relatively large platform or if our platform is just the people who live with us–trying to be a little kinder to the kids who want to make a better future for themselves.  That’s a big part of the reason I wrote the book: it’s meant not just for elites, but for people from my own clan, in the hopes that they’ll better appreciate the ways they can help (or hurt) their own kin.

At the same time, the hostility between the working class and the elites is so great that there will always be some wariness toward those who go to the other side.  And can you blame them?  A lot of these people know nothing but judgment and condescension from those with financial and political power, and the thought of their children acquiring that same hostility is noxious.  It may just be the sort of value we have to live with.

The odd thing is, the deeper I get into elite culture, the more I see value in this reverse snobbery.  It’s the great privilege of my life that I’m deep enough into the American elite that I can indulge a little anti-elitism.  Like I said, it keeps you grounded, if nothing else!  But it would have been incredibly destructive to indulge too much of it when I was 18.

I live in the rural South now, where I was born, and I see the same kinds of social pathologies among some poor whites that you write about in Hillbilly Elegy. I also see the same thing among poor blacks, and have heard from a few black friends who made it out, as you did, the same kinds of stories about how their own people turned on them and accused them of being traitors to their family and class — this, only for getting an education and building stable lives for themselves. The thing that so few of us either understand or want to talk about is that nobody who lives the way these poor black and white people do is ever going to amount to anything. There’s never going to be an economy rich enough or a government program strong enough to compensate for the lack of a stable family and the absence of self-discipline. Are Americans even capable of hearing that anymore? 

Judging by the current political conversation, no: Americans are not capable of hearing that anymore.  I was speaking with a friend the other night, and I made the point that the meta-narrative of the 2016 election is learned helplessness as a political value.  We’re no longer a country that believes in human agency, and as a formerly poor person, I find it incredibly insulting.  To hear Trump or Clinton talk about the poor, one would draw the conclusion that they have no power to affect their own lives.  Things have been done to them, from bad trade deals to Chinese labor competition, and they need help.  And without that help, they’re doomed to lives of misery they didn’t choose.

Obviously, the idea that there aren’t structural barriers facing both the white and black poor is ridiculous.  Mamaw recognized that our lives were harder than those of rich white people, but she always tempered her recognition of the barriers with a hard-nosed willfulness: “never be like those a–holes who think the deck is stacked against them.”  In hindsight, she was this incredibly perceptive woman.  She recognized the message my environment had for me, and she actively fought against it.

There’s good research on this stuff.  Believing you have no control is incredibly destructive, and that may be especially true when you face unique barriers.  The first time I encountered this idea was in my exposure to addiction subculture, which is quite supportive and admirable in its own way, but is full of literature that speaks about addiction as a disease.  If you spend a day in these circles, you’ll hear someone say something to the effect of, “You wouldn’t judge a cancer patient for a tumor, so why judge an addict for drug use?”  This view is a perfect microcosm of the problem among poor Americans.  On the one hand, the research is clear that there are biological elements to addiction–in that way, it does mimic a disease.  On the other hand, the research is also clear that people who believe their addiction is a biologically mandated disease show less ability to resist it.  It’s this awful catch-22, where recognizing the true nature of the problem actually hinders the ability to overcome it.

Interestingly, both in my conversations with poor blacks and whites, there’s a recognition of the role of better choices in addressing these problems.  The refusal to talk about individual agency is in some ways a consequence of a very detached elite, one too afraid to judge and consequently too handicapped to really understand.  At the same time, poor people don’t like to be judged, and a little bit of recognition that life has been unfair to them goes a long way.  Since Hillbilly Elegy came out, I’ve gotten so many messages along the lines of: “Thank you for being sympathetic but also honest.”

I think that’s the only way to have this conversation and to make the necessary changes: sympathy and honesty.  It’s not easy, especially in our politically polarized world, to recognize both the structural and the cultural barriers that so many poor kids face.  But I think that if you don’t recognize both, you risk being heartless or condescending, and often both.

by Rod Dreher, American Conservative |  Read more:
Image: a katz / Shutterstock.com

Scathing Report on Baltimore Cops Vindicates Black Residents

With startling statistics, a federal investigation of the Baltimore Police Department documents in 164 single-spaced pages what black residents have been saying for years: They are routinely singled out, roughed up or otherwise mistreated by officers, often for no reason.

The 15-month Justice Department probe was prompted by the death of Freddie Gray, the black man whose fatal neck injury in the back of a police van touched off the worst riots in Baltimore in decades. To many people, the blistering report issued Wednesday was familiar reading.

Danny Marrow, a retired food service worker, said that over the years, he has been stopped and hassled repeatedly by police.

"It started when I was 8 years old and they'd say, with no probable cause, 'Hey, come here. Where are you going?'" he said. "No cause, just the color of my skin."

"Bullies in the workplace," he said. "They don't want you to get angry or challenge their authority, so they'll use force, they'll put the handcuffs on too tight. And if you run, they're going to beat you up when they catch you."

The Justice Department looked at hundreds of thousands of pages of documents, including internal affairs files and data on stops, searches and arrests.

It found that one African-American man was stopped 30 times in less than four years and never charged. Of 410 people stopped at least 10 times from 2010 to 2015, 95 percent were black. During that time, no one of any other race was stopped more than 12 times.

With the release of the report, the city agreed to negotiate a set of police reforms with the Justice Department over the next few months to fend off a government lawsuit. The reforms will be enforceable by the courts.

Mayor Stephanie Rawlings-Blake and Police Commissioner Kevin Davis acknowledged the longstanding problems and said they had started improvements even before the report was completed. They promised it will serve as a blueprint for sweeping changes.

"Fighting crime and having a better, more respectful relationship with the community are not mutually exclusive endeavors. We don't have to choose one or the other. We're choosing both. It's 2016," said Davis, who was appointed after the riots in April 2015.

Six officers, three white and three black, were charged in Gray's arrest and death. The case collapsed without a single conviction, though it did expose a lack of training within the department.

by Juliet Linderman and Eric Tucker, AP |  Read more:
Image: Brian Witte/AP

Think Amazon’s Drone Delivery Idea Is a Gimmick?

Amazon is the most obscure large company in the tech industry.

It isn’t just secretive, the way Apple is, but in a deeper sense, Jeff Bezos’ e-commerce and cloud-storage giant is opaque. Amazon rarely explains either its near-term tactical aims or its long-term strategic vision. It values surprise.

To understand Amazon, then, is necessarily to engage in a kind of Kremlinology. That’s especially true of the story behind one of its most important business areas: the logistics by which it ships orders to its customers.

Over the last few years, Amazon has left a trail of clues suggesting that it is radically altering how it delivers goods. Among other moves, it has set up its own fleet of trucks; introduced an Uber-like crowdsourced delivery service; built many robot-powered warehouses; and continued to invest in a far-out plan to use drones for delivery. It made another splash last week, when it showed off an Amazon-branded Boeing 767 airplane, one of more than 40 in its planned fleet.

These moves have fueled speculation that Amazon is trying to replace the third-party shipping companies it now relies on — including UPS, FedEx and the United States Postal Service — with its homegrown delivery service. Its logistics investments have also fed the general theory that Amazon has become essentially unbeatable in American e-commerce — no doubt one reason Walmart, the world’s largest retailer, felt the need this week to acquire an audacious Amazon rival, Jet.com, for $3.3 billion.

So what’s Amazon’s ultimate aim in delivery? After talking to analysts, partners and competitors, and prying some very minimal input from Amazon itself, I suspect the company has a two-tiered vision for the future of shipping.

First, it’s not trying to replace third-party shippers. Instead, over the next few years, Amazon wants to add as much capacity to its operations as possible: rather than displacing partners like UPS and FedEx, it is spending boatloads on planes, trucks, crowdsourcing and other novel delivery services to increase its overall capacity and efficiency.

Amazon’s longer-term goal is more fantastical — and, if it succeeds, potentially transformative. It wants to escape the messy vicissitudes of roads and humans. It wants to go fully autonomous, up in the sky. The company’s drone program, which many in the tech press dismissed as a marketing gimmick when Mr. Bezos unveiled it on “60 Minutes” in 2013, is central to this future; drones could be combined with warehouses manned by robots and trucks that drive themselves to unlock a new autonomous future for Amazon.

There are hurdles to realizing this vision. Drone delivery in the United States faces an uncertain regulatory future, and there are myriad technical and social problems to iron out. Still, experts I consulted said that a future populated with autonomous drones is closer at hand than one populated with self-driving cars.

“It’s a vastly easier problem — flying than driving,” said Keller Rinaudo, the co-founder of Zipline, a drone-delivery start-up that will begin deploying a system to deliver medical goods in Rwanda this fall. “If we had regulatory permission, we’d be delivering to your house right now,” he added, referring to the San Francisco Bay Area.

If Amazon’s drone program succeeds (and Amazon says it is well on track), it could fundamentally alter the company’s cost structure. A decade from now, drones would reduce the unit cost of each Amazon delivery by about half, analysts at Deutsche Bank projected in a recent research report. If that happens, the economic threat to competitors would be punishing — “retail stores would cease to exist,” Deutsche’s analysts suggested, and we would live in a world more like that of “The Jetsons” than our own.

by Farhad Manjoo, NY Times |  Read more:
Image: Amazon

Cidade de Deus (City of God), Silvia Izquierdo/AP
via:

Tuesday, August 9, 2016

[ed. These are the days when I really miss Alaska. My buddy won the Seward Salmon Fishing Derby in Resurrection Bay one year ($10,000) after catching a 20+ lb. coho. We had a double strike (both monsters), but my fish (the bigger one of course) shook the hook right at the boat. He treated us to a great dinner though.]

Photo: Erik Hill, A Day on Resurrection Bay
via:

Raiva, ho un condominio interiore (“I have an inner condominium”) on Flickr

The Singular Joys of Watching Ichiro

Sunday afternoon in Colorado, the Miami Marlins outfielder Ichiro Suzuki tallied a hit for the 3,000th time in his Major League career. Using his trademark batting style—less a swing than a kind of spinning stab, with the left-handed Ichiro already edging out of the batter’s box as bat meets ball—he whistled a pitch into right field, where it caromed off the wall as he ran lightly to third. Fans stood and cheered; Ichiro removed his helmet in acknowledgement; the Marlins left the dugout to congratulate him. He became the thirtieth player in Major League Baseball history to reach the figure, a hallowed number designating the true experts at the task Ted Williams called the toughest in all of sports: hitting a round ball with a round bat.

The 3,000 hit club is home to all sorts of players. Derek Jeter and Alex Rodriguez both belong to it, each having joined with a Yankee Stadium home run. Ty Cobb, the vicious and racist star of baseball’s dead-ball era, is a member, as is Hank Aaron, as dignified a figure as the game has produced. The salty and officially shunned “Hit King” Pete Rose remains atop the leaderboard. When, a couple of months back, Ichiro matched Rose’s mark of 4,256 hits, including his hits in the Japanese professional baseball league, Rose responded with characteristic grouchiness: “It sounds like in Japan, they’re trying to make me the Hit Queen. I’m not trying to take anything away from Ichiro ... but the next thing you know, they’ll be counting his high-school hits.”

Ichiro is not nearly the best player in this group, but he may be the most representative of its spirit of sustained excellence, of moderate success massed into something spectacular over time. At his best, during a decade with the Seattle Mariners, he was a variously gifted talent, a wall-climbing and cannon-armed dynamo in right field, but his core genius was always for sending a baseball just out of reach of the defenders. Hitting, for him, has seemed like a labor of love—as if, were some dramatic rule change to render anything less than a home run useless, he would still go to the plate looking to flick pitches onto patches of open grass. The simplicity and clarity of his purpose has made him one of the most joyful players to watch in the game’s long history. Very few people get to be great at something as difficult as professional baseball. Fewer still get to be great in exactly the way they would like—Ichiro has.

Ichiro has used the same batting technique for his whole career, from his prime in Seattle to his post-prime stops in New York and Miami, but it looks, even on the thousandth viewing, like something he just recently decided to experiment with. Before every pitch, he holds his bat out and tugs up the sleeve on his right arm. Once he assumes his stance—knees pinched, shoulders rounded—his hands hold the bat up behind his ear, wavering in a way that might make anyone with a less impressive rĂ©sumĂ© seem nervous. At 42 years old, Ichiro still has the scrawniness of an underfed teenager. His left foot hovers before the pitch is released; one wonders, watching, how this mess of thin limbs at strange angles will arrange itself to hit a baseball.

Then the pitch crosses the plate, and it is as if some invisible hand has pulled a string on a gyroscope. Ichiro whirls at the ball. His shoulders fly outward and his feet go askew, but the bat comes through in a calibrated slice. His goal is not to “barrel up” the pitch so much as redirect it, to let its own energy, nudged outward, carry it into the field. All that bodily mayhem has a purpose, too; Ichiro starts running almost as he swings, so he gets to first base remarkably quickly.

It is one of the most singular motions in baseball, the work of someone who has dedicated untold hours to wringing every possible hit from the game. A quiet irony attends this work, though. Ichiro has played his career during a time when the base hit has lost its luster. He first landed in the Majors as a 27-year-old in 2001, in the middle of what would be recognized as the Steroid Era, when players across baseball were muscling up in an effort to land the ball not just between the defenders, but also over the outfield wall. He has kept on through the popularization of advanced statistics, which assert that batting average—the mark that testified, during Ichiro’s peak, to his annual greatness—is not as strong a measure of quality as previously thought. In this context, he is something of a man out of time, his presence next to the rest of baseball’s modern star class as incongruous as a horse and rider on the interstate.

This quality, though, has only added to Ichiro’s appeal. Professional sports have never seemed more like work than they do now. Players spend their lives hunting for an edge, be it technological, chemical, or statistical. They pore over frame-by-frame video and ingest supplements. They change their approaches according to dictates or trends and give post-game interviews that, understandably, have all the joy of an office-job performance review. In this context, someone like Ichiro—who does what he’s always tried to do, again and again, without much care for the sport’s shifting ideological winds—is a welcome throwback. When he plays it, baseball just looks a little bit more like a game.

Judging by his appearance, with his forearms not much bigger around than his wrists and his spidery legs, you might think that Ichiro didn’t have much choice in his style of play, that slapping the ball into the shallow outfield was all he could ever muster. Rumors around baseball have long contradicted this assumption, though. His batting-practice home run displays are the stuff of lore, and writers have speculated that, if Ichiro had opted to sacrifice some of his contact-hitting prowess, he could have been a credible slugger. Barry Bonds, MLB’s all-time home-run leader and the current Marlins hitting coach, is the latest to chime in on this front; before the All-Star game, Bonds said even the aged version of Ichiro could win the annual Home Run Derby if he chose, “easy, hands down.”

by Robert O'Connell, The Atlantic |  Read more:
Image: uncredited

[ed. Truly one of the best.]

Trial by Jury, a Hallowed American Right, Is Vanishing

The criminal trial ended more than two and a half years ago, but Judge Jesse M. Furman can still vividly recall the case. It stands out, not because of the defendant or the subject matter, but because of its rarity: In his four-plus years on the bench in Federal District Court in Manhattan, it was his only criminal jury trial.

He is far from alone.

Judge J. Paul Oetken, in half a decade on that bench, has had four criminal trials, including one that was repeated after a jury deadlocked. For Judge Lewis A. Kaplan, who has handled some of the nation’s most important terrorism cases, it has been 18 months since his last criminal jury trial.

“It’s a loss,” Judge Kaplan said, “because when one thinks of the American system of justice, one thinks of justice being administered by juries of our peers. And to the extent that there’s a decline in criminal jury trials, that is happening less frequently.”

The national decline in trials, both criminal and civil, has been noted in law journal articles, bar association studies and judicial opinions. But recently, in the two federal courthouses in Manhattan and a third in White Plains (known collectively as the Southern District of New York), the vanishing of criminal jury trials has never seemed so pronounced.

The Southern District held only 50 criminal jury trials last year, the lowest since 2004, according to data provided by the court. The pace remains slow this year.

In 2005, records show, there were more than double the number of trials: 106. And decades ago, legal experts said, the numbers were much higher.

“It’s hugely disappointing,” said Judge Jed S. Rakoff, a 20-year veteran of the Manhattan federal bench. “A trial is the one place where the system really gets tested. Everything else is done behind closed doors.”

Legal experts attribute the decline primarily to the advent of the congressional sentencing guidelines and the increased use of mandatory minimum sentences, which transferred power to prosecutors, and discouraged defendants from going to trial, where, if convicted, they might face harsher sentences.

“This is what jury trials were supposed to be a check against — the potential abuse of the use of prosecutorial power,” said Frederick P. Hafetz, a defense lawyer and a former chief of the criminal division of the United States attorney’s office in Manhattan, who is researching the issue of declining trials. (...)

Former Judge John Gleeson, who in March stepped down from the federal bench in Brooklyn to enter private practice, noted in a 2013 court opinion that 81 percent of federal convictions in 1980 were the product of guilty pleas; in one recent year, the figure was 97 percent.

Judge Gleeson wrote that because most pleas are negotiated before a prosecutor prepares a case for trial, the “thin presentation” of evidence needed for indictment “is hardly ever subjected to closer scrutiny by prosecutors, defense counsel, judges or juries.”

“The entire system loses an edge,” he added, “and I have no doubt that the quality of justice in our courthouses has suffered as a result.”

by Benjamin Weiser, NY Times |  Read more:
Image: Anthony Lanzilote

Dinner, Disrupted

Silicon Valley has brought its wrecking ball to haute cuisine, and the results are not pretty.

At a fancy Silicon Valley restaurant where the micro-greens came from a farm called Love Apple, I got a definitive taste of California in the age of the plutocrats. This state — and this native of it — have long indulged a borderline-comic impulse toward self-expression through lifestyle and food, as if success might be a matter of nailing the perfect combination of surf trunks, grilled lamb hearts and sunset views.

For baby boomers who moved to the Bay Area in search of the unfussy good life, in the late 20th century, it was all about squinting just right to make our dry coastal hills look like Provence — per the instructions of the Francophile chefs Jeremiah Tower and Alice Waters of the legendary Berkeley restaurant Chez Panisse.

By the early 2000s, that Eurocentric baby-boomer cuisine enjoyed a prosperous middle age as “market-driven Cal/French/Italian” with an implicit lifestyle fantasy involving an Italianate Sonoma home with goats, a cheese-making barn, vineyards and olive trees, and a code of organic-grass-fed ethics that mapped a reliable boundary between food fit for bourgeois progressives and unclean commodity meats.

Today, Northern California has been taken over by a tech-boom generation with vastly more money and a taste for the existential pleasures of problem solving. The first hints of change appeared in 2005, when local restaurateurs sensed that it was time for a new culinary style with a new lifestyle fantasy. That’s when a leading San Francisco chef named Daniel Patterson published an essay that blamed the “tyranny of Chez Panisse” for stifling Bay Area culinary innovation. Next came the 2009 Fig-Gate scandal in which the chef David Chang, at a panel discussion in New York, said, “Every restaurant in San Francisco is serving figs on a plate with nothing on it.” Northern California erupted with an indignation that Mr. Chang called, in a subsequent interview, “just retardedly stupid.” Mr. Chang added that, as he put it, “People need to smoke more marijuana in San Francisco.”

By this point, I was a food writer of the not-anonymous variety, by which I mean that I joined the search for the next big thing by eating great meals courtesy of magazines and restaurants, without hiding my identity the way a critic would. In 2010, a magazine asked me to profile the extraordinary chef David Kinch of the aforementioned fancy restaurant, which is called Manresa and lies in the affluent suburb of Los Gatos.

I went there twice for work and concentrated both times on the food alone. I was knocked out, especially by a creation called Tidal Pool, which involved a clear littoral broth of seaweed dashi pooling around sea-urchin tongues, pickled kelp and foie gras. I know that I will set off the gag reflex in certain quarters when I confess that, in my view, Mr. Kinch took the sensory pleasure of falling off a surfboard into cold Northern California water and transformed it into the world’s most delicious bowl of Japanese-French seafood soup. Mr. Kinch, I concluded, was the savior sent to bring California cuisine into the 21st century.

Two years later, in December 2012, a magazine editor said that he could expense a Manresa dinner for the two of us. He suggested that we bring (and pay for) our spouses. I had never once eaten at a restaurant of that caliber on my own dime because I did not make nearly enough money. But I liked this editor, I loved Mr. Kinch and I calculated that my one-quarter share of the evening’s total would be $200. I decided to make it a once-in-a-lifetime splurge. After we sat down, Mr. Kinch emerged and said something like, “With your permission, I would love to create a special tasting menu for your table.” Because the editor and I were pampered food-media professionals, we took this to mean something like, Don’t sweat the prices on the menu; let’s have fun, and I’ll make the bill reasonable.

The meal lasted five hours and consisted of more than 20 fantastic courses; we all felt that we had eaten perhaps the greatest meal of our lives. Then the bill came: $1,200, with tax and tip. It turned out that “a special tasting menu” was a price point marked on the menu. My editor friend confessed that he could charge only $400 to his corporate card, and I felt sick with self-loathing. I knew this was my fault — not Mr. Kinch’s — and I looked around the dining room at loving couples, buoyant double dates, even a family with two young children for whom a thousand-dollar meal was no stretch. I had been a fool in more ways than I could count, including my delusion that one could think and talk about food outside of its social and economic context.

Like any artisan whose trade depends upon expensive materials and endless work, every chef who plays that elite-level game must cultivate patrons. That means surrounding food with a choreographed theater of luxury in which every course requires a skilled server to set down fresh cutlery and then return with clean wine glasses. A midcareer professional sommelier then must fill those wine glasses and deliver a learned lecture about that next wine’s origin and flavor. Another person on a full-time salary with benefits must then set down art-piece ceramic plates that are perfectly selected to flatter the next two-mouthful course. Yet another midcareer professional must then explain the rare and expensive plants and proteins that have been combined through hours of time-consuming techniques to create the next exquisitely dense compression of value that each diner will devour in moments. Those empty plates and glasses must then be cleared to repeat this cycle again and again, hour after hour.

In the case of Northern California, these restaurants must satisfy a venture-capital and post-I.P.O. crowd for whom a $400 dinner does not qualify as conspicuous consumption and for whom the prevailing California-lifestyle fantasy is less about heirloom tomatoes than recognizing inefficiencies in the international medical technology markets, flying first-class around the planet to cut deals at three-Michelin-Star restaurants in Hong Kong or London and then, back home, treating the kids to casual $2,000 Sunday suppers.

The foragers and farmers and fishermen of the old Chez Panisse fantasy still figure, but now as an unseen impecunious peasant horde combing beaches and redwoods for the chanterelles and Santa Barbara spot prawns that genius chefs transform into visionary distillations of a mythical Northern California experience that no successful entrepreneur would waste time living.

In a normal metropolitan area, super-upscale places like Manresa have such narrow profit margins that ambitious young chefs open them mostly to establish their reputations; later, to pay the mortgage, they open a profitable mid-range joint nearby. According to Mr. Patterson, the opposite is now true in tech-boom San Francisco.

“Busy high-end places are doing fine because they have more ways to control their costs, but the mid-level is getting killed,” Mr. Patterson told me. “I’ve heard guys say they’re doing eight million a year in sales and bringing home less than 2 percent as profit.” (...)

I am all in favor of San Francisco’s $13 per hour minimum wage (which rises to $15 by 2018), plus mandatory paid sick leave, parental leave and employer health care contributions. But labor costs at restaurants are inching past 50 percent of total expenditures, an indicator of poor fiscal health. Commercial rents have also gone bananas. Add the ever-rising cost of frisĂ©e and pastured quail eggs and it’s no wonder that many restaurants are experimenting with that unique form of sadism known as “small plate sharing,” which amounts to offering a big group of hungry people something tiny to divvy up. Even nontrendy joints now ask $30 for a proper entree — a price point, according to Mr. Patterson, that encourages even affluent customers to discover the joys of home cooking.

This is all fine at the handful of places that are full and profitable every night — State Bird Provisions, Lazy Bear — but, according to Gwyneth Borden of the Golden Gate Restaurant Association, an alarming number are not. The bigger tech companies worsen the problem by scooping up culinary talent to run lavish free food programs that, as Ms. Borden said, offer workers “all-day bacon and lobster rolls and tacos.” This kills the incentive for employees to spend a penny in restaurants, especially at lunch. (Ms. Borden also told me that she can’t count the number of times she has heard an Uber or Lyft driver confess to being a former chef.)

Constant traffic jams and great restaurants in less congested cities like Oakland discourage suburbanites who used to cross the Bay Bridge for date night in San Francisco. Besides, as Mr. Patterson says, the city clears out on holiday weekends. “They all go to Tahoe,” he said. “You want to get a reservation somewhere? Just book a table during Burning Man.”

by Daniel Duane, NY Times |  Read more:
Image: Mark Pernice

Monday, August 8, 2016

Jo Jankowski, Pool Player
via:

Breaking Baccalaureate

This past spring, I attended a championship story slam with a student I have advised and whom I know well. This student is a gifted writer and a funny, self-deprecating storyteller. I could easily claim that I thought attending the slam might give her insight about a research project I was advising her on. But the truth is that I simply thought she would enjoy the slam and might find an outlet for her own storytelling. The issue of engaging with a student outside of formal class time is, of course, a tricky one these days, especially if the professor is a male and the student a female. I will address the potential pitfalls, as well as the huge opportunities, of engaging with students outside of class in another essay.

So there we were the other night — my student and I — sitting in a small club with about 75 people in the audience, at another story slam. This time I had challenged her to sign up to speak, and she agreed as long as I did the same. About an hour into the story slam, my student’s name was called. She smiled and made her way to the front of the stage. I looked on nervously as she told a funny story about her confusion regarding the men she likes. Her voice was strong and confident, and the audience laughed at the right moments.

When she made her way back to her seat, I stood and clapped and congratulated her. “You were great,” I said. She sat down and seemed pleased, still riding the tail end of a performer’s high. Then came the judges’ ratings: They were far lower than I thought she deserved, lower than the ratings of many of the speakers who preceded her. I was worried. My student can be harshly critical of her writing until it is fully polished. Having encouraged her to speak in front of the crowd in the first place, I didn’t want her to turn overly self-critical or feel dejected by the ratings. And so for the rest of that night, I was clear about my teacher mission: I wanted to celebrate her courage for stepping up to the microphone.

In his recent book, Helping Children Succeed, Paul Tough writes about the startling conclusion of a massive study of teacher effectiveness. According to Northwestern University economist C. Kirabo Jackson, who tracked the performance of 500,000 students in North Carolina over seven years from ninth grade on, there emerged from the data set two categories of highly effective teachers. In the first category were the teachers who consistently raised student test scores. These are the teachers who win awards and receive high evaluations and sometimes bonuses.

But it is the second category of excellent teachers that fascinates me. I’ll call this second group of teachers “nurturers,” though you might also see them as inspirers or motivators. These teachers don’t raise standardized test scores. Rather, their achievements show up as better student attendance, fewer suspensions, higher on-time grade progression, and higher GPAs.

Lest you think that the nurturers are the easy teachers who artificially cheer on students and hand out inflated grades, consider this: The GPAs of students improved not simply while in a nurturer’s class, but also in subsequent classes and in subsequent years as well.

Indeed, when Jackson added up four measures the nurturers excelled at — school attendance, on-time grade progression, suspensions and discipline, and overall GPA — he found these measures to be, in Tough’s words, “a better predictor than a student’s test scores of whether a student would attend college, a better predictor of adult wages and a better predictor of future arrests.”

Of course, many inspiring, motivating, nurturing teachers (and the students they influenced) have long intuited that their good work produced results beyond what was seen on standardized test scores. Ironically, it has taken the arrival of big data to highlight the magnitude of what they accomplish. The term frequently used to describe what students develop working with these nurturing teachers is “non-cognitive skills.” These are skills or traits such as persistence, ability to get along with others, ability to finish a task, ability to show up on time, and ability to manage and recover from failure.

Long before I heard of Jackson’s study, I had become convinced that cultivating non-cognitive skills was one of the best steps I could take to help my students with their academic (cognitive) work and help them long-term in their lives. The first-year students I mostly teach, around ages 18 and 19, often don’t know how to work through a bad day or a bad week or how to talk to a professor when they blow a deadline or miss an assignment. I have long noticed that if a student misses a class, there’s a good chance they will miss another class. The student then feels guilty and too embarrassed to contact me. In short order, the student falls so far behind on assignments that catching up seems overwhelming and impossible. And so they skip class again.

The irony, of course, is that if a student simply comes to class and pulls me aside to explain what is going on in their life, I can help them prioritize what to catch up on and provide words of support. To minimize this problem — the missed class, leading to more missed classes, leading to failure — I now insist that students come to class even if they are unprepared, no penalty attached. But when you’re not prepared, I tell my students, you must approach me before the start of class and tell me so.

Knowing at the start of class that a particular student hasn’t completed a reading allows me to avoid embarrassing that student by calling on them to answer a question related to the assignment. Sometimes I can even take a few seconds to fill in background information so that the student can participate in the class discussion.

I insist that my students come to class when they are not prepared because they can still gain a lot from the class. They will feel connected to the course and to me, and they won’t feel so paralyzed by guilt. They are also much more likely, in my experience, to catch up. Since I started this policy, attendance has increased, but to be fair, I’ve improved in other ways as a teacher, so I can’t chalk up the improved attendance solely to this no-guilt policy about being unprepared. There’s been no decline in the number of students who come to class fully prepared. The requirement that they tell me in person when they are behind is apparently enough to discourage people from abusing that option.

I’ve made other changes that are designed to lure my students out of the binary, good/bad, perfectionist framework that a number of them seem to bring to college from high school. I used to yell at students who were sleeping in my class. These days if I see a student sleeping, I will calmly ask them to take a walk to get some fresh air or I might suggest they get some coffee. The first time I responded to a sleeping student by suggesting coffee and a walk, the student bolted upright. “No, I’m good,” he said. He no doubt sensed a trap. Why would I suggest he get coffee unless it was part of some devious scheme? I told the student there was no penalty for stepping out for a few minutes, but he wouldn’t move.

Finally, I pulled out my wallet, handed the student a few bills, and told him to get me a cup of coffee and to get one for himself if he wanted. It was only after I specified two creams and one sugar that the student relaxed and realized I was not plotting a scheme. Invariably, the times I’ve sent sleepy students out for a walk or for coffee, they have returned within minutes, awake, in a better mood, and able to participate in the class.

My goal in taking this less harsh approach to students is not to be nice. Being “nice” without clear boundaries and limits is a recipe for chaos and student dissatisfaction. My goal is to model for young people how to think maturely, precisely, and creatively about problems they face inside and outside of class. How can I expect them to engage in imaginative thinking on an assignment if I don’t cultivate imaginative thinking on the practical problems they face in class? Yelling at sleeping students, as I did in the old days, didn’t show students how to handle sleepiness. Yelling only made them feel bad, and the result at best was a student who fought to keep their eyes open. Spending all your energy to keep your eyes open leaves little energy for listening, learning, and engaging in the class.

by Robert Anthony Watts, TSS |  Read more:
Image: uncredited

Blues by alvdesign

My Love-Hate Relationship with Medium

[ed. The last paragraph in this article is exactly why I hardly visit Medium anymore. Who wants to wade through a bunch of self-indulgent, self-promoting, whiny posts about - whatever - searching for something of value? And that goes for so many other 'hot' media sites these days: BuzzFeed, Huffington Post, Daily Beast, Vox, Slate, Salon, Tech Crunch, Fast Company, Jezebel, Vice, Vulture, Fusion, Thought Catalog (is that still around?) etc. ... the list goes on and on. Echo chambers mostly, selling click bait and navel gazing, with objectives like those articulated below. At least in the old days, publishers and editors acted as effective gate-keepers to quality journalism (because it mattered and markets responded accordingly). These days, not so much.]

By day, I am a wireless industry analyst and consultant. By night and on weekends, besides being an exercise and outdoors enthusiast, I write running guides. A few years ago, I self-published three books on running in the Boston area. In late 2015, I started a new project called Great Runs, which is a guide to the best places to go running in the world’s major cities and destinations. It’s geared toward travelers who run and runners who travel. This time, I decided to develop the content online, but I wanted more than a traditional blogging platform. A colleague recommended Medium, the online publishing platform started in 2013 by Twitter co-founder Evan Williams.

This has been a love-hate relationship from the get-go: by turns liberating and maddening. I decided to focus a column on Medium because of its potential as a next-generation instrument for writers and readers: ease of use, democratization and social journalism. But Medium also embodies a lot of what’s wrong with the web.

So here’s what’s fantastic. Medium is essentially a Version 2.0 blogging platform, allowing anyone from amateurs to professionals to corporations to post a story. Within five minutes, I was signed up and writing. The site is easy to use and visually elegant. Medium has kept things very simple, with limited formatting options. It’s easy to insert images, and they align and look beautiful. Content is auto-saved nearly constantly. I’ve hired some freelancers to develop content, and it’s easy to add them to Medium and edit their work. Write a piece, press "publish" and ba-bang, it’s out there for everyone to see. Social media sharing tools are well-integrated.

Authors are also interested in community, so the main Medium site has a list of tabs including Editor’s Picks, topics of the day and "For You," which seems to choose articles based primarily on folks I follow on Twitter, LinkedIn contacts and perhaps some relationship to tags in my stories (running, fitness, travel, etc.).

So, in many ways, Medium has been great. I’ve got more than 50 city guides up on the platform, and the responsive Great Runs "site" looks great on PCs, tablets and phones. I didn’t have to get a publisher or hire a web/WordPress/app developer.

And now for the downside. First beef: discovery. Despite some pretty good content and a well-defined target market, getting my stuff discovered on Medium is hard. Really hard. The whole idea of a blog or "social journalism," as I think Ev calls it, is to build an audience. Yes, your Medium content is easily shared with your Twitter followers or your Facebook friends. So it’s great for Lululemon, which already has a huge social media presence. It now has more than 10,000 "followers" on Medium, and tons of folks recommending its content. For brands, established authors, and the companies who are seemingly flocking to Medium, it’s great. Because they already have an audience. (...)

My second major beef is monetization. As a side note, I am curious how Medium itself plans to make money. But as an author on Medium, there is presently no way to make any money from content. Blog sites, WordPress sites and so on all have some opportunity to run ads, host sponsors or sell content. But on Medium, nothing. Not even the ability to direct one’s Medium audience to a site where content could potentially be monetized in some way. (...)

In the end, some of Medium’s greatest benefits are also its biggest liabilities. Anyone can write on Medium. Which means anyone can write on Medium. There needs to be some delineation between the individual who wants to just post the occasional story on Medium and the individual or brand who wants to use Medium for at least semi-professional or business purposes.

by Mark Lowenstein, Recode |  Read more:
Image: Lam L. / Yelp