Tuesday, August 9, 2016

The Singular Joys of Watching Ichiro

Sunday afternoon in Colorado, the Miami Marlins outfielder Ichiro Suzuki tallied a hit for the 3,000th time in his Major League career. Using his trademark batting style—less a swing than a kind of spinning stab, with the left-handed Ichiro already edging out of the batter’s box as bat meets ball—he whistled a pitch into right field, where it caromed off the wall as he ran lightly to third. Fans stood and cheered; Ichiro removed his helmet in acknowledgement; the Marlins left the dugout to congratulate him. He became the thirtieth player in Major League Baseball history to reach the figure, a hallowed number designating the true experts at the task Ted Williams called the toughest in all of sports: hitting a round ball with a round bat.

The 3,000 hit club is home to all sorts of players. Derek Jeter and Alex Rodriguez both belong to it, each having joined with a Yankee Stadium home run. Ty Cobb, the vicious and racist star of baseball’s dead-ball era, is a member, as is Hank Aaron, as dignified a figure as the game has produced. The salty and officially shunned “Hit King” Pete Rose remains atop the leaderboard. When, a couple of months back, Ichiro matched Rose’s mark of 4,256 hits, including his hits in the Japanese professional baseball league, Rose responded with characteristic grouchiness: “It sounds like in Japan, they’re trying to make me the Hit Queen. I’m not trying to take anything away from Ichiro ... but the next thing you know, they’ll be counting his high-school hits.”

Ichiro is not nearly the best player in this group, but he may be the most representative of its spirit of sustained excellence, of moderate success massed into something spectacular over time. At his best, during a decade with the Seattle Mariners, he was a variously gifted talent, a wall-climbing and cannon-armed dynamo in right field, but his core genius was always for sending a baseball just out of reach of the defenders. Hitting, for him, has seemed like a labor of love—as if, were some dramatic rule change to render anything less than a home run useless, he would still go to the plate looking to flick pitches onto patches of open grass. The simplicity and clarity of his purpose have made him one of the most joyful players to watch in the game’s long history. Very few people get to be great at something as difficult as professional baseball. Fewer still get to be great in exactly the way they would like—Ichiro has.

Ichiro has used the same batting technique for his whole career, from his prime in Seattle to his post-prime stops in New York and Miami, but it looks, even on the thousandth viewing, like something he just recently decided to experiment with. Before every pitch, he holds his bat out and tugs up the sleeve on his right arm. Once he assumes his stance—knees pinched, shoulders rounded—his hands hold the bat up behind his ear, wavering in a way that might make anyone with a less impressive résumé seem nervous. At 42 years old, Ichiro still has the scrawniness of an underfed teenager. His left foot hovers before the pitch is released; one wonders, watching, how this mess of thin limbs at strange angles will arrange itself to hit a baseball.

Then the pitch crosses the plate, and it is as if some invisible hand has pulled a string on a gyroscope. Ichiro whirls at the ball. His shoulders fly outward and his feet go askew, but the bat comes through in a calibrated slice. His goal is not to “barrel up” the pitch so much as redirect it, to let its own energy, nudged outward, carry it into the field. All that bodily mayhem has a purpose, too; Ichiro starts running almost as he swings, so he gets to first base remarkably quickly.

It is one of the most singular motions in baseball, the work of someone who has dedicated untold hours to wringing every possible hit from the game. A quiet irony attends this work, though. Ichiro has played his career during a time when the base hit has lost its luster. He first landed in the Majors as a 27-year-old in 2001, in the middle of what would be recognized as the Steroid Era, when players across baseball were muscling up in an effort to land the ball not just between the defenders, but also over the outfield wall. He has kept on through the popularization of advanced statistics, which assert that batting average—the mark that testified, during Ichiro’s peak, to his annual greatness—is not as strong a measure of quality as previously thought. In this context, he is something of a man out of time, his presence next to the rest of baseball’s modern star class as incongruous as a horse and rider on the interstate.

This quality, though, has only added to Ichiro’s appeal. Professional sports have never seemed more like work than they do now. Players spend their lives hunting for an edge, be it technological, chemical, or statistical. They pore over frame-by-frame video and ingest supplements. They change their approaches according to dictates or trends and give post-game interviews that, understandably, have all the joy of an office-job performance review. In this context, someone like Ichiro—who does what he’s always tried to do, again and again, without much care for the sport’s shifting ideological winds—is a welcome throwback. When he plays it, baseball just looks a little bit more like a game.

Judging by his appearance, with his forearms not much bigger around than his wrists and his spidery legs, you might think that Ichiro didn’t have much choice in his style of play, that slapping the ball into the shallow outfield was all he could ever muster. Rumors around baseball have long contradicted this assumption, though. His batting-practice home run displays are the stuff of lore, and writers have speculated that, if Ichiro had opted to sacrifice some of his contact-hitting prowess, he could have been a credible slugger. Barry Bonds, MLB’s all-time home-run leader and the current Marlins hitting coach, is the latest to chime in on this front; before the All-Star game, Bonds said even the aged version of Ichiro could win the annual Home Run Derby if he chose, “easy, hands down.”

by Robert O'Connell, The Atlantic |  Read more:
Image: uncredited

[ed. Truly one of the best.]

Trial by Jury, a Hallowed American Right, Is Vanishing

The criminal trial ended more than two and a half years ago, but Judge Jesse M. Furman can still vividly recall the case. It stands out, not because of the defendant or the subject matter, but because of its rarity: In his four-plus years on the bench in Federal District Court in Manhattan, it was his only criminal jury trial.

He is far from alone.

Judge J. Paul Oetken, in half a decade on that bench, has had four criminal trials, including one that was repeated after a jury deadlocked. For Judge Lewis A. Kaplan, who has handled some of the nation’s most important terrorism cases, it has been 18 months since his last criminal jury trial.

“It’s a loss,” Judge Kaplan said, “because when one thinks of the American system of justice, one thinks of justice being administered by juries of our peers. And to the extent that there’s a decline in criminal jury trials, that is happening less frequently.”

The national decline in trials, both criminal and civil, has been noted in law journal articles, bar association studies and judicial opinions. But recently, in the two federal courthouses in Manhattan and a third in White Plains (known collectively as the Southern District of New York), the vanishing of criminal jury trials has never seemed so pronounced.

The Southern District held only 50 criminal jury trials last year, the lowest since 2004, according to data provided by the court. The pace remains slow this year.

In 2005, records show, there were more than double the number of trials: 106. And decades ago, legal experts said, the numbers were much higher.

“It’s hugely disappointing,” said Judge Jed S. Rakoff, a 20-year veteran of the Manhattan federal bench. “A trial is the one place where the system really gets tested. Everything else is done behind closed doors.”

Legal experts attribute the decline primarily to the advent of the congressional sentencing guidelines and the increased use of mandatory minimum sentences, which transferred power to prosecutors, and discouraged defendants from going to trial, where, if convicted, they might face harsher sentences.

“This is what jury trials were supposed to be a check against — the potential abuse of the use of prosecutorial power,” said Frederick P. Hafetz, a defense lawyer and a former chief of the criminal division of the United States attorney’s office in Manhattan, who is researching the issue of declining trials. (...)

Former Judge John Gleeson, who in March stepped down from the federal bench in Brooklyn to enter private practice, noted in a 2013 court opinion that 81 percent of federal convictions in 1980 were the product of guilty pleas; in one recent year, the figure was 97 percent.

Judge Gleeson wrote that because most pleas are negotiated before a prosecutor prepares a case for trial, the “thin presentation” of evidence needed for indictment “is hardly ever subjected to closer scrutiny by prosecutors, defense counsel, judges or juries.”

“The entire system loses an edge,” he added, “and I have no doubt that the quality of justice in our courthouses has suffered as a result.”

by Benjamin Weiser, NY Times |  Read more:
Image: Anthony Lanzilote

Dinner, Disrupted

Silicon Valley has brought its wrecking ball to haute cuisine, and the results are not pretty.

At a fancy Silicon Valley restaurant where the micro-greens came from a farm called Love Apple, I got a definitive taste of California in the age of the plutocrats. This state — and this native of it — have long indulged a borderline-comic impulse toward self-expression through lifestyle and food, as if success might be a matter of nailing the perfect combination of surf trunks, grilled lamb hearts and sunset views.

For baby boomers who moved to the Bay Area in search of the unfussy good life, in the late 20th century, it was all about squinting just right to make our dry coastal hills look like Provence — per the instructions of the Francophile chefs Jeremiah Tower and Alice Waters of the legendary Berkeley restaurant Chez Panisse.

By the early 2000s, that Eurocentric baby-boomer cuisine enjoyed a prosperous middle age as “market-driven Cal/French/Italian” with an implicit lifestyle fantasy involving an Italianate Sonoma home with goats, a cheese-making barn, vineyards and olive trees, and a code of organic-grass-fed ethics that mapped a reliable boundary between food fit for bourgeois progressives and unclean commodity meats.

Today, Northern California has been taken over by a tech-boom generation with vastly more money and a taste for the existential pleasures of problem solving. The first hints of change appeared in 2005, when local restaurateurs sensed that it was time for a new culinary style with a new lifestyle fantasy. That’s when a leading San Francisco chef named Daniel Patterson published an essay that blamed the “tyranny of Chez Panisse” for stifling Bay Area culinary innovation. Next came the 2009 Fig-Gate scandal in which the chef David Chang, at a panel discussion in New York, said, “Every restaurant in San Francisco is serving figs on a plate with nothing on it.” Northern California erupted with an indignation that Mr. Chang called, in a subsequent interview, “just retardedly stupid.” Mr. Chang added that, as he put it, “People need to smoke more marijuana in San Francisco.”

By this point, I was a food writer of the not-anonymous variety, by which I mean that I joined the search for the next big thing by eating great meals courtesy of magazines and restaurants, without hiding my identity the way a critic would. In 2010, a magazine asked me to profile the extraordinary chef David Kinch of the aforementioned fancy restaurant, which is called Manresa and lies in the affluent suburb of Los Gatos.

I went there twice for work and concentrated both times on the food alone. I was knocked out, especially by a creation called Tidal Pool, which involved a clear littoral broth of seaweed dashi pooling around sea-urchin tongues, pickled kelp and foie gras. I know that I will set off the gag reflex in certain quarters when I confess that, in my view, Mr. Kinch took the sensory pleasure of falling off a surfboard into cold Northern California water and transformed it into the world’s most delicious bowl of Japanese-French seafood soup. Mr. Kinch, I concluded, was the savior sent to bring California cuisine into the 21st century.

Two years later, in December 2012, a magazine editor said that he could expense a Manresa dinner for the two of us. He suggested that we bring (and pay for) our spouses. I had never once eaten at a restaurant of that caliber on my own dime because I did not make nearly enough money. But I liked this editor, I loved Mr. Kinch and I calculated that my one-quarter share of the evening’s total would be $200. I decided to make it a once-in-a-lifetime splurge. After we sat down, Mr. Kinch emerged and said something like, “With your permission, I would love to create a special tasting menu for your table.” Because the editor and I were pampered food-media professionals, we took this to mean something like, Don’t sweat the prices on the menu; let’s have fun, and I’ll make the bill reasonable.

The meal lasted five hours, consisted of more than 20 fantastic courses, and we all felt that we had eaten perhaps the greatest meal of our lives. Then the bill came: $1,200, with tax and tip. It turned out that “a special tasting menu” was a price point marked on the menu. My editor friend confessed that he could charge only $400 to his corporate card, and I felt sick with self-loathing. I knew this was my fault — not Mr. Kinch’s — and I looked around the dining room at loving couples, buoyant double dates, even a family with two young children for whom a thousand-dollar meal was no stretch. I had been a fool in more ways than I could count, including my delusion that one could think and talk about food outside of its social and economic context.

Like any artisan whose trade depends upon expensive materials and endless work, every chef who plays that elite-level game must cultivate patrons. That means surrounding food with a choreographed theater of luxury in which every course requires a skilled server to set down fresh cutlery and then return with clean wine glasses. A midcareer professional sommelier then must fill those wine glasses and deliver a learned lecture about that next wine’s origin and flavor. Another person on a full-time salary with benefits must then set down art-piece ceramic plates that are perfectly selected to flatter the next two-mouthful course. Yet another midcareer professional must then explain the rare and expensive plants and proteins that have been combined through hours of time-consuming techniques to create the next exquisitely dense compression of value that each diner will devour in moments. Those empty plates and glasses must then be cleared to repeat this cycle again and again, hour after hour.

In the case of Northern California, these restaurants must satisfy a venture-capital and post-I.P.O. crowd for whom a $400 dinner does not qualify as conspicuous consumption and for whom the prevailing California-lifestyle fantasy is less about heirloom tomatoes than recognizing inefficiencies in the international medical technology markets, flying first-class around the planet to cut deals at three-Michelin-Star restaurants in Hong Kong or London and then, back home, treating the kids to casual $2,000 Sunday suppers.

The foragers and farmers and fishermen of the old Chez Panisse fantasy still figure, but now as an unseen impecunious peasant horde combing beaches and redwoods for the chanterelles and Santa Barbara spot prawns that genius chefs transform into visionary distillations of a mythical Northern California experience that no successful entrepreneur would waste time living.

In a normal metropolitan area, super-upscale places like Manresa have such narrow profit margins that ambitious young chefs open them mostly to establish their reputations; later, to pay the mortgage, they open a profitable mid-range joint nearby. According to Mr. Patterson, the opposite is now true in tech-boom San Francisco.

“Busy high-end places are doing fine because they have more ways to control their costs, but the mid-level is getting killed,” Mr. Patterson told me. “I’ve heard guys say they’re doing eight million a year in sales and bringing home less than 2 percent as profit.” (...)

I am all in favor of San Francisco’s $13 per hour minimum wage (which rises to $15 by 2018), plus mandatory paid sick leave, parental leave and employer health care contributions. But labor costs at restaurants are inching past 50 percent of total expenditures, an indicator of poor fiscal health. Commercial rents have also gone bananas. Add the ever-rising cost of frisée and pastured quail eggs and it’s no wonder that many restaurants are experimenting with that unique form of sadism known as “small plate sharing,” which amounts to offering a big group of hungry people something tiny to divvy up. Even nontrendy joints now ask $30 for a proper entree — a price point, according to Mr. Patterson, that encourages even affluent customers to discover the joys of home cooking.

This is all fine at the handful of places that are full and profitable every night — State Bird Provisions, Lazy Bear — but, according to Gwyneth Borden of the Golden Gate Restaurant Association, an alarming number are not. The bigger tech companies worsen the problem by scooping up culinary talent to run lavish free food programs that, as Ms. Borden said, offer workers “all-day bacon and lobster rolls and tacos.” This kills the incentive for employees to spend a penny in restaurants, especially at lunch. (Ms. Borden also told me that she can’t count the number of times she has heard an Uber or Lyft driver confess to being a former chef.)

Constant traffic jams and great restaurants in less congested cities like Oakland discourage suburbanites who used to cross the Bay Bridge for date night in San Francisco. Besides, as Mr. Patterson says, the city clears out on holiday weekends. “They all go to Tahoe,” he said. “You want to get a reservation somewhere? Just book a table during Burning Man.”

by Daniel Duane, NY Times |  Read more:
Image: Mark Pernice

Monday, August 8, 2016


Jo Jankowski, Pool Player

Breaking Baccalaureate

This past spring, I attended a championship story slam with a student I have advised and whom I know well. This student is a gifted writer and a funny, self-deprecating storyteller. I could easily claim that I thought attending the slam might give her insight about a research project I was advising her on. But the truth is that I simply thought she would enjoy the slam and might find an outlet for her own storytelling. The issue of engaging with a student outside of formal class time is, of course, a tricky one these days, especially if the professor is a male and the student a female. I will address the potential pitfalls as well as the huge opportunities of engaging with students outside of class in another essay.

So there we were the other night — my student and I — sitting in a small club with about 75 people in the audience, at another story slam. This time I had challenged her to sign up to speak, and she agreed as long as I did the same. About an hour into the story slam, my student’s name was called. She smiled and made her way to the front of the stage. I looked on nervously as she told a funny story about her confusion regarding the men she likes. Her voice was strong and confident, and the audience laughed at the right moments.

When she made her way back to her seat, I stood and clapped and congratulated her. “You were great,” I said. She sat down and seemed pleased, still riding the tail end of a performer’s high. Then came the judges’ ratings: They were far lower than I thought she deserved, lower than the ratings of many of the speakers who preceded her. I was worried. My student can be harshly critical of her writing until it is fully polished. Having encouraged her to speak in front of the crowd in the first place, I didn’t want her to turn overly self-critical or feel dejected by the ratings. And so for the rest of that night, I was clear about my teacher mission: I wanted to celebrate her courage for stepping up to the microphone.

In his recent book, Helping Children Succeed, Paul Tough writes about the startling conclusion of a massive study of teacher effectiveness. According to Northwestern University economist C. Kirabo Jackson, who tracked the performance of 500,000 students in North Carolina over seven years from ninth grade on, there emerged from the data set two categories of highly effective teachers. In the first category were the teachers who consistently raised student test scores. These are the teachers who win awards and receive high evaluations and sometimes bonuses.

But it is the second category of excellent teachers that fascinates me. I’ll call this second group of teachers “nurturers,” though you might also see them as inspirers or motivators. These teachers don’t raise standardized test scores. Rather, their achievements show up as better student attendance, fewer suspensions, higher on-time grade progression, and higher GPAs.

Lest you think that the nurturers are the easy teachers who artificially cheer on students and hand out inflated grades, consider this: The GPAs of students improved not simply while in a nurturer’s class, but also in subsequent classes and in subsequent years as well.

Indeed, when Jackson added up four measures the nurturers excelled at — school attendance, on-time grade progression, suspensions and discipline, and overall GPA — he found these measures to be, in Tough’s words, “a better predictor than a student’s test scores of whether a student would attend college, a better predictor of adult wages and a better predictor of future arrests.”

Of course, many inspiring, motivating, nurturing teachers (and the students they influenced) have long intuited that their good work produced results beyond what was seen on standardized test scores. Ironically, it has taken the arrival of big data to highlight the magnitude of what they accomplish. The term frequently used to describe what students develop working with these nurturing teachers is “non-cognitive skills.” These are skills or traits such as persistence, ability to get along with others, ability to finish a task, ability to show up on time, and ability to manage and recover from failure.

Long before I heard of Jackson’s study, I had become convinced that cultivating non-cognitive skills was one of the best steps I could take to help my students with their academic (cognitive) work and help them long-term in their lives. The first-year students I mostly teach, around ages 18 and 19, often don’t know how to work through a bad day or a bad week or how to talk to a professor when they blow a deadline or miss an assignment. I have long noticed that if a student misses a class, there’s a good chance they will miss another class. The student then feels guilty and too embarrassed to contact me. In short order, the student falls so far behind on assignments that catching up seems overwhelming and impossible. And so they skip class again.

The irony of course is that if a student simply comes to class and pulls me aside to explain what is going on in their life, I can help them prioritize what to catch up on and provide words of support. To minimize this problem — the missed class, leading to more missed classes, leading to failure — I now insist that students come to class even if they are unprepared, no penalty attached. But when you’re not prepared, I tell my students, you must approach me before the start of class and tell me so.

Knowing at the start of class that a particular student hasn’t completed a reading allows me to avoid embarrassing that student by calling on them to answer a question related to the assignment. Sometimes I can even take a few seconds to fill in background information so that the student can participate in the class discussion.

I insist that my students come to class when they are not prepared because they can still gain a lot from the class. They will feel connected to the course and to me, and they won’t feel so paralyzed by guilt. They are also much more likely, in my experience, to catch up. Since I’ve started this policy, I would say my attendance has increased, but to be fair, I’ve improved in other ways as a teacher, so I can’t chalk up the improved attendance solely to this no-guilt policy about being unprepared. There’s been no decline in the number of students who come to class fully prepared. The requirement that they tell me in person when they are behind is apparently enough to discourage people from abusing that option.

I’ve made other changes that are designed to lure my students out of the binary, good/bad, perfectionist framework that a number of them seem to bring to college from high school. I used to yell at students who were sleeping in my class. These days if I see a student sleeping, I will calmly ask them to take a walk to get some fresh air or I might suggest they get some coffee. The first time I responded to a sleeping student by suggesting coffee and a walk, the student bolted upright. “No, I’m good,” he said. He no doubt sensed a trap. Why would I suggest he get coffee unless it was part of some devious scheme? I told the student there was no penalty for stepping out for a few minutes, but he wouldn’t move.

Finally, I pulled out my wallet, handed the student a few bills, and told him to get me a cup of coffee and to get one for himself if he wanted. It was only after I specified two creams and one sugar that the student relaxed and realized I was not plotting a scheme. Invariably, the times I’ve sent sleepy students out for a walk or for coffee, they have returned within minutes, awake, in a better mood, and able to participate in the class.

My goal in taking this less harsh approach to students is not to be nice. Being “nice” without clear boundaries and limits is a recipe for chaos and student dissatisfaction. My goal is to model for young people how to think maturely, precisely, and creatively about problems they face inside and outside of class. How can I expect them to engage in imaginative thinking on an assignment if I don’t cultivate imaginative thinking on the practical problems they face in class? Yelling at sleeping students, as I did in the old days, didn’t show students how to handle sleepiness. Yelling only made them feel bad, and the result at best was a student who fought to keep their eyes open. Spending all your energy to keep your eyes open leaves little energy for listening, learning, and engaging in the class.

by Robert Anthony Watts, TSS |  Read more:
Image: uncredited

Blues by alvdesign

My Love-Hate Relationship with Medium

[ed. The last paragraph in this article is exactly why I hardly visit Medium anymore. Who wants to wade through a bunch of self-indulgent, self-promoting, whiny posts, about - whatever - searching for something of value? And that goes for so many other 'hot' media sites these days: BuzzFeed, Huffington Post, Daily Beast, Vox, Slate, Salon, Tech Crunch, Fast Company, Jezebel, Vice, Vulture, Fusion, Thought Catalog (is that still around?) etc. ... the list goes on and on. Echo chambers mostly, selling click bait and navel gazing, with objectives like those articulated below. At least in the old days publishers and editors acted as effective gate-keepers to quality journalism (because it mattered and markets responded accordingly). These days, not so much.]

By day, I am a wireless industry analyst and consultant. By night and on weekends, besides being an exercise and outdoors enthusiast, I write running guides. A few years ago, I self-published three books on running in the Boston area. In late 2015, I started a new project called Great Runs, which is a guide to the best places to go running in the world’s major cities and destinations. It’s geared toward travelers who run and runners who travel. This time, I decided to develop the content online, but I wanted more than a traditional blogging platform. A colleague recommended Medium, the online publishing platform started in 2013 by Twitter co-founder Evan Williams.

This has been a love-hate relationship from the get-go. By turns liberating and also maddening. I decided to focus a column on Medium because of its potential as a next-generation instrument for writers and readers: Ease of use, democratization and social journalism. But Medium also embodies a lot of what’s wrong with the web.

So here’s what’s fantastic. Medium is essentially a Version 2.0 blogging platform, allowing anyone from amateurs to professionals to corporations to post a story. Within five minutes, I was signed up and writing. The site is easy to use and visually elegant. Medium has kept things very simple, with limited formatting options. It’s easy to insert images, and they align and look beautiful. Content is auto-saved nearly constantly. I’ve hired some freelancers to develop content, and it’s easy to add them to Medium and edit their work. Write a piece, press "publish" and ba-bang, it’s out there for everyone to see. Social media sharing tools are well-integrated.

Authors are also interested in community, so the main Medium site has a list of tabs including Editor’s Picks, topics of the day and "For You," which seems to choose articles based primarily on folks I follow on Twitter, LinkedIn contacts and perhaps some relationship to tags in my stories (running, fitness, travel, etc.).

So, in many ways, Medium has been great. I’ve got more than 50 city guides up on the platform, and the responsive Great Runs "site" looks great on PCs, tablets and phones. I didn’t have to get a publisher or hire a web/WordPress/app developer.

And now for the downside. First beef: Discovery. Despite some pretty good content and a well-defined target market, getting my stuff discovered on Medium is hard. Really hard. The whole idea of a blog or "social journalism," as I think Ev calls it, is to build an audience. Yes, your Medium content is easily shared with your Twitter followers or your Facebook friends. So it’s great for Lululemon, which already has a huge social media presence. It now has more than 10,000 "followers" on Medium, and tons of folks recommending its content. For brands, established authors, and the companies who are seemingly flocking to Medium, it’s great. Because they already have an audience. (...)

My second major beef is monetization. As a side note, I am curious how Medium itself plans to make money. But as an author on Medium, there is presently no way to make any money from content. Blog sites, WordPress sites and so on all have some opportunity to run ads, host sponsors or sell content. But on Medium, nothing. Not even the ability to direct one’s Medium audience to a site where content could potentially be monetized in some way. (...)

In the end, some of Medium’s greatest benefits are also its biggest liabilities. Anyone can write on Medium. Which means anyone can write on Medium. There needs to be some delineation between the individual who wants to just post the occasional story on Medium and the individual or brand who wants to use Medium for at least semi-professional or business purposes.

by Mark Lowenstein, Recode |  Read more:
Image: Lam L. / Yelp

How To Know If You’re Ready To Do A Tweetstorm


Tweetstorms! They’re in the news, for some reason. It seems like everyone’s doing them! Maybe you want to be a part of this magic social media experience too! Sure, go ahead! But before you start a tweetstorm, ask yourself the following questions, which should help you focus on the task at hand and really determine whether you are ready to speak to the masses.

1. Is what I have to say important?

2. Is it worth the time it will take for me to type it out?

3. Do its contents reflect an actual desire to impart knowledge on my part or is it just an obvious cry for attention?

4. Will what I share bring anything new to the conversation or am I just saying the same things everyone else has said before, but I am convinced they are more important because they are coming from me?

5. In a world which is already drowning in words that have no more meaning than a desire for acknowledgment on the part of the people who put them out there, will the torrent of verbiage I am about to inflict just add to the toxic landfill of the Internet?

6. Am I sure my opinion is so worthwhile that other people want to sit through a continuous short-burst monologue of it?

7. Twenty minutes after I post it and the likes and replies start trailing off will I feel empty at first and then increasingly embarrassed by the sheer narcissism it took to offer up my vapid and poorly-processed attempts at instruction? Will my cheeks burn with shame?

8. Who do I think I am? Why do I feel so alone? Am I about to cry?

by Alex Balk, The Awl |  Read more:
Image: Zooey

Sunday, August 7, 2016




Man and Superman

In athletic competitions, what qualifies as a sporting chance?

Toward the end of “The Sports Gene” (Penguin/Current), David Epstein makes his way to a remote corner of Finland to visit a man named Eero Mäntyranta. Mäntyranta lives in a small house next to a lake, among the pine and spruce trees north of the Arctic Circle. He is in his seventies. There is a statue of him in the nearby village. “Everything about him has a certain width to it,” Epstein writes. “The bulbous nose in the middle of a softly rounded face. His thick fingers, broad jaw, and a barrel chest covered by a red knit sweater with a stern-faced reindeer across the middle. He is a remarkable-looking man.” What’s most remarkable is the color of his face. It is a “shade of cardinal, mottled in places with purple,” and evocative of “the hue of the red paint that comes from this region’s iron-rich soil.”

Mäntyranta carries a rare genetic mutation. His DNA has an anomaly that causes his bone marrow to overproduce red blood cells. That accounts for the color of his skin, and also for his extraordinary career as a competitive cross-country skier. In cross-country skiing, athletes propel themselves over distances of ten and twenty miles—a physical challenge that places intense demands on the ability of their red blood cells to deliver oxygen to their muscles. Mäntyranta, by virtue of his unique physiology, had something like sixty-five per cent more red blood cells than the normal adult male. In the 1960, 1964, and 1968 Winter Olympic Games, he won a total of seven medals—three golds, two silvers, and two bronzes—and in the same period he also won two world-championship victories in the thirty-kilometre race. In the 1964 Olympics, he beat his closest competitor in the fifteen-kilometre race by forty seconds, a margin of victory, Epstein says, “never equaled in that event at the Olympics before or since.”

In “The Sports Gene,” there are countless tales like this, examples of all the ways that the greatest athletes are different from the rest of us. They respond more effectively to training. The shape of their bodies is optimized for certain kinds of athletic activities. They carry genes that put them far ahead of ordinary athletes.

Epstein tells the story of Donald Thomas, who on the seventh high jump of his life cleared 7’ 3.25″—practically a world-class height. The next year, after a grand total of eight months of training, Thomas won the world championships. How did he do it? He was blessed, among other things, with unusually long legs and a strikingly long Achilles tendon—ten and a quarter inches in length—which acted as a kind of spring, catapulting him high into the air when he planted his foot for a jump. (Kangaroos have long tendons as well, Epstein tells us, which is what gives them their special hop.)

Why do so many of the world’s best distance runners come from Kenya and Ethiopia? The answer, Epstein explains, begins with weight. A runner needs not just to be skinny but—more specifically—to have skinny calves and ankles, because every extra pound carried on your extremities costs more than a pound carried on your torso. That’s why shaving even a few ounces off a pair of running shoes can have a significant effect. Runners from the Kalenjin tribe, in Kenya—where the majority of the country’s best runners come from—turn out to be skinny in exactly this way. Epstein cites a study comparing Kalenjins with Danes; the Kalenjins were shorter and had longer legs, and their lower legs were nearly a pound lighter. That translates to eight per cent less energy consumed per kilometre. (For evidence of the peculiar Kalenjin lower leg, look up pictures of the great Kenyan miler Asbel Kiprop, a tall and elegant man who runs on what appear to be two ebony-colored pencils.) According to Epstein, there’s an evolutionary explanation for all this: hot and dry environments favor very thin, long-limbed frames, which are easy to cool, just as cold climates favor thick, squat bodies, which are better at conserving heat.

Distance runners also get a big advantage from living at high altitudes, where the body is typically forced to compensate for the lack of oxygen by producing extra red blood cells. Not too high up, mind you. In the Andes, for example, the air is too rarefied for the kind of workouts necessary to be a world-class runner. The optimal range is six to nine thousand feet. The best runners in Ethiopia and Kenya come from the ridges of the Rift Valley, which, Epstein writes, are “plumb in the sweet spot.” When Kenyans compete against Europeans or North Americans, the Kenyans come to the track with an enormous head start.

What we are watching when we watch élite sports, then, is a contest among wildly disparate groups of people, who approach the starting line with an uneven set of genetic endowments and natural advantages. There will be Donald Thomases who barely have to train, and there will be Eero Mäntyrantas, who carry around in their blood, by dumb genetic luck, the ability to finish forty seconds ahead of their competitors. Élite sports supply, as Epstein puts it, a “splendid stage for the fantastic menagerie that is human biological diversity.” The menagerie is what makes sports fascinating. But it has also burdened high-level competition with a contradiction. We want sports to be fair and we take elaborate measures to make sure that no one competitor has an advantage over any other. But how can a fantastic menagerie ever be a contest among equals?

During the First World War, the U.S. Army noticed a puzzling pattern among the young men drafted into military service. Soldiers from some parts of the country had a high incidence of goitre—a lump on their neck caused by the swelling of the thyroid gland. Thousands of recruits could not button the collar of their uniform. The average I.Q. of draftees, we now suspect, also varied according to the same pattern. Soldiers from coastal regions seemed more “normal” than soldiers from other parts of the country.

The culprit turned out to be a lack of iodine. Iodine is an essential micronutrient. Without it, the human brain does not develop normally and the thyroid begins to enlarge. And in certain parts of the United States in those years there wasn’t enough iodine in the local diet. As the economists James Feyrer, Dimitra Politi, and David Weil write, in a recent paper for the National Bureau of Economic Research:
Ocean water is rich in iodine, which is why endemic goiter is not observed in coastal areas. From the ocean, iodine is transferred to the soil by rain. This process, however, only reaches the upper layers of soil, and it can take thousands of years to complete. Heavy rainfall can cause soil erosion, in which case the iodine-rich upper layers of soil are washed away. The last glacial period had the same effect: iodine-rich soil was substituted by iodine-poor soil from crystalline rocks. This explains the prevalence of endemic goiter in regions that were marked by intense glaciation, such as Switzerland and the Great Lakes region.
After the First World War, the U.S. War Department published a report called “Defects Found in Drafted Men,” which detailed how the incidence of goitre varied from state to state, with rates forty to fifty times as high in places like Idaho, Michigan, and Montana as in coastal areas.

The story is not dissimilar from Epstein’s account of Kenyan distance runners, in whom accidents of climate and geography combine to create dramatic differences in abilities. In the early years of the twentieth century, the physiological development of American children was an example of the “fantastic menagerie that is human biological diversity.”

In this case, of course, we didn’t like the fantastic menagerie. In 1924, the Morton Salt Company, at the urging of public-health officials, began adding iodine to its salt, and initiated an advertising campaign touting its benefits. That practice has been applied successfully in many developing countries in the world: iodine supplementation has raised I.Q. scores by as much as thirteen points—an extraordinary increase. The iodized salt in your cupboard is an intervention in the natural order of things. When a student from the iodine-poor mountains of Idaho was called upon to compete against a student from iodine-rich coastal Maine, we thought of it as our moral obligation to redress their natural inequality. The reason debates over élite performance have become so contentious in recent years, however, is that in the world of sport there is little of that clarity. What if those two students were competing in a race? Should we still be able to give the naturally disadvantaged one the equivalent of iodine? We can’t decide.

Epstein tells us that baseball players have, as a group, remarkable eyesight. The ophthalmologist Louis Rosenbaum tested close to four hundred major- and minor-league baseball players over four years and found an average visual acuity of about 20/13; that is, the typical professional baseball player can see at twenty feet what the rest of us can see at thirteen feet. When Rosenbaum looked at the Los Angeles Dodgers, he found that half had 20/10 vision and a small number fell below 20/9, “flirting with the theoretical limit of the human eye,” as Epstein points out. The ability to consistently hit a baseball thrown at speeds approaching a hundred miles an hour, with a baffling array of spins and curves, requires the kind of eyesight commonly found in only a tiny fraction of the general population.

Eyesight can be improved—in some cases dramatically—through laser surgery or implantable lenses. Should a promising young baseball player cursed with normal vision be allowed to get that kind of corrective surgery? In this instance, Major League Baseball says yes. Major League Baseball also permits pitchers to replace the ulnar collateral ligament in the elbow of their throwing arm with a tendon taken from a cadaver or elsewhere in the athlete’s body. Tendon-replacement surgery is similar to laser surgery: it turns the athlete into an improved version of his natural self.

But when it comes to drugs Major League Baseball—like most sports—draws the line. An athlete cannot use a drug to become an improved version of his natural self, even if the drug is used in doses that are not harmful, and is something that—like testosterone—is no more than a copy of a naturally occurring hormone, available by prescription to anyone, virtually anywhere in the world.

Baseball is in the middle of one of its periodic doping scandals, centering on one of the game’s best players, Alex Rodriguez. Rodriguez is among the most disliked players of his generation. He tried to recover from injury and extend his career through illicit means. (He has appealed his recent suspension, which was based on these allegations.) It is hard to think about Rodriguez, however, and not think about Tommy John, who, in 1974, was the first player to trade in his ulnar collateral ligament for an improved version. John used modern medicine to recover from injury and extend his career. He won a hundred and sixty-four games after his transformation, far more than he did before science intervened. He had one of the longest careers in baseball history, retiring at the age of forty-six. His bionic arm enabled him to win at least twenty games a season, the benchmark of pitching excellence. People loved Tommy John. Maybe Alex Rodriguez looks at Tommy John—and at the fact that at least a third of current major-league pitchers have had the same surgery—and is genuinely baffled about why baseball has drawn a bright moral line between the performance-enhancing products of modern endocrinology and those offered by orthopedics.

The other great doping pariah is Lance Armstrong. He apparently removed large quantities of his own blood and then re-infused himself before competition, in order to boost the number of oxygen-carrying red blood cells in his system. Armstrong wanted to be like Eero Mäntyranta. He wanted to match, through his own efforts, what some very lucky people already do naturally and legally. Before we condemn him, though, shouldn’t we have to come up with a good reason that one man is allowed to have lots of red blood cells and another man is not?

by Malcolm Gladwell, New Yorker |  Read more:
Image: Barry Blitt 

Middle-Aged Malaise

There’s a moment in Jay McInerney’s new novel, “Bright, Precious Days” (Knopf), when one of its principals, a book editor in his early fifties, comes to feel that he is a failure: “How was it that after working so hard and by many measures succeeding and even excelling in his chosen field, he couldn’t afford to save this house that meant so much to his family? Their neighbors seemed to manage, thousands of people no smarter than he was—less so, most of them—except perhaps in their understanding of the mechanics of acquisition.”

“Bright, Precious Days” completes a trilogy that began with “Brightness Falls” (1992), McInerney’s most accomplished and ambitious novel, and continued with “The Good Life” (2006). The three books revolve around Russell and Corrine Calloway, an attractive couple whose lives appear to be very nearly charmed. But the Calloways are restless types who have the misfortune of living on a certain “skinny island” where affluent professionals like them feel comparatively poor. In “Brightness Falls,” Russell became caught up in the leveraged buyout frenzy of the nineteen-eighties and recklessly attempted to buy the publishing company where he worked. “The Good Life” picked up the Calloways’ story fourteen years later, around the time of 9/11, when Corrine became involved with a man she met while volunteering at Ground Zero.

The new book, like its predecessors, is set against a major historical event—in this case, the financial crisis of 2008. The Calloways are still together. Russell now heads a small, independent publishing house with a focus on literary fiction. He yearns to make the company more profitable, but his big move in that direction backfires, humiliatingly. In the course of the novel, Russell, once brash and exuberant, is brought so low that, when Corrine spots him unexpectedly one day, she is thrown by “his slumped comportment, his slack demeanor, even by the gray in his hair. . . . He looked like one of those exhausted souls she saw every day on the subway, men she imagined stuck in jobs they hated, going home to wives they didn’t love.”

The final touch in this portrait of middle-aged malaise comes when Russell takes part in a ceremonial softball game in the Hamptons. A natural athlete, he sees the media-saturated event as a chance to redeem himself before the glitterati, if only for the duration of the game. But Russell plays badly, flubbing a key catch and allowing two decisive runs. When Corrine tries to cheer him up, Russell tells her not to bother:
“That was possibly the most mortifying moment of my adult life,” he added. 
“Oh, come on, it’s just a game.” 
“No, it’s not. It’s never just a game.” 
Nobody has a more exquisite appreciation than McInerney of the morbid, hypervigilant sensitivity we tend to harbor about our place in the world, especially when we’re feeling down.

Russell’s crisis of confidence coincides with Corrine’s renewed involvement with her attentive—and rich—love interest from “The Good Life.” (Though Russell has been guilty in the earlier books of his own indiscretions, he has grown too tired, or dejected, to bother with infidelity.) The contrast between these two story lines, and the picture that emerges of a marriage that seems both more stable and lonelier than it has ever been, is quietly affecting. The secret romantic longings and professional disappointments of people like the Calloways, who spend summers in the Hamptons and live in a Tribeca loft (albeit a rent-stabilized one), might seem too frivolous to be placed at the foreground of a novel, let alone three. But McInerney rejects satire’s self-protective distancing as surely as he resists its flattening effect on characterization; in tone, “Bright, Precious Days” is mellow, earnest, almost elegiac. It is intelligent, and knowing in its depiction of certain segments of New York (especially the world of publishing), but, unlike his best-known novels, it’s rarely dazzling.

That an author famous for slick, stylish evocation of drug-addled youth has evolved into a restrained, almost sombre chronicler of professional-class ennui may seem surprising. “Bright, Precious Days” is a far cry from “Bright Lights, Big City,” the novel that made McInerney an instant celebrity in 1984, at the age of twenty-nine. But, underneath the glamour and flash of his subject matter, he has always been a more committed psychological novelist than his reputation suggests.

Even “Bright Lights,” that most giddily evocative of eighties novels, isn’t really a period piece. It’s a highly disciplined work of fiction that happens to capture its period. That’s why it has aged better than the Brat Pack titles it’s typically associated with. Unlike some of those books, “Bright Lights” relies far less on the timeliness of its material than on the energy of its prose:
The night has already turned on that imperceptible pivot where two a.m. changes to six a.m. . . . Somewhere back there you could have cut your losses, but you rode past that moment on a comet trail of white powder and now you are trying to hang on to the rush. Your brain at this moment is composed of brigades of tiny Bolivian soldiers. They are tired and muddy from their long walk through the night. There are holes in their boots and they are hungry. They need to be fed.
McInerney maintains this brisk, moody comedy for the next hundred and eighty pages, as his unnamed narrator unravels in a bender.

The real drama of “Bright Lights” is not sociological. The narrator, however blitzed, thinks of himself as being, really, “the kind of guy who wakes up early on Sunday morning and steps out to cop the Times and croissants. Who might take a cue from the Arts and Leisure section and decide to check out an exhibition—costumes of the Hapsburg Court at the Met, say, or Japanese lacquerware of the Muromachi period at the Asia Society.” The disconnect between the narrator’s life and his almost comically staid vision of it is at the heart of the book. Why, McInerney earnestly wants to know, has this man lost his upper-middle-class bearings—why is he at a trashy night club in the middle of the night, chatting up a woman whose “voice is like the New Jersey State Anthem played through an electric shaver,” instead of living wholesomely and finding a nice girl (an editorial assistant, maybe, or a graduate student at an Ivy League school) to take to those exhibitions he imagines himself attending?

If this buttoned-up vision of the good life isn’t entirely convincing, neither is the answer McInerney offers—that the narrator is reeling from a family tragedy he hasn’t properly dealt with. The oversimplicity of this diagnosis wasn’t lost on McInerney, who has spent most of his career returning to the same questions, growing increasingly sophisticated in his attempts to understand the allure of self-destruction and the compromises required to support a sustainable degree of happiness for ambitious, intelligent (and relatively affluent) people.

by Adelle Waldman, New Yorker | Read more:
Image: Goodreads

Smart Drugs Made Me Dumber

[ed. For more on nootropics, see also: Nootropics Survey Results]

Why can’t an overworked, overstimulated, pharmaceutically-obsessed society just come up with the perfect chemical supplement to make us smarter, faster, and more engaged in today’s accelerated world? One without the self-destructive trade-offs like crashing hard, addiction, or eating your neighbor’s face?

Well, according to such cognitive boost energy junkies as stock marketeers, Silicon Valley wizards, and that CEO dude who advocates putting butter in your coffee to amplify caffeine’s effects, we have. Modafinil, a well-documented object of lust for the life hacking set, is a nootropic (smart drug) regularly prescribed for narcolepsy or “shift work disorder” that has been hailed as delivering intense focus, enhancing memory, and even seeming to actually increase intelligence. Much like, say, Adderall, but without the drag of being pharmaceutical grade meth. Never mind that both have been shown to exacerbate sociopathic tendencies—a small price to pay for (temporarily) being your best self, right? Besides, it’s supposed to be lonely at the top.

A few years ago Modafinil became available in a generic form, meaning that suddenly Cephalon, the pharmaceutical juggernaut behind it, was about to lose their corner on the market. So what did they do? They introduced Armodafinil, aka the brand named Nuvigil. Newer, better, faster, stronger, and longer lasting, just like you will be when you take it.

But is it really better? Do any of these things really work? Would choking down an oblong white horse pill really thrust me across the DMZ of normal people and smack into genius territory? There’s only one way to find out—obviously I was gonna have to take some.

A little about me:

I just turned 41, and, honestly, it hurts a little. Turning 30 was fine, and 40 just slipped by, but 41… You’re forced to acknowledge you’ve somehow flipped over to the backside of life. Stereotypes about getting older—losing your keys, forgetting things, assorted aches and pains, a disconnect with what the kids are into these days—stop being stereotypes and start revealing themselves as small, bitter truths.

Working, as I do, in an industry dominated by hyper-intelligent workaholic twentysomethings makes these aging effects even more stark; there are times I’ll think the day’s winding down and be looking for a cozy place to nap while co-workers are just getting fired up. Luckily we live in an age when there are all manner of handy supplementals to jump-start the increasingly shrinking gelatinous bag of neurons sloshing around inside our skulls, and I’m not shy about doubling down on recommended dosages if that’s what I need to get ’er done.

But I also hate the inevitable spiral of despair and exhaustion that comes with mega-doses of external energy, and, as I get older, the consequences have become harder and harder to bear. Lately there are days when even the insides of my bones are tired, weighing me down as though filled with lead weights soaked in mononucleosis and apathy. Have you ever been so lethargic that texting is the physical equivalent of summiting Everest without oxygen? Welcome to my Wednesday.

Also, before we continue, it should be noted that neither The Daily Beast nor the author recommend dosing yourself with quasi-legal substances in the pursuit of performance or pleasure. I’m only doing so because A) journalism and B) how can these hipster smart drugs really be any worse than a few dozen Chicken McNuggets or gulping down enough THC to turn Maureen Dowd into the Mad Hatter? I’m a professional, after all.

Turns out that in today’s world it’s beyond easy to get your sweaty paws on a sample blister pack of Armodafinil pills. I didn’t even have to order them quasi-legally from a Canadian pill mill. Thus it was with a shrug, a smile, and a glass of cold-brew coffee to get it all going that I tossed back 250 milligrams—the maximum recommended dosage—and set out to start a regular, but soon to be extraordinary, day.

I have to admit I was excited at the prospect of Nuvigil being the ultimate cure-all to arrest my slow devolution back to primordial ooze. Would it help me once again harness the full power of my once at least moderately sharp mind? Had some genius alchemists finally delivered on their promise of better living through chemistry, enough to bridge the gap between my sputtering middle-aged brain and the can’t-come-soon-enough onset of the Singularity?

Sadly, it appears not.

One hour after popping that first pill, the cold-brew coffee was beginning to wear off.

Which is normal. What’s immediately suspicious, however, is the total lack of energy coming up behind it. Had I taken Adderall, the creeping doldrums of caffeine depletion would have been chased away by a central nervous system kickstart so acute it’d immediately send you running to deposit any dead weight in the nearest toilet, a joyful evacuation heralding the forthcoming clean and clear mental state. Yet here I sat, waiting patiently, and… Nothing.

I gave it another fifteen minutes before tossing back the second 250 mg tablet dry, choking it down like a dope fiend discovering a forgotten Oxy in the desert. Stymied, I paced around my home office, tapping out a few short emails and growing more and more impatient. Where was my superhuman computing power? I checked the expiration date on the blister pack—there were years before they went bad. I was in the midst of Googling “too dumb for smart drugs” when things got weird.

First, my stomach flip-flopped. Almost like having gas, but without the bloating. Actually, it was more like I was suddenly aware of my stomach, like I could feel with it the same as I do my hands or feet. The feeling radiated outward, an intensity that made my whole body hold its breath in anticipation, every pore and follicle puckering inward. It was similar to eating super-strong MDMA, the big buildup before the epic meltdown.

“Here we go,” I thought, clenching and unclenching my fists and chewing on my top lip, fantasizing about all of the brilliance I was about to unleash upon the world.

Amphetamines such as Adderall work by increasing levels of monoamine neurotransmitters in the brain, most notably dopamine and norepinephrine. That’s what produces the laser-like focus, as well as the intense, euphoric highs. By contrast, how Armodafinil and Modafinil act on human physiology remains largely a mystery; scientists still can’t say exactly why they do what they do. Yet they’re approved by the good ol’ FDA, and many of our nation’s best and brightest shovel these pills down their gullets like technicolor sugary cereal on a Saturday morning—possibly even including President Obama himself.

My breaths are coming in short, deep gasps, and I’m suddenly aware of all the hair on my body growing, pushing inexorably outward from my dermis. Jaw clenched, I sit down on the couch and try to distract myself from the intense oversensitivity by flipping through social media, but it doesn’t help. I can’t seem to lock my attention onto anything—it’s like sudden-onset ADHD mixed with binge-eating a pound of Sour Patch Kids. Yet even through the discomfort, I’m excited. It’s happening! Soon, I’ll be amongst the Mensa crowd, if only for a little while. My body temperature increases, hot from the inside, my ears venting heat like twin chimneys.

And then, as quickly as it all came on, it’s gone.

by James Joiner, The Daily Beast |  Read more:
Image: Shutterstock

Saturday, August 6, 2016

How an Archive of the Internet Could Change History

A few years ago, the Brooklyn Museum put on a Keith Haring exhibition, with a focus on his early career. There were videos of Haring at work, feverishly painting his way across an enormous scroll, and a room filled with drawings he illegally chalked in subway stations. But most stunning, at least to me, were Haring’s notebooks. They were displayed under clear cubes, their well-worn sheets pinned open for visitors to study.

The notebooks were sublimely surreal, filled with dogs crawling beneath bulbous U.F.O.s and penises ejaculating alongside concave cylinders that looked like nuclear cooling towers. By the time I first encountered Haring’s work as a teenager, his artistic legacy had been reduced to catchy imagery of colorful, blocky bodies hugging and dancing on T-shirts. But the notebooks showed what nagged at the artist, what motivated him. I saw someone so suspicious of government surveillance that he often wrote in secret code, someone obsessed with the subversive power of gay sex and someone working to merge his skepticism of capitalism with a deep-rooted desire for fame and commercial appeal.

I left with an urgent curiosity about what sort of artifacts we would display a few decades from now, for future generations to discover. Our contemporary analogues to the personal notebook now live on the web — communal, crowdsourced and shared online in real time. Some of the most interesting and vital work I come across exists only in pixels. Tumblr, for example, contains endless warrens of critical theory about trans identity politics and expression, one of the few havens on the web where that sort of discourse exists. Many of the short videos on Vine feel as though they belong to an ever-evolving, completely new genre of modern folk art. Some of the most clever commentary on pop culture and politics is thriving deep in hashtags on Twitter. Social media is as essential to understanding the preoccupations and temperature of our time as Haring’s notebooks were for his. But preserving materials from the internet is much harder than sealing them under glass.

Building an archive has always required asking a couple of simple but thorny questions: What will we save and how? Whose stories are the most important and why? In theory, the internet already functions as a kind of archive: Any document, video or photo can in principle remain there indefinitely, available to be viewed by anyone with a connection. But in reality, things disappear constantly. Search engines like Google continually trawl for pages to organize and index for retrieval, but they can’t catch everything. And as the web evolves, it becomes harder to preserve. It is estimated that 75 percent of all websites are inactive, and domains are abandoned every day. Links can rot when sites disappear, images vanish when servers go offline and fluctuations in economic tides and social trends can wipe out entire ecosystems. (Look up a blog post from a decade ago and see how many of the images, media or links still work.) Tumblr and even Twitter may eventually end up ancient internet history because of their financial instability.

There are scattered efforts to preserve digital history. Rhizome, an arts nonprofit group, built a tool called Webrecorder to save parts of today’s internet for future generations. The Internet Archive’s Wayback Machine has archived hundreds of billions of web pages. But there’s still a low-grade urgency to save our social media for posterity — and the need is particularly pressing in cases in which social media itself had a profound influence on historic events. (...)
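[ed. A concrete aside of my own, not the article's: the Wayback Machine exposes a public "availability" endpoint that reports whether a page has an archived snapshot, which is one small way to see link rot, or its repair, in action. The minimal sketch below is an illustration only; the endpoint URL, the JSON field names and the example blog-post address are assumptions based on my understanding of that API, not anything documented in the article.]

# Rough sketch: ask the Internet Archive whether a page has an archived copy.
# The endpoint and the JSON shape ("archived_snapshots" -> "closest") are my
# assumptions about the public Wayback availability API; the example URL below
# is hypothetical.
import json
from typing import Optional
from urllib.parse import urlencode
from urllib.request import urlopen

def closest_snapshot(page_url: str, timestamp: Optional[str] = None) -> Optional[str]:
    """Return the URL of the archived snapshot closest to `timestamp`, or None."""
    params = {"url": page_url}
    if timestamp:  # optional target date, e.g. "20060801" for August 2006
        params["timestamp"] = timestamp
    query = "https://archive.org/wayback/available?" + urlencode(params)
    with urlopen(query, timeout=10) as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

if __name__ == "__main__":
    # Look for a decade-old copy of a (hypothetical) blog post.
    print(closest_snapshot("http://example.com/2006/08/old-post.html", "20060801")
          or "No archived copy found.")

[ed. Run against a ten-year-old link, that either returns a web.archive.org address or nothing at all — which is the whole problem in miniature.]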

Social media might one day offer a dazzling, and even overwhelming, array of source material for historians. Such an abundance presents a logistical challenge (the total number of tweets ever written is nearing half a trillion) as well as an ethical one (will people get to opt out of having ephemeral thoughts entered into the historical record?). But this plethora of new media and materials may function as a totally new type of archive: a multidimensional ledger of events that academics, scholars, researchers and the general public can parse to generate a more prismatic recollection of history. (...)

The internet is pushing us — in good ways and in bad — to realize that the official version of events shouldn’t always be trusted or accepted without question. And historians are constantly updating the record by looking for primary sources that were overlooked in earlier eras, often from marginalized figures. These days, such omissions will still happen, but we can catch them faster. Oversights that would have taken decades to correct are now resolved in weeks, even hours. We now get a kaleidoscopic view of events as they unfold, often in real time, on our screens and devices. History is not neutral or synonymous with truth, but the internet affords us a newfound vantage on the totality of passing time — the profound implications of which we are just now beginning to grasp.

by Jenna Wortham, NY Times |  Read more:
Image: Adam Ferriss

[ed. From the comments section: MMonck:]

"We’ve long known that this is how human history works — an unimaginable number of small stories, compressed into one big one. But maybe now we finally have the ability to record and capture them all, and history can become something else entirely: not a handful of voices, but a cacophony."

Great summation. I largely agree. However, I couldn't disagree more with the point about "...capture them all."

I had my Twitter and Facebook accounts deleted a couple of years ago. Does this make me, and the millions of others who have done the same, like the dead trees that never made a sound when they fell because no one was there to hear them?

Is the Internet, especially the narcissist-infested Twitter, the true source of history? The cacophony that makes up history goes way beyond the Internet... it lives in personal files, notebooks and photos (electronic and otherwise) that are never "shared" and "liked" on the Internet.

As a former researcher of technology and Internet use, I know that what makes up the content of the Internet, even today, is such an incredibly skewed point of view that I fear for anyone a hundred years from now drawing any conclusions about the totality of the human voice and experience based on the Internet.


Rolling Stones


[ed. Country (... or is it Western?), from some English guys.]

Massive Attack, feat. Young Fathers


[ed. Definitely not for the squeamish. But Rosamund Pike (Gone Girl) is amazing.]

Krzysztof Baranowski, Seoul Cityscape 4761

Friday, August 5, 2016

Brazil: The New Frontier?

Of all the things golf has going for it, one thing it doesn't have is an Olympic heritage. Golf was included in the half-assed 1900 Games in France (historians call them the farcical Olympics), but just barely. There was a stroke-play event with all of 12 competitors, several of whom didn't realize that it was connected to something called the Olympics. There was also a full-handicap event won by a vacationing American from St. Louis named Albert Lambert. Wealthier than he was skilled, Lambert was nonetheless so delighted with his medal that, when St. Louis was awarded the 1904 Games, he managed to get golf included. Lambert's two claims to actual fame are that his company (later Warner-Lambert) invented Listerine and that he was the main sponsor of Charles Lindbergh's trans-Atlantic flight. He gave enough money to the effort that the St. Louis airport was eventually named for him. The golf on display at the St. Louis Olympics was not exactly international in scope, contested as it was by 74 Americans and three Canadians. One of the Canadians, George Lyon, won.

Similarly, of all the things golf has going for it, it doesn't have much of a foothold in this year's Olympic host country, where the sport returns to competition. Brazil has roughly 200 million people, and its land mass is the fifth-largest on Earth. On such a massive canvas, there are only about 110 courses and 20,000 people who play. In Rio de Janeiro, host city and home to 6.5 million people, there are perhaps a few more than 1,000 families who belong to one of two private clubs.

People here are poor. The average annual income in Rio is approximately R$20,000 (about $6,000 in U.S. dollars). Steeply discounted memberships in Rio's two private clubs go for more than that. But the barrier is also cultural. To understand Rio, one regular visitor said, you have to understand the beaches. This reporting trip began on what turned out to be a four-day weekend. My guide, Eduardo, had no idea what the holiday might be—his area of specialization is golf, and he had only recently resigned from Rio 2016's golf staff. Besides, he said, "Brazil has too many holidays." It turned out to be Dia de Tiradentes, which marks the hanging of an 18th-century revolutionary for advocating independence from Portugal.

In any case, walk Ipanema Beach on such a weekend. For four days and most of the nights, the entire beachfront is packed. Soccer and beach tennis are played every few yards. Watch the beach volleyball and realize what a weak facsimile the Olympic version is. These two-person sides feature rallies that last for a minute—and they're not using their hands. They tee the ball up on a mound of sand to kick it off and then use their heads, chests, knees, feet, occasionally backs or butts, to keep rallies alive. It's extraordinary. The people of Rio live for the beach. They live on the beach. Every kind of business is there on the sand—vendors selling everything from alcoholic beverages to corn on the cob to freaking massages. It's a complete and self-sustaining universe, and it's just as alive on the beaches of Copacabana and Barra da Tijuca and São Conrado.

Golf doesn't enter the public imagination. Most people know what it is, but it's not covered in the newspapers and it's not broadcast on television. It's nearly invisible.

And into these contexts, Gil Hanse has designed and built (with the help of his superintendent Neil Cleverly) an ambitious course intended not only to host the best in the world as they compete for gold, silver and bronze, but to become what Rio 2016 touts as the first public golf course in the country, a new beginning for golf and a bet on its future.

Off The Grid

If golf does have a future in Brazil, it might be right here, on what is actually the country's first, and for now, only public course. The Associação Golfe Público de Japeri is about 70 minutes by car from the Barra da Tijuca section of Rio, site of the Olympic course. And believe me when I say "by car," which is the only way to get from Rio to Japeri unless you want a three-hour trip by bus, train and foot.

But Japeri is even farther away than the mileage suggests. When Eduardo told his mother the night before where we were going, she asked us not to go. Japeri, a city of about 100,000 people, is notoriously violent and is last in the state of Rio de Janeiro's human-development rankings.

On our way to Japeri, Google Maps failed us, which was particularly disconcerting because we were nearly out of gas. (Eduardo had underestimated the distance.) But then, suddenly, just a few hundred yards off a new and desolate highway, there it was: a concrete bunker with a metal roof. Baked-dirt parking lot for maybe five or six cars. Fairways burned white by the ongoing drought. The range—the left half of which doubles as the fairway for the first hole—looks like an open, arid field. Which is exactly what these hundred acres were—a farm fallen into arrears on the outskirts of town—until about 15 years ago, when Jair Medeiros, a caddie from Gavea Golf and Country Club in Rio, along with a dozen other caddies, started bringing used clubs and balls back home and knocking them around.

One day, Jair approached the mayor about turning the tract into an actual golf course. The mayor was amenable, so Jair enlisted the support of Vicky Whyte, a Gavea member who's also the first female South American member of the R&A. It was her idea to seek funding for the enterprise by making it more than a golf course, a bastion of hope in a distressed place.

Whyte, with her son Michael, applied for a grant from the R&A and secured a sponsorship from Nationwide Insurance to build a course. The sponsorship was insufficient to fund 18 holes, so Whyte commissioned Brazilian architect Ricardo Pradez to build nine. "The construction process was slow," Whyte says. "Very little earth-moving and shaping was done. We went with the lie of the land. We did, however, build two lakes for drainage and irrigation purposes. Money was always a problem."

The course opened in 2005. Jair is now in charge and employs most of those original caddies from Gavea to maintain the facility.

The golf course is modest. Well, modest is generous. There is little irrigation—only the greens get water—so the course is as hard as rock. When the new highway came through, it stole three holes. Jair and his staff, with equipment on loan from the mayor, built replacement holes with the help of Fabio Silva, the agronomist at Gavea. Michael Whyte will tell you that the course forces you to use every club in your bag—the signature hole is a winding, downhill par 4 to a natural island green. But it's by no means a gimme that, were the flagsticks removed, most people would readily identify this place as a golf course. But, given what goes on at the academy (more on that in a bit), that's almost beside the point.

In The Stadium

We're in another metal-roofed building. This time, it's the construction offices of the Olympic course, and we're talking to the only person in the world who holds the title of "Superintendent, Olympic Golf Course." It's right there, stitched on the breast of his dark-green shirt and matching cap.

This is the day before our trip to Japeri, and when Cleverly hears that we're heading there, he says, "You're gonna see one extreme to another when you look at that facility, because it's still golf in some shape or form. But to transition to that from this is a huge, huge thing."

Cleverly is wiry, wired and intense. His bearing is military; that's how he spent his time before his career in golf. He's burned brown, not a natural shade for an Englishman. He was hired to take an inhospitable and inadequate piece of land (it's only about 100 acres) in a country where there is little inborn interest in the sport and, with an absolute dearth of experienced workers, help Gil Hanse build a world-class golf course. It is especially punishing to take on an Olympics during a crushing recession and only two years after hosting a World Cup. On this day, Cleverly is 104 days away from the beginning of the Games, when his course will be trod upon by caddies and players and beaten to hell by 15,000 ticket holders per day, and none of it, not one square inch of it, has ever really been tested. Well, unless you count the single rehearsal event, in which a couple dozen Brazilian pros played in front of a few hundred spectators and, even then, the roping was all screwed up and players got touchy when the crowd jostled them. One hundred and four days, and he still doesn't have enough mowers. He's also waiting for someone at the organizing committee to approve and organize the logistics of bringing several dozen volunteer superintendents from around the world who are willing to pay their own expenses and donate their time and labor to get the course into the pristine shape that the only course ever built for Olympic golf deserves.

It has been a challenge, and it's going to continue to be one. But when you're out on the course with Cleverly, and he shows you the green complex at No. 7, the way the rhythm of it is set up by the false front on the left side of the green, which falls gently toward the center, where it's joined by the razor edge of the bunker face, which rises to create an almost musical symmetry—his love of this course and all the pain it has caused him is right there in his eyes, even though he deflects any credit from himself and pronounces Gil Hanse "an artist."

When you ask Cleverly what this course's future is, it's a loaded question because, though he has worked on 11 other courses, this is the one by which the world will judge him. More than that, he's deeply concerned about whether this course is an opportunity Brazil is capable of taking advantage of or whether, like so many former Olympic facilities, it will fall into disuse. "The only answer I can give you would be the 'if' scenario," Cleverly says. "If there would be a street-level golfing commodity, I'd say that this golf course would be the most played on a regular basis. But because I feel now, having been here for the last three years, and realizing that unless things change from a street level—the price of equipment, the price of a round, the mentality of people ... from what I've learned in my time here, it's really hard for people to live normal lives in this country. So there has to be some kind of format where they can exhibit the golf course to juniors or schoolchildren. Bring them to the driving range, show them how to play golf, and if you get a percentage of those that are interested, then maybe there will be the roots of something in Brazil. It's easy to talk a good game and to say that you have some kind of a plan, but whether they step up and do it is another matter."

by David Granger, Golf Digest |  Read more:
Image: Dom Furore