Friday, January 31, 2020

The Secretive Company That Might End Privacy as We Know It

Until recently, Hoan Ton-That’s greatest hits included an obscure iPhone game and an app that let people put Donald Trump’s distinctive yellow hair on their own photos.

Then Mr. Ton-That — an Australian techie and onetime model — did something momentous: He invented a tool that could end your ability to walk down the street anonymously, and provided it to hundreds of law enforcement agencies, ranging from local cops in Florida to the F.B.I. and the Department of Homeland Security.

His tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.

Federal and state law enforcement officers said that while they had only limited knowledge of how Clearview works and who is behind it, they had used its app to help solve shoplifting, identity theft, credit card fraud, murder and child sexual exploitation cases.

Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” Some large cities, including San Francisco, have barred police from using facial recognition technology.

But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company, which declined to provide a list. The computer code underlying its app, analyzed by The New York Times, includes programming language to pair it with augmented-reality glasses; users would potentially be able to identify every person they saw. The tool could identify activists at a protest or an attractive stranger on the subway, revealing not just their names but where they lived, what they did and whom they knew.

And it’s not just law enforcement: Clearview has also licensed the app to at least a handful of companies for security purposes.

“The weaponization possibilities of this are endless,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University. “Imagine a rogue law enforcement officer who wants to stalk potential romantic partners, or a foreign government using this to dig up secrets about people to blackmail them or throw them in jail.”

Clearview has shrouded itself in secrecy, avoiding debate about its boundary-pushing technology. When I began looking into the company in November, its website was a bare page showing a nonexistent Manhattan address as its place of business. The company’s one employee listed on LinkedIn, a sales manager named “John Good,” turned out to be Mr. Ton-That, using a fake name. For a month, people affiliated with the company would not return my emails or phone calls.

While the company was dodging me, it was also monitoring me. At my request, a number of police officers had run my photo through the Clearview app. They soon received phone calls from company representatives asking if they were talking to the media — a sign that Clearview has the ability and, in this case, the appetite to monitor whom law enforcement is searching for.

by Kashmir Hill, NY Times | Read more:
Image: Clearview
[ed. See also: Facing Up to Facial Recognition (IEEE Spectrum).]

Why You Should or Shouldn't Fear Slaughterbots

Thursday, January 30, 2020

Brad Pitt and the Beauty Trap

The meaning of Brad Pitt — as actor, star and supreme visual fetish — can be traced to the moment in the 1991 film “Thelma & Louise” when the camera pans up from his bare chest to his face like a caress. William Bradley Pitt was born in 1963, but Brad Pitt sprang forth in that 13-second ode to eroticized male beauty, initiating a closely watched career and life, dozens of movies, and libraries of delirious exaltations, drooling gossip and porny magazine layouts.

The delirium has resumed with Quentin Tarantino’s “Once Upon a Time … in Hollywood,” in which Pitt plays the Pitt-perfect role of Cliff Booth, a seasoned stunt man and coolest of cats. Everything about Cliff looks so good, so effortlessly smooth, whether he’s behind the wheel of a Coupe de Ville or strolling across a dusty wasteland. The novelist Walter Kirn once wrote that Robert Redford “stands for the [movie] industry itself, somehow, in all its California dreaminess.” In “Once Upon a Time,” Tarantino recasts that idea-ideal with Cliff, exploiting Pitt’s looks and charm to create another sun-kissed, golden and very white California dream.

So of course Tarantino being Tarantino has Cliff-Pitt doff his shirt, in a scene that both nods to the actor’s foundational “Thelma & Louise” display and offers another effusive paean to masculine beauty. It’s a hot day; Cliff is scarcely working. So he grabs his tools and a beer and scrambles on a roof to fix an antenna, wearing pretty much what Pitt first wears in “Thelma & Louise.” Then Cliff strips off his Hawaiian shirt and the Champion tee underneath it and once again, Brad Pitt stands bare-chested, soaring above both Hollywood and our gaze, the already porous line between actor and character blurring delectably further.

On Feb. 9, Oscar night, our gaze will again fix on Pitt, who has been nominated for best supporting actor for his role in “Once Upon a Time.” It’s nice that his peers bothered because they’ve been reluctant to honor him in the past. Despite his years of service and critically praised roles, Pitt has won just one Oscar: a best picture statuette for helping produce “12 Years a Slave.” As an actor, he has been nominated three previous times: once for supporting (“12 Monkeys”) and twice for lead (“The Curious Case of Benjamin Button” and “Moneyball”). As a reminder, Rami Malek, Eddie Redmayne and Roberto Benigni have all won best actor. (...)

Critics could be unkind (guilty), but as the bad movies gave way to good, the notices improved. Soon, it became a favorite cliché to write that he was a character actor trapped in the body of a star (guilty again). Some of this, I think, stems from a suspicion of beauty, that it can’t be trusted, is “merely” superficial and silly, which makes the beautiful one also superficial and maybe even worthy of contempt that can lurk under obsession. There’s nothing new about how we punish beauty. The history of movies is filled with the victims of this malignant love-to-love and love-to-hate dynamic, not all of them women. (...)

Pitt should have been nominated this year for best actor for his delicate, deep work in James Gray’s “Ad Astra,” a meditation on the unbearable weight of masculinity set largely in outer space. The film was praised as was Pitt’s turn, but neither found awards momentum. The performance was too good and certainly too subtle and interiorized for the academy. It has a historic weakness for showboating — the more suffering the better — which is why Joaquin Phoenix (often otherwise worthy) and his jutting rib cage in “Joker” seem like a lock. But Pitt has time. It took seven nominations for Paul Newman to win best actor; Redford has been nominated only once for acting (he lost).

Like Newman and Redford, Pitt has always seemed born to the screen, a natural. He has a palpable physical ease about him that seems inseparable from his looks, that silkiness that seems, at least in part, to come from waking up every day and going through life as a beautiful person. This isn’t to say that good-looking people don’t have the same issues, the neuroses and awkwardness that plague us mortals. But Pitt has always moved with the absolute surety you see in some beautiful people (and dancers), the casualness of movement that expresses more than mere confidence, but a sublime lack of self-consciousness and self-doubt about taking up space, something not everyone shares. This isn’t swagger; this is flow. (...)

In the years since “Fight Club,” the film has been embraced without irony and apparently without humor by men’s rights partisans. I wonder if they think Tyler is hot, and what exactly they see when they look at his body. Movies have always banked on the audience’s love of male violence. Throughout their history, they have exploited male beauty, tapping the passion it inspires. “Everybody wants to be Cary Grant. Even I want to be Cary Grant,” said, well, Cary Grant.

But the beautiful man can make us nervous, partly because he complicates gender norms. George Clooney is more than a pretty face, more than one writer has insisted. Yes, but he is also pretty. Some of this anxiety reeks of gay panic and misogyny.

by Manohla Dargis, NY Times | Read more:
Image: Andrew Cooper/Columbia via

The Truth About "Dramatic Action"

"As far as I know, trying to contain a city of 11 million people is new to science.” This was how Dr. Gauden Galea, the World Health Organization’s country representative in China, described the situation facing the city of Wuhan when asked late last week for his update on the coronavirus outbreak.

It was clear from Galea’s remarks that the total containment of Wuhan, the city where I have lived for the past few decades, was not a course of action the WHO had recommended. Nor did the organization have any clear view on whether such an action would prove effective in limiting the spread of the disease. “It has not been tried before as a public health measure,” he said, “so we cannot at this stage say it will or will not work.”

I am now one of 11 million people in Wuhan who are living through this grand experiment, a measure that, Galea also said, shows “a very strong public health commitment and a willingness to take dramatic action.” From inside the curtain that now encloses my city, I wish to offer my thoughts on this “dramatic action,” and to judge what we have actually seen and experienced in terms of commitment to public health.

Closing Up the Cities

At 2AM on January 23, authorities in Wuhan suddenly issued the order to close off the city. According to the order, from 10AM that same day, all public buses, subways, ferries, long-distance buses and other transport services would be suspended; the airport and train stations would be shuttered. At this point, the WHO might have had reservations about the necessity and effectiveness of this strategy – but in any case, it was irreversible, and it would soon extend to neighboring cities as well.

In less than two days, up to noon on January 24, a total of 14 cities in Hubei province would be brought into the quarantine zone. These cities, with a combined population of around 35 million, included Huanggang (黄冈) and E’zhou (鄂州), which were quickly brought under the order for closure. More cities followed: Chibi (赤壁), Xiantao (仙桃), Zhijiang (枝江), Qianjiang (潜江), Xianning (咸宁), Huangshi (黄石), Enshi (恩施), Dangyang (当阳), Jingzhou (荆州), Jingmen (荆门) and Xiaogan (孝感).

This was no longer a city under lockdown, but effectively an entire province under quarantine.

Galea and other foreign experts have expressed a sense of awe about the boldness of the quarantine in Hubei province. Over the weekend, the New York Times quoted Dr. William Schaffner, an expert on infectious disease from Vanderbilt University, as saying that the lockdown is a “public health experiment, the scale of which has not been done before.” Schaffner was clearly astonished: “Logistically, it’s stunning, and it was done so quickly.”

China’s capacity to impress with such grand gestures calls to mind talk of the “Chinese miracle,” often used to describe the performance of the country’s economy over four decades. But is it fair to regard this case of large-scale quarantine also as a “Chinese miracle” in public health?

Shutting People’s Mouths

Everyone must understand, first of all, that this epidemic was allowed to spread for a period of more than forty days before any of the abovementioned cities were closed off, or any decisive action taken. In fact, if we look at the main efforts undertaken by the leadership, and by provincial and city governments in particular, these were focused mostly not on the containment of the epidemic itself, but on the containment and suppression of information about the disease.

The early suppression of news about the epidemic is now fairly common knowledge among Chinese, and many people view this failure to grapple openly with the outbreak as the chief reason why it was later seen as necessary to take the “dramatic action” of closing down my city and many others.

The direct cause of all of this trouble is of course the new coronavirus that has spread now from Wuhan across the globe and has everybody talking. Up to January 24, in Hubei province alone, there were 549 confirmed cases of the virus. Among these there have been 24 deaths. But the real numbers are still unknown.

According to reports from Caixin Media, one of China’s leading professional news outlets, the entire situation began on December 8, with the discovery of the first known case of an infected patient in Wuhan, a stall operator from the Huanan Seafood Market. The Huanan Seafood Market is a large-scale wet market, with an area about the size of seven football pitches and more than 1,000 stalls. The market has a constant flow of customers, making it the ideal place for the spread of infectious disease. A seafood market only in name, it sells a wide array of live animals, including hedgehogs, civet cats, peacocks, bamboo rats and other types of wild animals. At this market, the nearly inexhaustible appetite, and insatiable greed and curiosity of Chinese diners is on full display.

The number of infected people rose rapidly, reaching 27 people within a short period of time. Health professionals in Wuhan began suspecting in early December that this was an unknown infectious disease, not unlike the Severe Acute Respiratory Syndrome (SARS) that emerged in southern China in 2003. The ghost of SARS seemed to wander Wuhan in December, and rumors spread farther and farther afield of a new disease on the prowl.

China is a society closely monitored by the government, and the shadow of Big Brother is everywhere. Social media in particular are subject to very close surveillance. So when the authorities detected chatter about the re-emergence of SARS, or of a similar unknown outbreak, they took two major steps initially. First, they tried to ensure that this new outbreak remained a secret; second, they put the stability preservation system into effect (启动稳控机制). On December 30, the Wuhan Health Commission (武汉市卫建委) issued an order to hospitals, clinics and other healthcare units strictly prohibiting the release of any information about treatment of this new disease. As late as December 31, the government in Wuhan was still saying publicly that there were no cases of human-to-human transmission, and that no medical personnel had become infected.

by Da Shiji, China Media Project | Read more:
Image: uncredited

Wednesday, January 29, 2020


Jack Whitten, Space Flower #9, 2006

How Billie Eilish Harnesses The Power Of ASMR In Her Music

The Little Man on the Big Screen

In the early 1940s, there was a cinematic battle raging between two populist filmmakers — Frank Capra and Preston Sturges. If you watch their movies today, you’re almost certain to like Preston Sturges’s better. They’re wild, chaotic, hilarious films that assume all governing officials are ridiculously corrupt and pretty much all ordinary citizens are scrambling and hustling and stuttering and screeching and flailing in their mad slapstick efforts to succeed in America.

It’s telling that the Coen brothers, the contemporary filmmakers most overtly engaged with conveying the American experience, cite Sturges frequently. One of their most popular films, O Brother, Where Art Thou? (2000), was directly inspired by Sturges’s Sullivan’s Travels (1941), in which a successful Hollywood director of comedies, John L. Sullivan, yearns to make a serious, socially conscious drama entitled O Brother, Where Art Thou?, a movie Sullivan explicitly sees as his own Capra picture. And The Hudsucker Proxy (1994), perhaps the Coens’ least popular film, was directly inspired by Capra’s film Meet John Doe (1941). Though the plotting is Capraesque, the tone of the film is far closer to Sturges — hectic and satirical. The combination was an uneasy one.

Sturges himself had better luck taking on Capra. In the 1930s, when Sturges arrived in Hollywood, Capra had reached the dizzying peak of his career, directing hit after hit with It Happened One Night, Mr. Deeds Goes to Town, You Can’t Take It with You, and Mr. Smith Goes to Washington. Capra was the Spielberg of his day — world-famous, revered, loaded up with Academy Awards, and celebrated on the cover of Time magazine. He rivaled director John Ford in presenting America to itself in instantly mythologizing terms that the public loved. Decades later, actor and independent filmmaker John Cassavetes would say, “Maybe there really wasn’t an America. Maybe there was only Frank Capra.”

It was this towering patriotic mythology that Sturges riotously satirized in a series of frenetic screwball comedies such as The Great McGinty, Christmas in July, The Lady Eve, Sullivan’s Travels, The Palm Beach Story, The Miracle of Morgan’s Creek, and Hail the Conquering Hero. For Sturges, American life was the experience of being whipsawed around by unseen forces of corruption and incompetence. Kicking the slats out from under meritocracy by showing the ways sky-high success and abject failure result from a fundamentally insane system are signature Sturges moves. (...)

Among cinephiles, the reputation of Sturges grows shinier every year, even as Capra’s becomes more tarnished. Both directors were conservatives who fundamentally distrusted politics. Both were also enamored of their own ideas about America and its people, and both were egomaniacal auteurs, film “authors,” long before that term came into use.

But beyond those similarities, the two directors could hardly have seemed more at odds. Capra was the driven child of impoverished immigrant Sicilian laborers. Like many people who come up the hard way, he grew to love the idea of its hardness, and to romanticize the lone individual struggling for success as the figure that should be at the center of American society, a moral paragon setting the bootstrapping standards for others to emulate.

Sturges, on the other hand, came up the soft way. He was the son of a bold American self-creationist named Mary Dempsey, who changed her name to D’Este and convinced herself she truly was the daughter of Italian nobility until one of the real D’Estes sued her for using the family name on her line of cosmetics. She altered the name to Desti and found some success as an entrepreneur, able to spend most of her time lounging around the Continent having affairs with a wide range of fringe characters including occultist Aleister Crowley and participating in art happenings with her idol and best friend, Isadora Duncan, the modern dance pioneer. Desti gave Duncan the dramatically long scarf that caught on the back wheel of Duncan’s convertible and snapped her neck, a perfect bohemian death.

Preston “got dragged through every museum in Europe” by his art-addled mother and came to hate anything smacking of pretentious high culture. Instead, he adored his stepfather, Solomon Sturges, a Chicago stockbroker whose mild, stable, plainspoken character came as a refreshing change. Though Preston was often broke, making and losing several fortunes in his lifetime, he was never poor. He had tremendous cultural capital: he spoke fluent French, wore custom-made suits, and had gone to boarding schools with the sons of dukes and prime ministers.

His vision of America was of a chaotic but protean place where the next person you met might be the key to either dizzying success or total disaster. Sturges attempted to forge a career as a cosmetics tycoon, an inventor, a songwriter, and (reluctantly) the kept husband of a wealthy wife, failing at everything until he tried his hand at being a playwright. Finally, his erratic energies and brilliant facility with language found a home and a hit with Strictly Dishonorable, a title he got from his own line to a date who asked what his intentions were that evening. He loved America, with its fast pace, high risk, and popping energy.

It was an altogether different America that Capra loved. Capra, unlike Sturges, was generally and erroneously regarded as a highly political, left-wing populist, always bravely willing to court controversy in order to make “significant” films celebrating the common man and exposing the ways the system didn’t work. Mr. Smith Goes to Washington was considered borderline seditious by members of Congress, who raised a ruckus over its portrayal of a US Senate rotten with corruption. The Daily Worker even praised Capra for his “notable progressive films in the 1930s.”

And the French public, given the chance by the Vichy government to vote on which Hollywood movie they’d like to see before the Nazis ended the import of American films, chose Mr. Smith Goes to Washington as an inspirational representation of a still-thriving democracy that can speak truth to power.

Capra created heroes who are idealists full of small-town American values, brave and adventuresome in their Boy Scout–ish endeavors, generally tongue-tied except when quoting their role models Jefferson and Lincoln. They’re often played by tall, lanky, all-American actors like Gary Cooper or Jimmy Stewart. They go to the big city and meet the cynical, self-serving representatives of financial and state power and are almost undone by the depths of their corruption. But then, at the climactic point, they come back strong and show that the individual can go up against the capitalist forces of darkness and save American democracy.

This last-ditch triumph always occurs with the crucial help of a tough, brainy, and cynical career woman typically played by Jean Arthur or Barbara Stanwyck, who actually has the know-how to fight the system she learned from the inside, once she’s won over to the hero’s cause.

One idealistic man saving American democracy, aided by one formidable woman, can still only do it by inspiring the people to action, starting with that man’s hordes of friends. Like the angel says in It’s a Wonderful Life, “No man is a failure who has friends.” Especially friends who show up with money when the shit hits the fan, as occurs at key points in It’s a Wonderful Life and You Can’t Take It with You, defying the machinations of the rich and powerful with heartwarming hatfuls of crumpled dollar bills. And don’t think you won’t be moved by these scenes either — they still work like magic. (...)

The suicidal urges of a desperate working-class man who can see no other way out were a specialty of theirs in both Meet John Doe (1941) and It’s a Wonderful Life (1946), the latter of which was a notable financial failure when it came out. It was right after the end of World War II, when Americans were in no mood for the film’s bleak look at an unhappy life consumed by relentless money troubles, capped by watching the despairing Jimmy Stewart plunge off a bridge into icy black water on the night before Christmas, even if an intervention from heaven makes it all turn out fine in the end. The 1940s public probably saw the film more clearly than we do now, when it’s considered a cornball Christmas classic.

Capra’s reputation has suffered badly over the course of several decades, as his films came to represent populism’s supposedly slippery slope to fascism. It was shrewd of Capra to obscure his actual politics during the 1930s and ’40s, when most people were fooled by his films into believing the director was “quite liberal,” as Katharine Hepburn did when she agreed to star in State of the Union. Capra was, in fact, an open admirer of both Mussolini and Franco. During the McCarthy era, he served as an FBI informer, helping to persecute his fellow film industry professionals as a way of making sure his own history of working with left-wing writers didn’t come back to bite him.

But as corny as Capra looks today, Sturges relied on Capra’s films. He couldn’t help but pay tribute even as he wrestled with Capra’s idea of the American experience.

by Eileen Jones, Jacobin |  Read more:
Image: A still from Sullivan's Travels (1941)

‘BoJack Horseman’ and ‘The Good Place’ Took Us to Hell and Back

In the penultimate episode of “The Good Place,” after four seasons wandering the afterlife, our dear-departed heroes finally make it to the destination promised in the title. It is, of course, beautiful, with lush gardens and buildings with alabaster walls.

It’s also familiar. The first time I watched, I felt like I knew this place. Was I recovering a memory from another life, or a state before life? Had I — good Lord — had I been to heaven?

Turns out I had, kind of. It took a few minutes of searching my memory and Google Images to realize that the location the producers chose to represent the Good Place was … the Getty Center, the art museum in the hills overlooking Los Angeles.

It’s a fitting choice for a humanist Hollywood reboot of paradise. “The Good Place,” whose finale airs Thursday night on NBC, is a slapstick survey of moral philosophy that places its faith not in a higher power (or a lower one) but in human culture and creation.

It’s also a visual echo of another great comedy, the zoologically incorrect Hollywood satire “BoJack Horseman,” whose final eight episodes arrive on Netflix Friday. Its title sequence begins with a wide shot of the cliffside house where the title character (Will Arnett), an anthropomorphic horse and former ’90s sitcom star, has spent six seasons chugging booze, pills and the occasional chaser of remorse.

If heaven is in the L.A. hills, so is hell. And over the past several years, these two comedies have wandered the crooked path between the two, trying to figure out how to be a decent person in a fallen world. (...)

The moral universe of “BoJack” is darker and messier than its NBC counterpart. Even its aesthetic is baroque, Hieronymus Bosch-like, compared with the clean, jewel-tone fantasy of “The Good Place.”

In “BoJack,” there are no cosmic do-overs, no second or two-thousandth chances. In one of the final episodes, BoJack imagines seeing a long-dead friend, who tells him: “There is no other side. This” — i.e., mortal life — “is it.”

It’s a dark statement. But dark is not the same as hopeless. Really, “BoJack” is making a kind of moral argument from atheism. In its universe, you have to do right not because you might end up in The Bad Place but because this, right here, is the only place.

Where “BoJack” is most like “The Good Place” is that it, too, is about the moral obligation to help others to be good. But it’s complicated; the show is also aware of the blurry line between help and enabling.

Throughout the series, BoJack is bailed out and pulled from the brink by others: his friend Mr. Peanutbutter (Paul F. Tompkins), a chuckleheaded Labrador retriever; his overstressed feline agent, Princess Carolyn (Amy Sedaris); and his ghostwriter-turned-confidante, Diane (Alison Brie).

But Diane — as close as anything to the show’s moral center — starts to wonder if she’s really helping BoJack improve or (à la Dr. Melfi counseling Tony Soprano) just making him a more efficient miscreant. There’s an entire showbiz industry built around performative contrition, and BoJack has mastered its turns and straightaways like Secretariat. (He walks out of one supposedly harrowing confessional interview as if he’d aced the SAT: “I felt like I could see the matrix!”)

If “The Good Place” is how we need to raise one another up, “BoJack” is often about the need not to let one another off the hook. At the end of Season 5, for instance, Diane rejects BoJack’s plea that she write an exposé on him after a #MeToo incident, realizing that she’d just be stage-managing his redemption theater.

But she’s also reluctant to cut him off entirely. As she says, toward the end of the series: “Maybe it’s everybody’s job to save each other.”

As different as “The Good Place” and “BoJack” are in tone, each in its absurdist way gets at a piece of the current moment, in which many of our public fights are as much about morality — complicity, complacency, enabling — as they are about politics. In very different ways, both shows ask: Is being good simply an individual act that you can undertake in isolation? Is it enough to tend your personal moral garden if you allow evil to flourish around you?

by James Poniewozik, NY Times | Read more:
Image: BoJack Horseman, Netflix

Gutting the Clean Water Act

It may be hard to remember these days, but the nation that led the world on to the stage of modern environmental protection was the United States.

Starting in the early 1970s, the US Congress enacted bold bipartisan laws to protect America’s wildlife, air and water. America’s skies cleared. Waterfronts across the nation turned from blighted dumping grounds into vital civic hearts.

And, in this journey from smog to light, America’s economy thrived. Our environment improved even as our economy grew. Both Republican and Democratic administrations upheld this commitment to a clean environment, and it endured for decades.

Following the 2016 election, polluting-industry veterans commandeered the country’s environmental agencies with one central aim: make pollution free again.

The assaults have been fast, furious and many. But the latest one stands out above, or below, the others. Administration officials have now targeted the Clean Water Act, perhaps the most fundamental environmental law ever enacted by the US Congress.

The law’s main mechanism is simple: before discharging waste into the nation’s waters, polluters must first try to clean it up.

So how did the former lobbyists running the agencies sabotage the act? By radically shrinking it. By its terms, the act only protects waters “of the United States”. But according to this administration, waters “of” the United States does not mean waters in the United States. In their view, the Clean Water Act only applies to a subset of waters, and the rest are unprotected.

The scope of the contraction is staggering. In some states out west, 80% of stream miles would lose their protection. Drinking water sources for millions of Americans would be at risk from pollution. The administration’s redefinition would leave millions of acres open for destruction – wetlands that buffer communities from storms, serve as homes for wildlife and nurseries for fish and shellfish, and act as natural water filters.

This is the single largest loss of clean water protections that America has ever seen. And the timing couldn’t be worse. From lead contamination in drinking water to the proliferating threat of toxic industrial chemicals, new threats to water quality are emerging daily. (...)

Now the administration wants to scrap all that by only defending the very largest rivers and declaring open season on the smaller tributaries upstream. That’s like trying to address heart disease by ignoring the blood that travels through it.

by Blan Holman, The Guardian | Read more:
Image: Chris O'Meara/AP via
[ed. Not to mention, gutting NEPA (National Environmental Policy Act). See also: Trump Administration Cuts Back Federal Protections For Streams And Wetlands (NPR)]

Linda Ronstadt


[ed. See also: It Doesn't Matter Anymore (YouTube).]

Tuesday, January 28, 2020


Anders Kjær, Untitled, 1981
via:

Peter Hutchinson, Somewhere. 2017
via:

The Most Loved and Hated Classic Novels

Here are the top ten most popular classics, which likely corresponds with the list of books most assigned in American high schools:


Every book listed is a “great novel”. These books wouldn’t have been read hundreds of thousands of times if that weren’t the case. However, we can recognize a book as a “great novel” while also recognizing that many readers will not enjoy it.

These rankings matter because reading books you love is the gateway to a love of reading, and reading books you hate is the gateway to a life without reading. Too often people are turned off from reading by being fed books they hate, either through school or because the internet/friends make a certain book seem like it must be read.

via: The Most Loved and Hated Classic Novels According to Goodreads Users (Goodreads).

[ed. I don't know what the average high school literature curriculum is these days, but if these moldy oldies are at the core of it, no wonder kids get disconnected from reading for pleasure and enlightenment. See also: On the Hatred of Literature (The Point).]

Steely Dan


[I'm working on gospel time these days (Summer, the summer. This could be the cool part of the summer). The sloe-eyed creature in the reckless room, she's so severe. A wise child walks right out of here. I'm so excited I can barely cope. I'm sizzling like an isotope. I'm on fire, so cut me some slack. First she's way gone, then she comes back. She's all business, then she's ready to play. She's almost Gothic in a natural way. This house of desire is built foursquare. (City, the city. The cleanest kitten in the city). When she speaks, it's like the slickest song I've ever heard. I'm hanging on her every word. As if I'm not already blazed enough. She hits me with the cryptic stuff. That's her style, to jerk me around. First she's all feel, then she cools down. She's pure science with a splash of black cat. She's almost Gothic and I like it like that. This dark place, so thrilling and new. It's kind of like the opposite of an aerial view. Unless I'm totally wrong. I hear her rap, and, brother, it's strong. I'm pretty sure that what she's telling me is mostly lies. But I just stand there hypnotized. I'll just have to make it work somehow. I'm in the amen corner now. It's called love, I spell L-U-V. First she's all buzz, then she's noise-free. She's bubbling over, then there's nothing to say. She's almost Gothic in a natural way. She's old school, then she's, like, young. Little Eva meets the Bleecker Street brat. She's almost Gothic, but it's better than that. ~ Almost Gothic.]

[ed. Sizzling like an isotope. See also: What a Shame About Me (lyrics) and West of Hollywood.]

Monday, January 27, 2020

Remembering Jim Lehrer

This is FRESH AIR. Jim Lehrer, the respected journalist and a nightly fixture on PBS for more than three decades, died Thursday at his home in Washington. He was 85. Lehrer is best-known for co-anchoring "The MacNeil/Lehrer NewsHour" from 1983 to '95 with Robert MacNeil and then, after MacNeil retired, "The NewsHour With Jim Lehrer" until his own retirement in 2011.

Lehrer grew up in Texas and was a newspaper journalist before getting into broadcasting. He was also a prolific writer. He published more than 20 novels and three memoirs and wrote four plays. Known for a calm, unflappable style and a commitment to fairness, Lehrer moderated presidential debates in every election from 1988 through 2012. He won numerous Emmys, a George Foster Peabody Award and a National Humanities Medal.

Jim Lehrer spoke to Terry Gross in 1988, five years after he'd suffered a heart attack and had double bypass surgery. (...)

TERRY GROSS: You have been the subject of many interviews since your heart attack, really, in 1983 and then since the writing of your plays and your new novel. Have you learned a lot about interviewing from being an interviewee yourself?

JIM LEHRER: I have, I think. I - MacNeil says, I think correctly, that I am a terrible interviewee because I give very long answers. In fact, as he said, you know, Lehrer, if you were ever on our program, we'd never invite you back because your answers are very - you asked me a question when we started. You know, I went on for five minutes, I think. I mean, that's a problem I have, and I understand. I sympathize, and I'm sure you must, too. I mean, I have great sympathy for the people I'm interviewing because I ask a question of somebody - now, keep in mind 99% of the interviews I do are live. I ask somebody a question, and then I'm immediately jumping, ready for the next question or ready to go on with it, you know?

I mean, I would much rather interview than be interviewed. I have learned a lot just out of sympathy for the people as a result of being the subject of the interview. There's no question about it. I now understand how difficult it is.

GROSS: Well, do you tell the people who are appearing on your program to give you short answers (laughter), and how do you stop them if the answers...

LEHRER: No.

GROSS: ...Are long?

LEHRER: What I tell the folks to do is to give their best answer. If it's short, that's fine. If it's long, that's fine. I can always interrupt them. I interrupt people for a living. That's what I tell them. It's very important that the person not have to be - not have to confine themselves to your rules. For instance, if - let's say somebody is like me, gives long answers like I'm giving you right now, as a matter of fact. And - but, I mean, that's your problem, see? That's not my problem.

GROSS: (Laughter).

LEHRER: I mean, if I'm going...

GROSS: Hey; thanks a lot.

LEHRER: Yeah, right. I mean, I've come - if you asked me the story of my life and if it takes two hours to tell you the story of my life, I think it takes two hours. And it's your job as the professional to cut it down a little bit. And I think that also, you get better answers that way. If I say to somebody who sits down who's already nervous - now, that's not true of people that are used to television. But if somebody comes in there very nervous - live show going all over the country, their mother's watching and everybody's there - and I say to them, all right; keep your answer short and blah blah blah, all it does is add to their anxiety. And I want people to be relaxed. I want them to forget that there are all these lights and cameras around and have eye contact. Our studios are set up, both in Washington and New York - are set up...

GROSS: This answer's too long. No, I'm kidding.

LEHRER: No, I know (laughter). I know it is.

GROSS: Just thought I'd try that out, see what happened (laughter).

LEHRER: See, it doesn't work with me. That's - it feels right. But we set our people - our guests are very close to us, and there's direct eyeball-to-eyeball contact. So that - so you try to confine the situation so the person is comfortable, and all they have to do is look at you. They're not - they don't have to look around. There's not a place to - you know, to be distracted. It's to make people comfortable.

GROSS: You've had to interview many politicians over the years, and I think that is always so difficult because politicians give you answers, but they're not necessarily answers to the questions you've asked. I don't mean you in particular.

LEHRER: Oh, I know.

GROSS: But in general, what are some of the techniques you've come up with for actually getting an answer to the question that you want answered because you're just not necessarily going to get it?

LEHRER: Terry, there's only one technique that works, and that's to have enough time to ask the question a second time and then a third time and maybe a fourth time. And then, if Billy Bob Senator isn't going to answer it, you at least have a stab. That's his option. If he didn't want to - you know, I mean, there's no law that says he has to answer all the questions that Jimmy Charles Lehrer asks him on television, but I have the time. We have the time on our program.

Senator, what is your position on selling grain to the Soviet Union? Well, you know, Jim, that reminds me of when I was a little boy growing up in Oklahoma. And then he tells you a story. And you ask, yeah, but Senator - you give him the time, you know? He does that, and you say, yes, but what's your position on selling grain? Well, you - first of all, you got to understand what grain is. Grains are these little - he still hasn't answered. So then you say, yes, but Senator, again - you know? And then finally, you have to decide. And you're sitting there in a live situation. Do I ask this sucker this question again, or do I go on? You have to - at some point, you have to have real confidence in your audience that they realize, hey; this jerk isn't going to answer this question, or, this wonderful man isn't going to answer this question, or whatever the situation is. Then you go on with it.

I do not believe in beating up on guests. I don't - we don't invite people on our program to abuse them. And so the other way to do it if you don't have the time is you say, you didn't answer my question, you know? Hey, hey, blah, blah, blah, you know? We don't do it that way. And it's because - it's not because we object to it. That's somebody else's job to object to it. That's just not our style. We're not comfortable doing that. And we have the luxury of time.

GROSS: You know, you strike me as one of the few news anchors on television - I mean, you and MacNeil, really - who do more than just read the news while the newscast is on. Does the emphasis that American news viewers put on news anchors on commercial news seem a little absurd to you?

LEHRER: It seems incredibly absurd to me. I don't understand it. I do - I simply do not understand the value that is placed on the ability of somebody to look into a television camera and read a teleprompter. Now, that's called a short answer.

by Terry Gross, NPR |  Read more:
Image: via

Are You Local?

When it comes to thinking about being local in Hawaii, most might not immediately think back to a notorious murder case of nearly a century ago.

Yet, the Massie case of 1931-1932, in which a young Native Hawaiian was tragically killed by a group of whites associated with the Navy, is precisely the historic event that scholars at the University of Hawaii say is central to appreciating the concept of local identity.

“The Massie Case has since become a kind of origins story of the development of local identity in Hawaii among working-class people of color,” John P. Rosa writes in his 2014 book, “Local Story: The Massie Kahahawai Case and the Culture of History.”

In his view and that of other scholars, it represented the first time the term “local” was used in Hawaii with any significance.

And while definitions of local identity have evolved, at its core local identity is as much about dividing people as it is about uniting them, and about who has power and influence and who does not.

It’s common to hear people define local by things like where someone went to high school, taking your slippers off before entering someone’s home, preferring your peanuts boiled or speaking pidgin English.

But, while these habits are not without comfort and significance, they are in a sense only surface-level connections that may prevent the people of Hawaii from recognizing what really brings us together, and what may be in the way of bridging differences to address the many troubles in our society.

What defines local identity, says Jonathan Okamura, an ethnic studies professor at UH Manoa, is a shared appreciation of the land, the peoples and the cultures of the islands.

But now that shared identity could be imperiled by the same powers that held sway in the 1930s: a local and national government inattentive to their concerns, abetted by economic forces controlled by others.

Hawaii was already becoming too reliant on outside economic forces, especially tourism, Okamura warned 25 years ago, disrupting the value of a shared identity.

“Local identity, while not organized into a viable social movement, will continue in its significance for Hawaii’s people if only because of their further marginalization through the ongoing internationalization of the economy and over-dependence on tourism,” he wrote. (...)

Today, the troubles that are dividing us are made all the more difficult by economic dependency on tourism, the large military presence in the islands, and foreign investment and ownership that Okamura writes about.

Local identity and any disconnect that comes with it is also being shaped by increased immigration from the mainland and the broader Asia-Pacific region to Hawaii even as the local-born population is moving elsewhere.

Rosa says that local identity doesn’t necessarily divide us as long as we continue to discuss what it means to be local.

“Sometimes things get a little emotional when we think about identity and ‘who I am,’ but when we think of what place and shared values might be, that is one way to think about it,” he said in an interview. “It is people committed to this place in particular ways.” (...)

What Is Local?

It is easy to think of local identity as being based on race and ethnicity.

Indeed, in the Massie case Grace Fortescue singled out Joseph Kahahawai as the “darkest” of the five men. And the words malihini haole are frequently and sometimes pejoratively used to describe whites who move here from the mainland.

The working-class origins of local identity were informed by the labor needs of the plantations that brought large numbers of migrants from China, Portugal, Japan, Puerto Rico, the Philippines and Korea to Hawaii in the mid-to-late 19th century and into the early 20th. Many stayed, and it is their descendants that “made up the core of locals” since the 1930s, says Rosa in a 2018 book, “Beyond Ethnicity: New Politics of Race in Hawai’i.”

Meanwhile, a white oligarchy remained in power in the islands for decades following the Massie case.

But demographics gave way to substantial change through several transformative periods since that time: martial law during World War II, the return of Japanese-American veterans to the islands, the so-called 1954 revolution that saw the territorial Legislature wrested away from mostly white Republicans by racially diverse Democrats, the tourism and development boom that began in the 1950s and 1960s, the Hawaiian Renaissance of the 1970s, the Japanese investment of the 1980s and the economic slowdown of the 1990s.

Hawaii is now in the midst of another transformative period, one whose dimensions are still being drawn but one that continues to reflect the dynamics of previous generations. It is also driven by something that did not exist until recently: the online world and social media.

All through it, local identity has continued.

“Over the years, local identity gained greater importance through the social movements to unionize plantation workers by the International Longshoremen’s and Warehousemen’s Union in 1946 and to gain legislative control by the Democratic Party in 1954,” Okamura writes.

Today, those who might identify as local are no longer just members of the working class. There are whites whose roots go back multiple generations. And the color of one’s skin may not serve as the best way to identify who is and is not local.

Changing Demographics

There is also a new category of people besides Native Hawaiian, haole and local — one that Rosa calls “other.”

Their arrivals began in small numbers in the 19th century but have grown significantly, more recently from places such as Latin America — including Mexicans and Brazilians — Southeast Asia (Vietnamese), Micronesia (Marshallese and Chuukese) and other parts of the Pacific (Samoans).

Are these groups considered locals?

It depends, in part, on whether they acquire local knowledge, language and customs, whether they have respect for the indigenous population, their rates of intermarriage, and whether these groups are still primarily connected to their former homes or are nurturing ties to their new ones.

There is no litmus test for being local. But newer arrivals to Hawaii who integrate into local society rather than resist it — who do not seek to transplant themselves into a new environment with all the trappings of their old one — may sometimes find it easier to get along. (...)

‘Where You ‘Wen Grad?’

The topic of what it means to be local in Hawaii has been written about extensively in local media, including Civil Beat.

One of the most popular examples came from the Honolulu Advertiser in 1996, which published readers’ answers to the question, “You Know You’re A Local If …”

The newspaper was flooded with countless letters, postcards, emails and faxes. It ended up publishing the “ones that made us laugh the hardest” while running more in a new column that would debut later that year.

Here are just a few excerpts from the initial article in the Advertiser that August, broken into categories for food, fashion, philosophy, habits, awareness and the like:
  • “Your only suit is a bathing suit.”
  • “You have at least five Hawaiian bracelets.”
  • “You know ‘The Duke’ is not John Wayne.”
  • “You measure the water for the rice by the knuckle of your index finger.”
  • “You let other cars ahead of you on the freeway and you give shaka to anyone who lets you in.”
  • “Your first question is, ‘Where you ’wen grad?’ And you don’t mean college.”
The entries and ideas kept on coming.

In a May 2002 column, the Advertiser’s Lee Cataluna revisited the topic. She wrote, “Every couple of months, a new one will show up in your e-mail inbox, one of those ‘You know you’re local if …’ lists.”

But Cataluna also observed that, “The only problem with those lists is they’re made for people who have no doubt that they’re local.” They are for “entertainment purposes only, eliciting happy nods of recognition rather than gasps of self-revelation.”

What Cataluna wanted to talk about was people who did not grow up in Hawaii but who had spent “some serious time and effort to understand and adopt the culture.”

She asked, “When do they know they’ve turned the corner to local-ness? How can they tell when they’ve passed major milestones?”

Such a list, she said, would include these characteristics:
  • “You know you’re turning local when you no longer think eating rice for breakfast is strange.”
  • “You know you’re turning local when, even though you hate seafood, you love poke cuz’ that’s different.”
  • “You know you’re turning local when you say the word ‘pau’ so often that you forget what it means in English. Pau is pau.”
Cataluna concluded with what she called “the big one”: “You know you’re local when you get irked by people who act too ‘Mainland.’” (...)

But there is also much to celebrate and even honor in localisms.

“Our cultural expression is manifest through the adoption of others’ customs as our own,” said Davianna Pōmaikaʻi McGregor, an ethnic studies professor and the department’s director for the Center for Oral History, in an interview. “It is identified with Hawaiians — mixed plates, that sort of thing — and if you lose that you begin to erode at those cultures that cohere us and connect us.

“And the fact that people are coming together to celebrate life events, bringing food and sharing — on Molokai, people go and clean yards when someone passes away — if we stop doing those things, we are going to lose that connection. So it is important.”

by Chad Blair, Honolulu Civil Beat |  Read more:
Image: Cory Lum
[ed. See also: Can A White Person Ever Be ‘Local’ In Hawaii? (HCB).]

Why Netflix’s Fantastic New Docuseries Cheer Is So Addictive

Fifty-three seconds into the first episode of Netflix’s docuseries Cheer, teenaged Morgan talks about pain. Fifty-four seconds into Cheer, she’s thrown into the air, twisting and flipping like a fish on a line. She comes careening back down into three sets of arms one second later, and she lands with a thunderclap of brutality, muscle smacking against muscle.

“Are you okay, Morgan?” someone asks. My untrained eye can’t pick up what’s wrong — just that something is wrong. And though Morgan walks off the rough landing, her body, gingerly stiff and wobbling unevenly, is what I think it looks like to silently scream.

After watching Cheer’s first 55 seconds, I knew I was going to spend the next six hours of my life breathing, consuming, Googling, and social media-stalking everything about the show. I knew then that it was my favorite new show of this very young year.

Cheer focuses on a competitive sport that fuses turgid, erotic tribalism with the body-breaking violence of muscular humans flinging tinier, lighter humans into the air and then catching them — callused hands atop thickly taped wrists, clawing into triceps and ankles. To that mixture, the show adds the us-against-the-world mentality of Charles Xavier’s X-Men and the small-town glamour of Friday Night Lights.

This is competitive junior college cheerleading at the dynastic Navarro College. This is Cheer. And this show is ballistically addictive.

Cheer takes place in the mecca of junior college competitive cheerleading, a place called Navarro College, Navarro for short. It’s in a town 60 miles south of Dallas called Corsicana, and absolutely nothing competes with the Navarro cheerleaders. They are the biggest and only thing in town, having won 14 National Cheerleaders Association National Championships and five “grand national” wins, which basically means they got the highest score at the national championships regardless of division and designation.

But while the Navarro Bulldogs dominate on the mat, they’re still underdogs.

Director Greg Whiteley (of Netflix’s college football docuseries Last Chance U) doesn’t shy away from showing the grim reality of many of these cheerleaders’ futures. Not many have options beyond cheering at this National Championship-caliber school; many say the team is the only thing keeping them from getting into trouble or making bad decisions. The one kid who has seemingly solid post-cheer prospects, an Instagram “cheerlebrity” with nearly a million followers, is blatantly being used by her parents as a cash cow.

Even then, the escape Navarro cheer offers these young women and men is temporary, as cheerleading is something that ends after college. Professional cheerleading is more like dancing, and those gigs aren’t usually fairly paid. This makes the years spent at Navarro so important for the kids there, especially the ones who would otherwise be at risk and out of school.

In the crosshairs of Cheer’s urgency, desperation, and drama are the National Championships in Daytona Beach, Florida. Specifically, the national championship performance: the two minutes and 15 seconds allotted to an intricate and difficult routine in which anything, even moves drilled into muscle memory by thousands of repetitions, could go wrong. And it’s coach Monica Aldama’s job to create a team that won’t break in those 135 seconds, as she’s done 14 times in her life.

by Alex Abad-Santos, Vox | Read more:
Image: Netflix
[ed. Highly recommended (I'm in love with Monica). See also: How Cheer’s Superstar Coach Monica Gets It Done (The Cut).]

Sunday, January 26, 2020

In and Out


[ed. Talk about getting robbed.]

The Myth of the “Millennial-Friendly City”

If there is one thing that is true about Millennials, it is that we are mystifying, and therefore constantly being asked to explain ourselves. This is the premise, I think, behind Angela Lashbrook’s recent viral article for OneZero titled “Millennials Love Zillow Because They’ll Never Own a Home.” The piece rightly points out that often, our wish to escape our terrible lives leads us to fantasize about buying nice houses in cities where we do not, and, due to the circumstances of our personal lives and/or careers, probably could not live. In fact, there is an entire genre of internet content — some of it reputable, some of it laughably not so — that seemingly exists to either supplement these fantasies of skipping town or to actively encourage them.

The most recent example of this phenomenon came from the commercial real estate listings start-up Commercial Cafe, which last week proclaimed that it had objectively determined the most Millennial-friendly cities in the country. Judging by things like population trends, affordability, average commute times, and the number of young people in a city whose jobs offer health insurance, Commercial Cafe determined that the metro areas surrounding places like Denver, Austin, Seattle, and Portland were, definitively, friendly to Millennials. Of course, I already knew these cities were Millennial-friendly through another methodology: being friends with people who aren’t boring as hell, since if you’re friends with any kind of young, cool or cool enough person, you’ll invariably hear one of them talking about how they’re thinking about moving to that city, if they haven’t already.

Still, this is not the only study that claims to have figured out what makes a city Millennial-friendly, a concept I find fascinating because of how arbitrary it seems. Politico believes that Millennials choose which city to live in based on the number of other young people, especially those with college degrees or who have recently relocated there, as well as the average GDP and the possibility of taking an “alternative commute” to work. Business Insider has its own rankings, based on population changes, increases in median wages, and decreases in unemployment rate. The Penny Hoarder developed a formula for Millennial-friendliness which factored in “Millennial happiness” and ended up placing St. Louis, MO and Grand Rapids, MI at one and two, respectively. This is just random enough for me to believe that these places might secretly be tight.

But these lists, including Penny Hoarder’s (whose counterintuitive conclusions I honestly do appreciate), fail to grasp what makes a city a genuinely compelling place to live. Cities like New York, Berlin, and Austin are not “cool” because of their public transportation or how many jobs there are there; instead, they were all direct beneficiaries of a cycle in which artists, punks, and general counterculture types ended up moving there when they were still cheap, treating these underpopulated cities as places where they could live affordably and in close quarters with likeminded people, together producing the sort of radical art and culture that end up being cool enough to get vacuumed into the city’s self-conception, after which a bunch of yuppies move in and fuck it all up. (I don’t have specific numbers to back this up, but my landlord once told me if I ever wanted to buy an investment property, I should buy something in a town where an anarchist bookstore just opened up.)

This isn’t a great cycle, especially since the arrival of the artists and punks is the first sign that the local population — in these neighborhoods, that most often means people of color and immigrants — is only a decade or two away from being priced out. Think of it as Lenin’s theory of the two-stage revolution, except in reverse, and instead of communism, it’s a path for gentrifying a city until it sucks ass.

Since 2016, I’ve lived in the Raleigh-Durham municipal area, which is frequently pegged as one of the most Millennial-friendly locales in the nation. Durham in particular has seen its star rise dramatically, to the point that all the artists and punks barely had a chance to set up shop before everybody else started moving in. Case in point: About a year ago, I was sitting in the backyard of a local bar when I ended up talking to a bro wearing a Patagonia sweater and Sperry boat shoes who told me that he and his roommates from architecture school had all moved down to the area after graduation because a friend had told them that, “the job market was poppin’.” (In case I have not been clear enough: this person was white and very fratty.)

Ever since then, I have noticed an influx of “that type” of person — preppy out-of-towners who flock to an area during a boom period and, through sheer force of numbers, end up changing its character in increasingly generic ways. Previously fun bars where adult people can simply relax while drinking an adult beverage either get overrun or run out of the neighborhood, with “experiential” bars that Millennials allegedly enjoy (read: bars where you can throw axes, play arcade games, or do mini-golf) popping up in their place. Music venues start booking different acts who appeal to this growing market of kinda-generic Millennials, letting local scenes languish in the background.

When people treat the place they live as a giant Airbnb they can check out of after a few years working as a “creative lead” at a mid-sized start-up before moving elsewhere, they become less attuned to local issues, specifically the problems faced by those outside their specific, transplant-y milieu. In other words, there are two types of people: those for whom such lists apply, and those who are negatively affected by those for whom such lists apply.

by Drew Millard, The Conversation |  Read more:
Image: uncredited

Why Some Kids Wear Shorts All Winter

Lindsey Miller first took note of the boys who refused to wear long pants when she was in grade school. At her elementary school in Maryland, a few particular boys made a habit of wearing shorts to school all winter, even though January temperatures in the mid-Atlantic state routinely drop below freezing. And it was always boys, she told me, never female students—“Girls made fun of them, but other guys cheered them on,” she recalled. One kid she knew in third grade, whose name has escaped her memory in the decade-plus since, “wore basically the same pair of shorts all year,” Miller, now 20, remembered.

The “one kid who wears shorts to school all year”: In regions that get cold and snowy in the winter, he’s a figure that’s equal parts familiar and bewildering to kids and teachers alike, and his clothing choices present an annual hassle for his parents. On Twitter, where Lindsey Miller once joked about the middle-school winter-shorts boy, he is in fact the butt of a number of observational jokes, many of them from classmates and beleaguered moms and dads: “There’s really this dude wearing shorts at school… IN THE WINTER.” “Have kids so you can argue with tiny, opinionated people about why they can’t wear shorts in winter and then coats when it’s 80 degrees.” Educators at a middle school and high school in Minnesota confirmed to me that they can count on having two or three of him every year, arriving at school after braving the morning windchill with bare calves. (In the interest of transparency, both were former teachers of mine, who I’m sure were perplexed to hear from me for the first time in more than a decade only to be asked about this.)

In other words, the Boy Who Wears Shorts All Winter is a highly recognizable but largely inscrutable character, and when I asked parents, teachers, child psychologists, and a former B.W.W.S.A.W. himself to try to explain what exactly motivates such a plainly impractical clothing choice, they all offered different answers.

by Ashley Fetters, The Atlantic |  Read more:
Image: Charles Rex Arbogast/AP Images
[ed. It's not just boys. I used to see schoolgirls in Alaska skittering across freezing, wind-whipped sidewalks and parking lots in shorts and mini-skirts. I figured they were braving short-term pain for later long-term gain inside a warm school building.]