Monday, January 15, 2018

The TL;DR Guide to Michael Wolff's 'Fire and Fury'

A quick note about Michael Wolff's Fire and Fury, which upon a second pass still has, to put it mildly, some serious issues: Just as an art historian can pick out a forgery, veteran journalists reading this book will quickly spot an oversold narrative and perhaps unprecedented sourcing issues.

The tortured "Author's Note" preceding the prologue almost reads like a novel in itself. In fact, trying to follow Wolff's idea of what "off the record" means or does not mean is like trying to follow the hands of a three-card monte dealer. It just can't be done.

As a White House source put it, Wolff's narrative personality is almost like a comedy act in itself:

"He's like the old Jon Lovitz character from Saturday Night Live," the source said. "You know – 'Yeah, I went to Harvard, that's the ticket. And, yeah, I was on the couch in the West Wing for months, that's the ticket.'"

Fire and Fury is really two books rolled into one. The first is a compelling nonfiction book about the intellectual divide in the modern right, as candidly hashed out to Wolff by influential figures like Steve Bannon and Roger Ailes and (seemingly?) Rupert Murdoch.

The second is a Primary Colors-style novel about what goes on behind various closed doors in the Trump White House, based on a few bits and pieces of fact, which are offset by mountains of eye-rollingly insupportable supposition, spiced with occasional stretches of believable analysis.

There is considerable debate in the media world, on both the left and the right, about the value of this book (even I've gone back and forth on it). In the end, I think it's like a piece of moldy rye bread – you have to cut around the hairily sourced parts to keep from getting poisoned. But on a broad level, there is something to dig into.

Reading the book, there are at least a few real points about Trump that shine through:

1) Trump has almost no ideological convictions and is motivated almost entirely by the classic narcissistic value equation, i.e. how much praise or scorn he gets on a second-to-second basis, from whom, and why. Had he not run as a Republican – and in particular won on a platform scripted by a nationalist true believer like Bannon – he might very well by now have been pushed into a completely different kind of presidency. Trump wants so badly to be liked that, especially with the influence of Kushner and Ivanka, he might easily have allowed his White House to drift back toward his original politics, which (as New Yorkers and furious conservatives alike will clearly remember) was once squarely in the Bob Rubin rich-guy sort-of Democrat mold.

2) However, as Bannon points out in the book – correctly – Trump by now is so firmly entrenched in the consciousness of America's intellectual elite as a villain that he will never be accepted by that crowd. The constant battering Trump gets from the press, especially, ensures that he will continue to lash out at them, forcing him continually to tack back to the only people who still like him – Bannon's angry-man followers. This despite the fact that what Trump clearly craves is, instead, the approval of members of his own class.

3) The result is an insane paradox of an America led by a doomed and trapped psyche. This is a president who in another era might have been confined to the impact of an ordinary bad commander-in-chief (we've had many), i.e., sedated and/or scripted in public, and kept on the golf course the rest of the time while the empire runs on the dreary autopilot of donors, P.R. flacks and military advisers.

Instead, we get a leader whose most dangerous moments come during his ever-expanding calendar of hyper-tweeting downtime (incidentally, is anything more certain than the term "executive time" replacing "taking my talents to South Beach" as this generation's euphemism for masturbation?). All those crazed Trump tweets guarantee an endless cycle of paranoia and rebuke – and a permanently paralyzed White House.

Anyway, it's a fascinating book. But too long for most people in the Internet age to actually read. So without further ado, here's shorter Michael Wolff, in chapter form:

a) The Author's Note:

See if you can make sense of this passage:

"Many of the accounts of what has happened in the Trump White House are in conflict with one another; many, in Trumpian fashion, are baldly untrue. Those conflicts, and that looseness with the truth, if not with reality itself, are an elemental thread of the book. Sometimes I have let the players offer their versions, in turn allowing the reader to judge them. In other instances I have, through a consistency in accounts and through sources I have come to trust, settled on a version of events I believe to be true."

In other words: The unattributed facts you're about to read are sometimes my best guess as to the truth, and sometimes someone else's more dubious version, and you won't know which is which, but – whatever, enjoy!

b) Prologue: Ailes and Bannon

This is the most interesting part of the book, and not just because Wolff has the stones to use the word "louche" in a sentence early on (there's an "I went to college, honest" word choice about once every four pages in Fire and Fury). This passage alone sums up 30 years of the history of right-wing thinking:

"Ailes was convinced that Trump had no political beliefs or backbone. The fact that Trump had become the ultimate avatar of Fox's angry common man was another sign that we were living in an upside-down world. The joke was on somebody – and Ailes thought it might be on him."

This is the main theme of the book: That both the Republican establishment (as represented by the likes of Ailes and Murdoch) and the alt-right revolution (as represented by Bannon) think Trump is a fumbled football they can pick up and run into the end zone of power.

In the end, of course, the joke is on everyone, as Trump's brain fumbles hopelessly out of bounds and neither side successfully appropriates his presidency, which becomes an endlessly circular, purposeless, narcissistic tweet-storm.

1. ELECTION DAY


Wolff becomes roughly the 40,000th writer to compare Trump's campaign to The Producers. In classic Hollywood formula-script fashion, the Trump campaign is presented as composed of characters that each have their own desperate motivation to lose, only to each be crushed in their own way by the shocker result.

This chapter reads a lot like Shattered, the acid catalogue of finger-pointing that took place among high-ranking Clinton campaign figures after Hillary's loss, except here it's backwards. In this case, the characters start to blame each other for somehow transforming what Steve Bannon called a surefire "broke dick" loser campaign into a winner.

The only person who truly believed from the start that Trump would win is Melania, who had learned to expect, with religious certainty, that her husband would deliver upon the worst-case scenario in every situation. She was right.

by Matt Taibbi, Rolling Stone |  Read more:
Image: Carolyn Kaster/AP

Beware the Lessons of Growing Up Galapagos

I'm wary of all conclusions drawn about media in the scarcity age, including the idea that people went to see movies because of movie stars. It's not that Will Smith isn't charismatic. He is. But I suspect Will Smith was in a lot of hits in the age of scarcity in large part because there weren't a lot of other entertainment options vying for people's attention when Independence Day or something of its ilk came out, like clockwork, to launch the summer blockbuster season.

The same goes for the general idea that any one star was ever the chief engine for a film's box office. If the idea that people go see a movie just to see any one star was never actually true, we can stop holding the modern generation of movie stars to an impossible standard.

The same mistake, I think, is being made about declining NFL ratings. Owners blame players kneeling for the national anthem, but here's my theory: in an age of infinite content, NFL games measure up poorly as entertainment, especially for a generation that grew up with smartphones and no cable TV and thus little exposure to American football. If I weren't in two fantasy football leagues with friends and coworkers, I would not have watched a single game this season, and that's a Leftovers-scale flash-forward twist for a kid who once recorded the Super Bowl Shuffle to cassette tape off a local radio broadcast just to practice the lyrics.

If you disregard any historical romantic notions and examine the typical NFL football game, it is mostly dead time (if you watch a cut-down version of a game using Sunday Ticket, only about 30 minutes of a 3 to 3.5 hr game involves actual game action), with the majority of plays involving action of only incremental consequence, whose skill and strategy on display are opaque to most viewers and which are explained poorly by a bunch of middle-aged white men who know little about how to sell the romance of the game to a football neophyte. Several times each week, you might see a player hit so hard that they lie on the ground motionless, or with their hands quivering, foreshadowing a lifetime of pain, memory loss, and depression brought on by irreversible brain damage. If you tried to pitch that show concept just on its structural merits you'd be laughed out of the room in Hollywood.

Cultural products must regenerate themselves for each successive age and generation or risk becoming what opera and the symphony are today. I had season tickets to the LA Phil when I lived in Los Angeles, and I brought a friend to the season opener one year. A reporter actually stopped us as we walked out to interview us about why we were there, so mysterious it was to see two attendees who weren't old enough to have been contemporaries of the composer of the music that night (Mahler).

Yes, football has been around for decades, but most of those were in an age of entertainment scarcity. During that time the NFL capitalized on being the only game in town on Sundays, capturing an audience that passed on the game and its liturgies to their children. Football resembles a religion or any other cultural social network; humans being tribal creatures, we find products that satisfy that need, and what are professional sports leagues but an alliance of clans who band together for the network effects of ritual tribal warfare?

Because of its long incubation in an era of low entertainment competition, the NFL built up massive distribution power and enormous financial coffers. That it is a cultural product transmitted by one generation to the next through multiple channels means it's not entirely fair to analyze it independent of its history; cultural products have some path dependence.

Nevertheless, even if you grant it all its tailwinds, I don't trust a bunch of rich old white male owners who grew up in such favorable monopolistic conditions to both understand and adapt in time to rescue the NFL from continued decline in cultural relevance. They are like tortoises who grew up in the Galapagos Islands, shielded on all sides from predators by the ocean, who one day see the moat dry up, connecting them all of a sudden to other continents where an infinite variety of fast-moving predators dwell. I'm not sure the average NFL owner could unlock an iPhone X, let alone understand the way its product moves through modern cultural highways.

Other major sports leagues are in the same boat, though most aren't as oblivious as the NFL. The NBA has an open-minded commissioner in Adam Silver and some younger owners who made their money in technology and at least have one foot in modernity. As a sport, the NBA has some structural advantages over other sports (for example, the league has fewer players, whose faces are seen and who are active on social media in an authentic way), but the league also helps by allowing highlights of games to be clipped and shared on social media and by encouraging its players to cultivate public personas that act as additional narrative fodder for audiences.

I remember sitting in a meeting with some NFL representatives as they outlined a long list of their restrictions for how their televised games could be remixed and shared by fans on social media. Basically, they wanted almost none of it and would pursue take-downs through all the major social media companies.

Make no mistake, one possible successful strategy in this age of abundant media is to double down on scarcity. It's often the optimal strategy for extracting the maximum revenue from a motivated customer segment. Taylor Swift and other such unicorns can release their albums only on CD for a window to maximize financial return from their superfans before releasing the album on streaming services, straight from the old media windowing playbook.

However, you'd better be damn sure your product is unique and compelling to dial up that tactic because the far greater risk in the age of abundance is that you put up walls around your content and set up a bouncer at the door and no one shows up because there are dozens of free clubs all over town with no cover charge. (...)

My other test of narrative value is a variant of the previous compression test. Can you enjoy something just as much by watching only a tiny fraction of the best moments? If so, the narrative is brittle. If you can watch just the last scene of a movie and get most or all the pleasure of watching the whole thing, the narrative didn't earn your company for the journey.

Much more of sports fails this second test than many sports fans realize. I can watch highlights of most games on ESPN or HouseofHighlights on Instagram and extract most of the entertainment marrow and cultural capital of knowing what happened without having to sit through three hours of mostly commercials and dead time. That a game can be unbundled so easily into individual plays and retain most of its value to me might be seen as a good thing in the age of social media, but it's not ideal for the sports leagues if those excerpts are mostly viewed outside paywalls.

This is the bind for major sports leagues. On the one hand, you can try to keep all your content inside the paywall. On the other hand, doing so probably means you continue hemorrhaging cultural share. This is the eternal dilemma for all media companies in the age of infinite content.

by Eugene Wei, Remains of the Day |  Read more:
Image: Curtis Compton/Atlanta Journal-Constitution via AP

Hawaii and Human Error

The Cold War came to an end, somehow, without any of the world’s tens of thousands of nuclear warheads being fired. But there were decades-worth of close calls, high alerts, and simple mistakes that inched world leaders shockingly close to catastrophe.

Saturday’s terrifying, 38-minute episode in Hawaii will not go down as one of those close calls: Residents of the state waited for the bombs to fall after receiving text messages that a ballistic missile was on its way. FCC Chairman Ajit Pai on Sunday said “the government of Hawaii did not have reasonable safeguards or process controls in place to prevent the transmission of a false alert”—a case of human error, in other words.

But the episode did reveal the glaring deficiencies of an early-warning system that can easily misfire, along with some frightening truths about the speed at which policymakers and presidents must make decisions in the event that missiles really do fly. “Mistakes have happened and they will continue to happen,” the Arms Control Association’s Daryl Kimball told me. “But there is no fail safe against errors in judgment by human beings or the systems that provide early warning.”

As such, worries about miscalculation remain vivid. Vipin Narang, a political science professor at MIT focused on nuclear issues, tweeted one scenario on Saturday. “POTUS sees alert on his phone about an incoming toward Hawaii, pulls out the biscuit, turns to his military aide with the football and issues a valid and authentic order to launch nuclear weapons at North Korea. Think it can’t happen?”

The United States operates a series of radar and missile-defense systems across the Pacific. It includes satellites monitoring the Korean peninsula and fleets of American and Japanese warships equipped with the Aegis system, a powerful computing network that detects and tracks missile launches and aircraft. Those systems are tied to the U.S. Strategic Command’s Global Operations Center, buried deep underground in Nebraska, which monitors events around the world in real time and pumps that information to the Pentagon and the White House.

In the Hawaii incident, there was little danger of the United States firing off a nuclear response. Military officials knew within minutes of receiving the alert that there was no threat to U.S. territory; none of the Pentagon and U.S. spy satellites or the ground and sea-based radars detected any sign of missile launches from North Korea, government officials told me.

But with a president obsessed with cable news and Twitter, the erroneous alert could have easily triggered an angry or provocative tweet, which could have been interpreted by the North Koreans or Russians as an imminent threat. According to pool reports, Trump was briefed on the false alarm while at his private golf course in Florida. Hours later, he tweeted about Hillary Clinton’s “missing” emails and the performance of the stock market. He has yet to comment on the incident despite knowing within minutes that all was safe, even as horrified Hawaiians continued to expect the worst. (...)

While the United States has a series of sophisticated early warning systems, potential adversaries do not, making initial statements from American officials critical in tense situations. "We have to be concerned about our adversaries' early warning systems and their interpretation of these signals and messages," Kimball said.

Entering this complex array of political signaling, high-tech surveillance, and careless tweeting is the Pentagon's new Nuclear Posture Review, the first since 2010. Originally slated for release next month, a draft of the document, leaked this past week, shows the Trump administration is lowering the bar for what would trigger an American nuclear response. It includes an entire section about non-nuclear strategic attacks that could spur an American nuclear response: cyber warfare, massive blows to critical infrastructure, and certain catastrophic attacks on civilians.

That is a "major expansion over Clinton, Bush and Obama," all of whom attempted to reduce the role of nuclear weapons, Jon Wolfsthal, a former Obama official who worked on nuclear issues, told me. The new strategy views nuclear weapons as "a Swiss Army knife that can be pulled out to solve a range of issues," he added. Among several new weapons the document proposes are so-called "low-yield nukes," which could be placed on existing Trident ballistic missiles launched from submarines, lowering the threshold for use by causing less fallout, limiting the impact zone, and causing fewer civilian casualties.

As one defense official involved in nuclear issues put it: “We are self-deterred because our nuclear weapons are too big, and would cause too much damage if used.” The new strategy paper, then, expands the types of scenarios under which the United States would choose the nuclear option, which in turn “could lead to a new round of testing of nuclear weapons,” the official said.

by Paul McCleary, The Atlantic |  Read more:
Image: Ben Jennings via The Guardian

Sunday, January 14, 2018

False Ballistic Missile Alert


[ed. I was in Honolulu when the alert went out. My initial reaction? Must be a virus. Nothing on the news, no jets screaming overhead, no sirens blaring, nothing. So I figured, just a spoof and clicked the phone off. Unfortunately, my brother was in Kona at the airport when the warning appeared. All flights were immediately cancelled and TSA operations shut down. We could only laugh afterward. When you think about it, where would you rather be in a real situation - standing in a line several hundred people deep waiting to get through TSA - or taking off in a jet? I'm actually surprised people responded as rationally as they did. No massive car pile-ups. No screaming in the streets. No looting or anything else (no strangling airport security). Just everyone seemingly taking it in stride, like... what can you do?

See also: Missile-Alert Error Reveals Uncertainty About How to React, and What It Felt Like in Hawaii When Warning of an In-Bound Missile Arrived.]

Saturday, January 13, 2018

Audiophilia Forever: An Expensive New Year’s Shopping Guide

Here are some of the most beautiful recorded musical sounds that I have heard in the past few weeks: the matched horns and clarinet, very soft, in Duke Ellington’s “Mood Indigo,” recorded in 1950; Buddy Holly, in his just-hatched-this-morning voice, singing “Everyday,” recorded in 1957; the London Symphony Orchestra in full cry under André Previn, playing Shostakovich’s tragic wartime Symphony No. 8, recorded in 1973; and Willie Watson’s rich-sounding guitar, accompanying him singing “Samson and Delilah,” recorded last year. The source of all these sounds was a vinyl long-playing record.

I tried to quit. I tried to give up audiophilia. You might even say I stopped my ears. That is, I listened to my O.K. high-end audio rig when I could find a few hours, ignoring its inadequacies. But, most of the time, I listened to CDs ripped into iTunes and then played on an iPod with a decent set of headphones. Hundreds of hours of music were inscribed there: Wagner’s “Parsifal” and John Coltrane’s “Blue Train” and the Beatles’ “Rubber Soul”—soul music, indeed! The glories of Western music, if you want to be grand about it, were at my fingertips, and I was mostly content. For years, I relinquished the enthralling, debilitating, purse-emptying habit of high-end audio, that feverish discontent, that adolescent ecstatic longing for more—a better record player, speakers with more bottom weight, a CD player that completely filtered out such digital artifacts as ringing tones, brittleness, and hardness.

Most people listen to music in the way that’s convenient for them; they ignore the high-end stuff, if they’ve even heard of it, as an expensive fetish. But audiophiles are restless; they always have some sort of dream system in their heads. They are ready, if they can afford it, to swap, trade, buy. It’s not enough, for some listeners, to have a good turntable, CD player, streaming box, pre-amplifier, amplifier, phono stage, speakers, and top-shelf wires connecting them all together. No, they also need a power conditioner—to purify the A.C. current. Does it matter, each separate thing? The cables, too? Is it all nonsense? The debates rage on, for those who are interested. At the moment, the hottest thing in audio is “high-resolution streaming”—the hope, half-realized, of getting extraordinary sound through the Internet.

We audiophiles want timbral accuracy. We want the complex strands of an orchestral piece disentangled, voice recordings that reveal chest tones and a clear top, pianos that sound neither tinkly nor dull, with the decay of each note sustained (not cut off, as it is in most digital recordings). We want all that, yet the sound of live music is ineffable. The goal can never be reached. The quest itself is the point. (...)

Yet there’s a serious problem with most of the streaming services: the sound is no more than adequate (exceptions to follow). And therein lies a tale—a tale, from the high-end audiophile’s point of view, of commercial opportunism,betrayal, and, well, audiophile-led redemption. A little potted audio history is now in order.

The first betrayal: in the sixties, Japanese solid-state equipment (Sony, Panasonic, Yamaha, etc.) emerged as a low-cost mass-market phenomenon, driving American quality audio, which had made analog, vacuum-tube equipment, deep underground. The big American names (like Marantz and McIntosh) stayed quietly in business while a variety of engineers and entrepreneurs who loved music started small companies in garages and toolsheds. It was (and is) a story of romantic capitalism—entrepreneurship at its most creative. Skip forward twenty years, to the second betrayal: in 1982, digital sound and the compact disk were proclaimed by publicists and a gullible press as “perfect sound forever.” But any music lover could have told you that early digital was often dreadful—hard, congealed, harsh, even razory, the strings sounding like plastic, the trumpets like sharp instruments going under your scalp. The early transfer of “Rubber Soul,” just to take one example, was unlistenable.

The small but flourishing high-end industry responded to digital in three different ways: it produced blistering critiques of digital sound in the musically and technically literate audiophile magazines The Absolute Sound and Stereophile; it developed CD players that worked to filter out some of the digital artifacts; and it produced dozens of turntables, in every price range, which kept good sound and the long-playing record alive. Years ago, many refused to believe in the LP, but, really, anyone with a decent setup could have proved this to you: a well-recorded LP was warmer, more natural, more musical than a compact disk.

The recording industry woke up, as well: Sony and Philips, which had developed the compact disc together, released, in 1999, a technology called D.S.D. (Direct Stream Digital) and embedded the results in Super Audio CDs—S.A.C.D. disks. Remember them? Some six thousand titles were produced, and the sound was definitely better than that of a standard CD. But the Super Audio CD was swamped by another marketing phenomenon—the creation of the iPod and similar devices, in 2001, which made vast libraries of music portable. So much for S.A.C.D.s—your music library was now in your hand! For me, the iPod was, for long periods, the default way of listening to music. God knows I have sinned. I knew that I wasn't hearing anything like the best.

Which brings us to betrayal No. 3: music was streamed to iPods and laptops by squeezing data so that it would fit through the Internet pipes—the sound, in the jargon, was “lossy.” And that’s the sound—MP3 sound—that a generation of young people grew up with. The essentials of any kind of music came through, but nuance, the subtleties of shading and color, got slighted or lost. High-end types, both manufacturers and retailers, still lament this development with rage and tears. Availability was everything for the iPod generation. Well, yes, of course, says the high end, availability is a great boon. But most of the kids didn’t know that they were missing anything in the music.

Except for the few who did. A growing corpus of young music lovers has, in recent years, become attached to vinyl—demanding vinyl from their favorite groups as they issue new albums, flocking to new vinyl stores. For some, it may be about the sound. Or maybe it's about backing away from corporate culture and salesmanship. Vinyl offers the joys of possessorship: if you go to a store, talk to other music lovers, and buy a record, you are committing to your taste, to your favorite group, to your friends. In New York, the independent-music scene, and the kinds of loyalties it creates, are central to vinyl. In any case, the young people buying vinyl have joined up with two sets of people who never really gave up on it: the scratchmaster d.j.s deploying vinyl on twin turntables, making music with their hands, and the audiophiles hoarding their LPs from decades ago. The audiophile reissue market has come blazingly to life:

by David Denby, New Yorker |  Read more:
Image: Janne Iivonen

Friday, January 12, 2018

How, and Why, the Spectre and Meltdown Patches Will Hurt Performance

As the industry continues to grapple with the Meltdown and Spectre attacks, operating system and browser developers in particular are continuing to develop and test schemes to protect against the problems. Simultaneously, microcode updates to alter processor behavior are also starting to ship.

Since news of these attacks first broke, it has been clear that resolving them is going to have some performance impact. Meltdown was presumed to have a substantial impact, at least for some workloads, but Spectre was more of an unknown due to its greater complexity. With patches and microcode now available (at least for some systems), that impact is now starting to become clearer. The situation is, as we should expect with these twin attacks, complex.

To recap: modern high-performance processors perform what is called speculative execution. They will make assumptions about which way branches in the code are taken and speculatively compute results accordingly. If they guess correctly, they win some extra performance; if they guess wrong, they throw away their speculatively calculated results. This is meant to be transparent to programs, but it turns out that this speculation slightly changes the state of the processor. These small changes can be measured, disclosing information about the data and instructions that were used speculatively.
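
[ed. Not from the article: a minimal sketch of how such small cache-state changes can actually be measured. Cached loads complete much faster than loads that must go to DRAM, so timing a load reveals whether speculation touched that line. The x86 intrinsics are real GCC/Clang ones; the buffer and names are purely illustrative.]

```c
#include <stdint.h>
#include <stdio.h>
#include <x86intrin.h>   /* _mm_clflush, _mm_lfence, __rdtscp (GCC/Clang on x86) */

/* One cache line we will flush, touch, and time. */
static uint8_t line[64] __attribute__((aligned(64)));

/* Time a single load with the timestamp counter; a cached load is much
 * faster than one that must fetch from memory, and that latency gap is
 * the whole side channel. */
static uint64_t time_load(volatile uint8_t *p) {
    unsigned int aux;
    uint64_t t0 = __rdtscp(&aux);
    (void)*p;
    uint64_t t1 = __rdtscp(&aux);
    return t1 - t0;
}

int main(void) {
    _mm_clflush(line);                /* evict the line from every cache level */
    _mm_lfence();                     /* make sure the flush completes first   */
    uint64_t cold = time_load(line);  /* slow: goes to DRAM                    */
    uint64_t warm = time_load(line);  /* fast: the first load cached it        */
    printf("uncached: %llu cycles, cached: %llu cycles\n",
           (unsigned long long)cold, (unsigned long long)warm);
    return 0;
}
```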

With the Spectre attack, this information can be used to, for example, leak information within a browser (such as saved passwords or cookies) to a malicious JavaScript. With Meltdown, an attack that builds on the same principles, the same technique can leak data from kernel memory.

Meltdown applies to Intel's x86 and Apple's ARM processors; it will also apply to ARM processors built on the new A75 design. Meltdown is fixed by changing how operating systems handle memory. Operating systems use structures called page tables to map between process or kernel memory and the underlying physical memory. Traditionally, the accessible memory given to each process is split in half; the bottom half, with a per-process page table, belongs to the process. The top half belongs to the kernel. This kernel half is shared between every process, using just one set of page table entries for every process. This design is both efficient—the processor has a special cache for page table entries—and convenient, as it makes communication between the kernel and process straightforward.

The fix for Meltdown is to split this shared address space. That way when user programs are running, the kernel half has an empty page table rather than the regular kernel page table. This makes it impossible for programs to speculatively use kernel addresses.

Spectre is believed to apply to every high-performance processor that has been sold for the last decade. Two versions have been shown. One version allows an attacker to "train" the processor's branch prediction machinery so that a victim process mispredicts and speculatively executes code of an attacker's choosing (with measurable side-effects); the other tricks the processor into making speculative accesses outside the bounds of an array. The array version operates within a single process; the branch prediction version allows a user process to "steer" the kernel's predicted branches, or one hyperthread to steer its sibling hyperthread, or a guest operating system to steer its hypervisor.
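
[ed. Also not from the article: the array-bounds variant hinges on a victim code pattern like the sketch below (names and sizes are made up). Architecturally the bounds check is always respected; speculatively, a trained predictor can run the body with a malicious index, and the dependent load leaves a cache footprint that a flush-and-timing probe like the one above can read back.]

```c
#include <stddef.h>
#include <stdint.h>

uint8_t array1[16];                /* the array the bounds check protects */
unsigned int array1_size = 16;
uint8_t probe[256 * 4096];         /* one page per possible byte value    */

/* Spectre-style bounds-check-bypass gadget: with an out-of-bounds x,
 * speculation can read array1[x] (memory it shouldn't see) and then load
 * a probe[] line whose index encodes that byte, caching it as a side effect. */
void victim_gadget(size_t x) {
    if (x < array1_size) {
        volatile uint8_t tmp = probe[array1[x] * 4096];
        (void)tmp;
    }
}

int main(void) {
    /* Train the predictor with in-bounds calls, then pass a malicious index.
     * A real exploit would also flush array1_size and the probe buffer, and
     * afterward time each probe line to recover the leaked byte. */
    for (size_t i = 0; i < 1000; i++)
        victim_gadget(i % array1_size);
    victim_gadget(200);            /* out of bounds; only runs speculatively */
    return 0;
}
```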

We have written previously about the responses from the industry. By now, Meltdown has been patched in Windows, Linux, macOS, and at least some BSD variants. Spectre is more complicated; at-risk applications (notably, browsers) are being updated to include certain Spectre-mitigating techniques to guard against the array bounds variant. The branch prediction version of Spectre requires both operating system and processor microcode updates. While AMD initially downplayed the significance of this attack, the company has since published a microcode update to give operating systems the control they need.

These different mitigation techniques all come with a performance cost. Speculative execution is used to make the processor run our programs faster, and branch predictors are used to make that speculation adaptive to the specific programs and data that we're using. The countermeasures all make that speculation somewhat less powerful. The big question is, how much?

by Peter Bright, ARS Technica |  Read more:
Image: Aurich/Getty
[ed. A graduate seminar in micro-processor technology.]

Thursday, January 11, 2018


Gyakusou
[ed. Cool pants.]

Motion Capture Tech For Fixing Your Golf Game

Albert Einstein was once asked if he played golf. “No, no,” said the man who devised the theory of relativity. “Too complicated.” The story has served as a humbling reminder that even geniuses can find golf to be, as Bobby Jones, a co-founder of the Masters Tournament, described it, “a mystifying game.”

But in the past year, golf instructors have begun using an unassuming piece of technology that aims to take the guesswork out of your stroke. MySwing, introduced in late 2016, is a small box with 17 motion-capture sensors that attach to various parts of the body—the shin, the top of the feet, around the arms and chest and forehead. A separate one attaches to the club.

Once the sensors are calibrated on a Windows-based device, a skeletal avatar appears on screen and begins to move with you in real time. Take a few swings, and the feeling is similar to a science-fiction fantasy. (Everyone from Game of Thrones to NASA creates characters using mapping tech from MySwing's Beijing-based parent company, Noitom Ltd.—"motion" spelled backward.) The system re-creates the angles, tilt, and rotations of your swing and plays it back from overhead, behind, and the side.

The key, though, is the software, which produces line graphs and bar charts that tell you whether you need to be more patient with your arms and get your lower body to do a better job of initiating the downswing. It can observe, with sometimes excruciating detail, that the bum shoulder you got from playing college football is costing you 20 degrees on your turn, or that your right leg is overcompensating for a weak left one.

Swing-analyzing technology isn’t new, says golf instructor Ben Shear, who advises top pros such as Luke Donald and hosts the Golfers Edge show on SiriusXM’s PGA Tour Radio channel. But the old systems took an hour to set up, whereas MySwing takes about 20 minutes from start to finish. The sensors attach wirelessly, another first, and can be used indoors or outside. Most important, it’s only $6,000, a relatively affordable piece of equipment for a country club that wants a competitive advantage. (TrackMan Golf, the shot-monitoring technology familiar from television tournament broadcasts, runs closer to $25,000.) (...)

“A lot of golfers are guys who sit behind a desk working 60 hours a week, they’ve got three kids who are all in sports, and they’re driving them everywhere,” Shear says. “They’re not going to get to the gym four times a week. But they still want to know what their physical capabilities are. And then I can build a golf swing around what they can actually do.”

Some limitations may not be physical. “If you can’t chip and putt, then this isn’t going to help you all that much,” Shear says with a laugh. “If you’ve got a 4-footer and you just rolled it by 10 feet, then that’s why you’re not good at golf.”

by James Gaddy, Bloomberg |  Read more:
Image: MySwing

Wednesday, January 10, 2018

After Hours: Off-Peaking

Mr. Money Mustache is in his early 40s, and he has been retired for 12 years. “One of the key principles of Mustachianism,” begins a lofty 2013 post, “is that any and all lineups, queues, and other sardine-like collections of humans must be viewed with the squinty eyes of skepticism.” His blog explains that everything you have been taught about money and time is wrong. Mr. Money Mustache, once the subject of a New Yorker profile, worked as a software engineer and saved half of his salary from the age of 20, and his vision of time is that of an engineer: time becomes a machine that can be tinkered with, hours and minutes rewired to achieve a more elegant purpose. His primary message is that you will not achieve financial security and personal happiness by working harder to get ahead of the pack; you will find these things by carefully studying what the pack is doing and then doing the opposite.

A post entitled “A Peak Life is Lived Off-Peak” extols the virtues of doing everything at the wrong time. The Mustache family lives in Colorado, where everyone goes skiing on the weekends; Mr. Mustache recommends hitting the slopes on Tuesdays. The Mustaches drive through major cities between 10 in the morning and four in the afternoon. Thursday morning is for teaching robotics to his son, whom he homeschools; below-freezing nights in January are for moonlit walks. Holidays are to be taken only when everyone else is at work. “Most people spend most of their time doing what everyone else does, without giving it much thought,” Mr. Money Mustache writes. “And thus, it is usually very profitable to avoid doing what everyone else is doing.”

The Mustaches are not the only online evangelists for the off-peak lifestyle. In a post entitled, “I Want You to Become an Off-Peak Person!” Peter Shankman, an entrepreneur who writes about turning his ADHD to his advantage, recommends grocery shopping at one in the morning. J.P. Livingston’s blog the Money Habit features photos of New York City that make it seem like a small town: a thinly populated subway, a near-empty museum. (The bins in time’s bargain basement seem to be overflowing with Tuesdays: train rides, drinks, meals, museum visits, and movies are cheaper when they happen on what is referred to in Canada as “Toonie Tuesdays,” in Australia as “Tight-Arse Tuesdays.”)

The thesis of off-peak evangelism is summed up by one of Mr. Mustache’s calls for a rejection of conformity: “In our natural state,” he writes, “we are supposed to be a diverse and individualistic species.” It is natural, he argues, for individual schedules to vary — why should we all expect to eat, sleep, work, and play in lockstep, like members of a militaristic cult? Standardized schedules create waste and clog infrastructure. Off-peak evangelism proposes a market value to individuality and diversity as mechanisms for repurposing humanity’s collective wasted time. While not a formalized movement, people who blog about off-peaking often seem to feel that they’ve discovered a secret too good to keep to themselves — something that was right in front of us the whole time, requiring only that we recognize our own power to choose.

Off-peaking is the closest thing to a Platonic form of subculture: its entire content is its opposition to the mainstream. As an economic approach, the solution off-peaking proposes can seem unkind — it’s a microcosm of the larger capitalist idea that it is right to profit from the captivity of others. And yet off-peakers only want, in effect, to slow time down by stretching the best parts of experience while wasting less. The arguments for off-peaking have centered on both the economic and the social advantages of recuperating unexploited time, like a form of temporal dumpster-diving that restores worth to low-demand goods. (...)

Taken at its most individualistic, it can seem that the idea of off-peaking is not to free everyone from the bonds of inefficiency, but to position oneself to take advantage of the unthinking conformity of others. Success depends upon continued brokenness, not on fixing what is broken — or at least, on fixing it only for oneself and a canny self-selecting few. In this view, off-peaking is a miniaturized entrepreneurialism that exploits a wonky blip in the way slots of time are assigned value; a matter of identifying an arbitrage opportunity created by the system's lack of self-awareness.

The comment sections of off-peakers’ blogs are, paradoxically, bustling: stories of going to bed at nine and waking up at four to ensure that the day is perfectly out of step; Legoland on Wednesdays in October; eating in restaurants as soon as they open rather than waiting for standard meal times. There’s a wealth of bargains to be had by juggling one’s calendar to take advantage of deals. (The app Ibotta, which tracks fluctuating prices on consumer goods popular with millennials, determined that Tuesdays are actually the worst days to buy rosé and kombucha; you should buy them on Wednesdays. Avocados are also cheapest on Wednesdays, while quinoa should be bought on Thursdays and hot sauce on Fridays.) Many posters write that they are considering changing professions or homeschooling their children to join the off-peakers.

Some off-peakers are motivated by savings, some by avoiding crowds, but off-peaking also offers a more abstract pleasure: the sheer delight in doing the unexpected. The gravitas attached to the seasons of life listed off in Ecclesiastes is echoed in the moral overtones attached to perceptions of what is appropriate for different hours of the day. It is wrong to laugh when everyone else is weeping or to embrace when everyone else is refraining from embracing. Ordinary activities become subversive when done at the wrong time: eating spaghetti for dinner is ordinary, but having linguini with clam sauce for breakfast breaks the unwritten rules. Once you start transgressing, it can be hard to stop: The arbitrariness of custom begins to chafe.

But off-peakers are generally not hoping to be completely solitary in their pursuits; most people don’t want to be the only person in their step-aerobics class at two in the afternoon. Instead, they want to be one among a smaller, more manageable group than urban cohorts tend to allow. Subcultures offer the pleasure of being different along with the pleasure of being the same; variation becomes a passport to acceptance. The two people who encounter one another at the aquarium on a Wednesday morning appear to have more in common than the two hundred people who see each other there on a weekend. Like other choices that divide people into subsets, off-peaking allows its adherents to discover a kinship that may or may not reveal a significant similarity in worldview.

by Linda Besner, Real Life |  Read more:
Image: Movie Theater, Los Angeles by Ed Freeman
[ed. The New Yorker link on Mr. Money Mustache is a great read in itself.]

The Breeders

The Strange Brands In Your Instagram Feed

It all started with an Instagram ad for a coat, the West Louis (TM) Business-Man Windproof Long Coat to be specific. It looked like a decent camel coat, not fancy but fine. And I’d been looking for one just that color, so when the ad touting the coat popped up and the price was in the double-digits, I figured: hey, a deal!

The brand, West Louis, seemed like another one of the small clothing companies that has me tagged in the vast Facebook-advertising ecosystem as someone who likes buying clothes: Faherty, Birdwell Beach Britches, Life After Denim, some wool underwear brand that claims I only need two pairs per week, sundry bootmakers.

Perhaps the copy on the West Louis site was a little much, claiming “West Louis is the perfection of modern gentlemen clothing,” but in a world where an oil company can claim to “fuel connections,” who was I to fault a small entrepreneur for some purple prose?

Several weeks later, the coat showed up in a black plastic bag emblazoned with the markings of China Post, that nation’s postal service. I tore it open and pulled out the coat. The material has the softness of a Las Vegas carpet and the rich sheen of a velour jumpsuit. The fabric is so synthetic, it could probably be refined into bunker fuel for a ship. It was, technically, the item I ordered, only shabbier than I expected in every aspect.

I went to the West Louis Instagram account and found 20 total posts, all made between June and October of 2017. Most are just pictures of clothes. Doing a reverse image search, it's clear that the Business-Man Windproof Long Coat is sold throughout the world on a variety of retail websites. Another sweatshirt I purchased through Instagram—I tracked down no fewer than 15 shops selling the identical item. I bought mine from Thecuttedge.life, but I could have gotten it from Gonthwid, Hzijue, Romwe, HypeClothing, Manvestment, Ladae Picassa, or Kovfee. Each very lightly brands the sweatshirt as its own, but features identical pictures of a mustachioed, tattooed model. That a decent percentage of the brands are unpronounceable in English just adds to the covfefe of it all.

All these sites use a platform called Shopify, which is like the WordPress or Blogger of e-commerce, enabling completely turnkey online stores. Now, it has over 500,000 merchants, a number that's grown 74 percent per year over the last five years. On the big shopping days around Thanksgiving, they were doing $1 million in transactions per minute. And the "vast majority" of the stores on the service are small to medium-sized businesses, the company told me.

Shopify serves as the base layer for an emerging ecosystem that solders digital advertising through Facebook onto the world of Asian manufacturers and wholesalers who rep their companies on Alibaba and its foreigner-friendly counterpart, AliExpress.

It’s a fascinating new retail world, a mutation of globalized capitalism that’s been growing in the cracks of mainstream commerce.

Here’s how it works.

“What is up everybody?!” a fresh-faced man with messy brown hair shouts into the camera. Behind him, two computers sit open on a white desk in a white room. By the looks of him, he might not be an adult, but he has already learned to look directly into the camera when delivering the ever-appealing gospel of Easy Money on the Internet.

“In this challenge, I’m going to take a brand new Shopify store to over one thousand dollars,” he says. “So I invite you to follow along with me as I take this brand new store from 0, literally 0, to over one thousand dollars in the next 7 days.”

In the corner of YouTube dedicated to e-commerce, these videos are a bit of a phenomenon, racking up hundreds of thousands of views for highly detailed explanations of how to set up an e-commerce shop on the Internet.

Their star is Rory Ganon. Though his accent is Irish (“tousand”), his diction is pure LA YouTuber. He’s repetitive, makes quick cuts, and delivers every line with the conviction of youth. He appears to live in Ratoath, a small Irish commuter town about half an hour outside Dublin. His Facebook page describes him as a 17-year-old entrepreneur.

His success finding an audience seems predicated on the fact that when he says he’s going to show you everything, he really is going to show you everything. Like, you will watch his screen as he goes about setting up a store, so anyone can follow along at home. He’s a Bob Ross of e-commerce.

These techniques work the same for him as for Gucci. Some Instagram retailers are legit brands with employees and products. Others are simply middlemen for Chinese goods, built in bedrooms, and launched with no capital or inventory. All of them have been pulled into existence by the power of Instagram and Facebook ads combined with a suite of e-commerce tools based around Shopify.

The products don’t matter to the system, nor do they matter to Ganon. The whole idea of retail gets inverted in his videos. What he actually sells in his stores is secondary to how he does it. It’s as if he squirts hot dogs on his ketchup and mustard.

What Ganon does is pick suppliers he’ll never know to ship products he’ll never touch. All his effort goes into creating ads to capture prospective customers, and then optimizing a digital environment that encourages them to buy whatever piece of crap he’s put in front of them.

And he is not alone. (...)

Ganon’s videos are particularly fascinating in describing the mechanics of digital advertising through Instagram and Facebook.

In the tutorial, he briefly discusses finding a niche for the products in your store, and he uses some business school powerpoint terms. But when he actually selects a niche, it is Lions. That’s right: Lions, the animals.

by Alexis C. Madrigal, The Atlantic |  Read more:
Image: Alexis Madrigal