Sunday, October 20, 2019

Bob Dylan



[ed. What a weirdo (and he likes it like that). Lyrics]

John Rawlings, Lauren Bacall
via:

Heel Turns: The History of Modern Celebrity

When future historians study these troubled times, they will marvel at the relentless rise of sea levels, strongman politics and Kardashians. The fame-babies of a double murder (their father Robert Kardashian represented O. J. Simpson), the Kardashians and their extension pack, the Jenners, morphed from Los Angeles socialites into seemingly inevitable magnets of scandal, desire and money. Kim set the pace with a leaked sex tape in 2007, teaching the clan to cheerfully break the boundaries of good taste and common sense, to absorb the energy of the world’s criticism and translate it into cash. Keeping Up with the Kardashians, a reality show centred on the lives and careers of the family, first aired in late 2007 and is now in its sixteenth series. In 2014, Kim posed for Paper Magazine holding a champagne bottle, the foamy liquid squirting over her head and into the glass perched on her extended backside. Critics noted the channelling of the nineteenth-century Khoikhoi woman Sara “Saartjie” Baartman and debated whether Kim understood that she was the butt of an old racist joke. That year she made $28 million, overtaking Meryl Streep, Stephen King and J. K. Rowling on Forbes’s list of highest-paid celebrities. Her little sister, meanwhile, plumped her lips with filler, lied about it, and became the unwitting namesake of the “Kylie Jenner Lip Challenge”, in which masses of people pressed their lips into shot glasses, sucked as hard as possible, and then recoiled in horror at their own self-mutilation. Kylie responded by selling lipstick and, at twenty-one, quietly became the world’s youngest billionaire. No doubt the gang’s current racket hawking laxative teas, diet lollipops and candy-coloured vitamins will leap right over the naysayers and fuddy-duddies to reach the kids who can truly appreciate it. Like Antaeus drawing his fighting strength from the earth, the family is invigorated by Mother Notoriety, growing more powerful every time it seems to fall.

The Kardashian-Jenners have all the external trappings of charisma without its sacred core. This makes them useful for understanding the phenomenon of celebrity, much as a body whose soul has departed is handier for studying anatomy. They are famous for being famous, but why, after all, are they famous? Why, of all the personal stylists, exhibitionists and rich kids in Calabasas, CA, did they become such magnets for attention? You may not be one of Kim’s 143 million Instagram followers but you do know who she is.

There are two ways of telling the story of celebrity, and both are true. The first narrative holds celebrity to be a modern invention. There were always famous people, but they made their names through great deeds and works and with an eye to posterity. Glory usually came after death, in monuments and songs and rumours of miracles near their graves. They were kings and heroes and saints, embodiments of the highest and most precious values of their communities. Their example inspired the young and chastened the reprobate. Their touch healed the sick, their flesh a direct conduit to the divine. Then came modernity, with wires and steamships and women shameless enough to strut the stage. A celebrity changed from a man who had done useful, important things in the world to an entertainer, often female and young, with a knack for fascinating audiences. The religious fanatic transformed into a fan, eager for stolen glimpses of the beloved star, hungry for private gossip and salacious revelations, ready to buy an endorsed cigarette or shoe or perfume for the feeling of having come closer to her image. The internet sped up the process and took it to its inevitable conclusion. Celebrity became its own performance. Reality itself turned into a show, and ordinary people began to polish their personal brand. Fame was the accomplishment, the great deed, the healing salve, the song that sang itself.

The second version of the story is not as breathless, and suggests that celebrity has been around much longer. Even when women were kept from performing in the high drama of the ancient Roman stage, some captivated audiences with dance and music and bawdy mime. There was usually someone around to say it was a bad idea. In his treatise “On the Spectacles”, the second-century Christian writer Tertullian railed against the cross-dressing actors, pantomimes and women prostitutes on the stage, claiming that the entire allure of the theatre lay in its filth. The great heroes of old were contradictory figures, too: Mark Antony’s fame came with a dose of scandal and erotic transgression, as did Joan of Arc’s. Before being canonized and neutralized, saints and prophets enchanted their followers by refusing contemporary notions of the good life. Their disciples sought out the places where they had slept and suffered, travelled to touch a slip of skin or cloth or hair. Kings and queens may have paid less attention to their great deeds and more to their public image, to the masques and poems and ceremonies that cemented their exceptional status. Some, such as Elizabeth I, had a talent for turning a personal failing (her lack of children, for example) into evidence of divine nature. Contemporary celebrity culture is a pumped-up, sped-up version of an old dance between people who want to be special and the folks who want to watch them try.

In The Drama of Celebrity Sharon Marcus takes a middle path between these two narratives. Marcus acknowledges the long prehistory of modern-day stardom, but focuses on the flowering of celebrity culture in the West since the eighteenth century. In lucid prose, she describes celebrity as a drama with three main characters: celebrities, the public that adores and judges them, and the media producers who exalt, criticize and satirize. The star of the book is Sarah Bernhardt, the genial actress and calculatedly charismatic “godmother of modern celebrity culture”, whose success in shaping her public persona in the late nineteenth and early twentieth centuries was unprecedented. Marcus introduces a predictable supporting cast – Elvis Presley, Marilyn Monroe, Anna Pavlova, Madonna – but Bernhardt remains the magnetic centre of the story. The book reproduces a rich trove of archival material which, if it does not bring Bernhardt back to life, at least reveals the scintillating liveliness of her image a century ago. Photographs, engravings, paintings, fan scrapbooks, outlandish caricatures, letters and diaries all speak to Bernhardt’s hold on the public attention.

Bernhardt’s methods may sound familiar. She took little account of society’s rules for women or even of its lowered expectations of actresses. We might expect this in the sexual arena, and indeed, Bernhardt had a child out of wedlock, briefly married a much younger man, and had a long, possibly intimate relationship with another woman, the painter Louise Abbéma. But Bernhardt outraged in other ways, too, breaking her contract with the Théâtre-Français, managing her own productions, flitting around in a hot-air balloon and writing a book about her travels in the clouds. Her very body was an affront, slender at a time when public taste preferred plump and curvy women. Marcus explains the appeal of celebrity scandal as a kind of wish fulfilment. While most people who break the rules are stigmatized as a result, defiant celebrities – and notorious politicians – do not lose face. They hold out the promise of winning all of society’s rewards – money, fame, adoration – while ignoring its precepts. Obedient, scared mortals need not suffer the penalties of nonconformism to enjoy its pleasures: they can watch a star do it for them.

by Irina Dumitrescu, TLS |  Read more:
Image: Kevin Tachman/MG19/Getty Images for The Met Museum/Vogue

We Need A Truth-In-Advertising Commission For Voters

It was a pretty good troll. On Friday, Elizabeth Warren’s campaign ran an ad on Facebook saying that the social media giant had endorsed President Trump.

It hasn’t, of course — as the ad acknowledged a few sentences later. The goal, which it achieved and then some, was to draw attention to Facebook’s recent refusal to take down a Trump campaign ad that makes an objectively false claim about former Vice President Joe Biden and Ukraine, and the threat that stance poses to fair elections. Let’s see how you like having lies told publicly about you, Warren was saying to Mark Zuckerberg and company.

Facebook’s decision on the Biden ad wasn’t a one-off. As Warren later noted on Twitter, Facebook recently tweaked its policies to make its stance on false political content even more hands-off than before. “It is not our role to intervene when politicians speak,” a company VP wrote in explaining the move.

Attacks that play fast and loose with the facts may seem like the kind of hardball politics that’s gone on forever. After all, in 1828, Andrew Jackson’s campaign falsely accused President John Quincy Adams of pimping out an American girl to the Russian czar earlier in his career. But when voters make decisions based on false information spread virally to millions, the damage to the integrity of our elections is profound.

Think of it this way: We understand the harm done by voter suppression schemes that tell people the wrong location for their polling place or that the election is Wednesday not Tuesday, and the Brennan Center has helped draft legislation to crack down on them. There’s also the danger from “deep fakes” — manipulated images, spread online, that aim to falsely discredit or embarrass a candidate — a threat California recently aimed to tackle with a new law. Why should false facts aimed at affecting voters’ decisions be treated differently?

Or consider a different analogy: if advertisers make false claims about their own products or their competitors’, they can be fined by the Federal Trade Commission (FTC). That’s because we recognize that the free market can’t function effectively if consumers don’t have accurate facts — just as voters need accurate facts for free elections to work properly.

This isn’t just about Facebook. It shouldn’t be up to for-profit companies alone to decide which campaign messages can responsibly be aired and which aim to mislead voters. Instead, that should be done by an entity whose only goal is to further the public interest.

That’s why we need a neutral government regulator tasked with ensuring misinformation doesn’t undermine our elections. In my perfect world, this body would be empowered to block or punish false or substantially misleading campaign speech — whether in the form of campaign ads or comments by candidates and their backers. But the Supreme Court’s broad reading of the First Amendment makes that a nonstarter for the foreseeable future. Indeed, as of 2014, 27 states barred false political statements, but four of those laws have since been struck down. And state-level bans seem poorly suited to regulate campaign speech that, especially in a presidential race, is national in reach.

So for now, this body would function simply as an authoritative fact-checker, stamping “False” — or perhaps in some cases, a designation like “Unproven” or “Dubious” — on any political communication that merited it based on a careful, transparent investigation, just as health authorities stamp warnings on cigarettes.

That’s a role that appears to put it on much firmer constitutional ground than if it were authorized to actually block false speech. And it still falls well short of some measures adopted by other advanced democracies, including Canada, which criminalizes “knowingly making or publishing a false statement of fact in relation to the personal character or conduct of a candidate with the intention of affecting the result of an election.”

Of course, voters are deceived not only by ads that make flatly false claims, but also by broader attacks or storylines that are based on a false or misleading premise, even if no specific assertion is narrowly untrue. These kinds of made-up scandals can often be even more damaging than narrowly false statements, since the mainstream media has more trouble ignoring them. A prime example is the Biden-Ukraine story — in which Trump and his allies charge that Biden had Ukraine’s top prosecutor fired in order to protect Biden’s son, even though the evidence shows Biden did no such thing — which despite being essentially false, may have done real damage to the former vice president’s campaign.

That’s why this body might also be authorized to declare such storylines broadly illegitimate. Yes, that would force it to make more subjective judgments about which attacks are fundamentally bogus and which are valid. But again, the FTC is authorized to go beyond narrow true or false determinations when considering whether an ad misleads consumers. Why should voters get less protection?

Perhaps the most obvious danger of a body like this is that it would be captured by one side. After all, giving Trump the power to upgrade his claims of “fake news” into official government rulings would be disastrous. So the issue of how its members would be chosen to ensure it stayed unbiased would be crucial. But that concern shouldn’t kill this conversation in its cradle. Perhaps the answer is to split membership between the two parties, or maybe it’s to try to ensure that commissioners are genuinely nonpartisan. But if states can create independent commissions to handle redistricting — an area that’s no less politically fraught — we shouldn’t assume it’s impossible to do the same here.

Of course, in the current political climate, plenty of highly engaged partisan voters will automatically view the decisions of a government panel — however fairly its members are appointed, and however transparent its decision-making — as illegitimate if the decision hurts that voter’s favored candidate. But having a claim officially declared false might still make an impression on some swing voters. More importantly, it could lead news outlets and fair-minded opinion-shapers to avoid amplifying the message, ultimately starving it of oxygen.

by Zachary Roth, TPM | Read more:
Image: Robert Alexander/Getty Images
[ed. Everyone hates the refs, but at least accept some objective authority calling the penalties.]

Grant Snider
via:

Revolution? There's an App For That.

Tsunami Democràtic is a radical, decentralized wing of the resurgent Catalan independence movement, centered around an anonymously authored app designed to coordinate revolutionary uprisings.

The Tsunami Democràtic app embodies the "be water" motto of the Hong Kong uprising and builds on the Sukey anti-kettling app from the UK's 2011 student protests: it can only be activated by scanning a QR code from an existing member, and once it is activated, it places you in a "cell" with nearby users and shows you actions taking place nearby – measures designed both to coordinate protests and to limit the exposure when the police get ahold of the app.
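
That invite-and-cell pattern is simple to model. Here is a minimal, hypothetical sketch in Python of the mechanism described above (one-time QR invite tokens, membership grouped into local cells). Every name and parameter in it is invented for illustration; the app's own code has only been partially published, so this is a toy model of the pattern, not the real implementation.

```python
import secrets
from collections import defaultdict

class Network:
    """Toy model of QR-invite activation and location-based 'cells'.
    Hypothetical illustration only; not the real Tsunami Democratic code."""

    def __init__(self, cell_size_deg=0.05):
        self.cell_size = cell_size_deg   # roughly 5 km grid cells at these latitudes
        self.tokens = set()              # outstanding invite tokens issued by members
        self.cells = defaultdict(list)   # cell id -> member names

    def issue_qr(self):
        """An existing member generates a one-time invite token (shown as a QR code)."""
        token = secrets.token_urlsafe(16)
        self.tokens.add(token)
        return token

    def join(self, name, token, lat, lon):
        """Activation requires a valid token; the new member only sees their own cell."""
        if token not in self.tokens:
            raise PermissionError("app can only be activated via a member's QR code")
        self.tokens.remove(token)        # one-time use limits exposure if police get a copy
        cell = (round(lat / self.cell_size), round(lon / self.cell_size))
        self.cells[cell].append(name)
        return cell                      # member is shown only nearby users and actions

net = Network()
qr = net.issue_qr()                           # an existing member shares this with a recruit
print(net.join("alice", qr, 41.3874, 2.1686)) # Barcelona coordinates -> alice's local cell
```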

The app is a sideloaded Android app with no iOS version, meaning that neither Google nor Apple can remove it from their stores under pressure from Madrid (Apple bans sideloading, which is why it's Android-only).

The app was first made available on Oct 14 and in-app messages have promised its first major use tomorrow, on Oct 21. The app's had more than 270,000 downloads.

The app is a fork of an existing tool, Retroshare, and some of its source has been published for inspection. No one is sure whether the fork was created by a team of programmers or a dedicated individual, and without a full code audit, it's impossible to say whether it is either maliciously or accidentally exposing its users.

This is essentially a reworking of the revolutionary tactical doctrine set out by Heinlein in his 1966 science fiction novel The Moon is a Harsh Mistress (which also served as the inspiration for Ian McDonald's incredible Luna trilogy).

But another theory is also gaining ground. “I think it's a change of strategy of the main groups, which were involved in the first of our referendum two years ago,” says Luján. He believes that Tsunami Democràtic is a proxy group for the larger separatist organisations, and former members of the former Catalan government, currently residing in Brussels after fleeing the country in 2017.

Some Catalan politicians – including the president of the Generalitat, Quim Torra; its vice president, Pere Aragonès; and the president of the Parliament, Roger Torrent – have publicly supported the group on social media. Tsunami Democràtic denies any link.

Spain’s interior ministry has expressed the desire to discover who is behind the group and the app, but this will likely be difficult – given it could be set up and run from anywhere in the world.

Catalonia has created a new kind of online activism. Everyone should pay attention [Laurie Clarke/Wired UK]

by Cory Doctorow, Boing Boing |  Read more:
Image: uncredited

Radical Survival Strategies for Struggling Colleges

When Steve Thorsett crunched the numbers, things looked grim.

Business was flagging. His flow of customers had fallen to a 10-year low, down nearly 20 percent since 2015. By the following year, annual expenses were outpacing operating revenues by $14 million.

In an increasingly unforgiving market, Mr. Thorsett needed to do more than chip away at the margins of this problem. He could make cuts, but that was complicated in his industry, and would likely only speed the downward spiral. To differentiate himself from his competitors, this chief executive determined that his operation needed to grow bigger, not smaller.

So Mr. Thorsett took a classic shortcut to expansion. He found a partner that was on even shakier ground. The resulting acquisition will bring with it several hundred new consumers, allowing efficiencies of scale that can lower costs.

Now Mr. Thorsett radiates optimism about the future — something rare these days among his counterparts, many of whom face challenges as bad as or worse than he did.

Mr. Thorsett is the president of Willamette University, part of a higher education sector grappling with a sharp decline in enrollment and financial challenges that cry out not for incremental change, but for radical solutions. Colleges and universities that fail to adapt risk joining the average of 11 per year that the bond-rating firm Moody’s says have shut down in the last three years.

Thanks, among other reasons, to a decline in the number of 18-year-olds and low unemployment luring potential students straight into the work force, enrollment is down by more than 2.9 million since the last peak, in the fall of 2011, according to the National Student Clearinghouse Research Center. More than 400 colleges and universities still had seats available for freshmen and transfer students after the traditional May 1 deadline to enroll for this fall, the National Association for College Admission Counseling reports.

More are likely to go under; Moody’s projects that the pace of closings will soon reach 15 per year. Yet when asked what steps they are taking to avoid this fate, some campus leaders responded like the president of one small private liberal arts college in Pennsylvania. It would, he said, “continue to graduate students who will make a tangible and constructive difference in the world.”

The crisis has advanced beyond the point where those sorts of good intentions are enough, Mr. Thorsett said. He and others in higher education have been actively searching for concrete new ways to rebuild enrollment and produce much-needed revenue.

“This is a business,” Mr. Thorsett said. “It’s not for profit, but we have to keep the lights on. We have to build a model that’s sustainable.”

One way is through acquisitions like the one his university has made of the Claremont School of Theology in California, or C.S.T., which is being moved to the Salem, Ore., campus of Willamette, just as private companies might do to increase their size and cost-effectiveness.

The pace of mergers and acquisitions is predicted to pick up so quickly that the self-described first full-service university and college merger consulting firm, Higher Ed Consolidation Solutions, hung out its shingle in August. “Will there be more? Yeah, we’re betting on it,” said Brian Weinblatt, the firm’s founder.

Colleges are also working to reduce the number of dropouts, on the principle that it’s cheaper to provide the support required to keep tuition-paying students than to recruit more. A few are pushing job and on-time graduation guarantees as selling points. Several are getting into the business of corporate training, which is lucrative because employers foot the bill for workers who don’t need financial aid or fitness centers.

Many institutions are adding programs tied to real-time workplace demand, including online courses that appeal to people who are balancing their educations with families and work. Some are even squeezing small amounts of money from such things as renting out their dorm rooms in the summers on Airbnb, catering weddings and licensing their logos for products including (in the case of 48 universities and colleges) caskets and urns.

“You have to be thinking beyond the current business model, whoever you are,” said Stephen Spinelli Jr., president of Babson College, whose Academy for the Advancement of Global Entrepreneurial Learning makes money for the business university by training educators worldwide how to teach entrepreneurship. “That’s what higher education is going to have to do if it’s going to survive.”

Distinguishing itself is also part of Willamette’s even more aggressive strategy in acquiring the 134-year-old C.S.T., which was suffering multimillion-dollar annual shortfalls that, unlike Willamette, it could not make up from its endowment.

Among the institutions Willamette considers its competitors are small liberal-arts colleges such as Reed and Whitman. But it has something they don’t: several graduate divisions (Reed offers one master’s degree in liberal studies) and a goal of increasing its enrollment from the current 2,700 to 4,000 over the next 10 years, starting with about 400 from the theology school.

“‘Midsize university’ is a sweet spot,” said Mr. Thorsett, who is working to position his school as small enough to promise personal attention but big enough to offer lots of choice, while not coincidentally lowering per-unit costs by serving a larger student body. “The university nature of our institution lets us do things our competitors can’t do.”

by Jon Marcus, NY Times |  Read more:
Image: Ashlee Culverhouse/Chattanooga Times Free Press, via Associated Press
[ed. Tip: The NY Times paywall seems to have gotten tighter lately, so if you're having trouble accessing articles I'd recommend Cookie Remover (Chrome extension). Another tip for articles that require disabling your ad-blocker: just cut and paste the url in Outline.]

Saturday, October 19, 2019

Surveying the Entire Great Barrier Reef

Watching and Its Implications

In the 35 years that I have followed boxing, I’ve witnessed perhaps a dozen fighters killed or catastrophically injured. Indeed, most fights are in part spectacles of risk and are marketed as such. I once left an arena with the blood of the great American cruiserweight Steve Cunningham spattered over my lapels, a measure of the cost he paid in a fight that he nearly won. It was a thrilling evening, not because Cunningham shed blood but because he held his dignity against great odds fighting a much younger and stronger man. He was down four times, but he ended the bout on his feet. I hope that Cunningham, who has since retired, has no regrets.

Once among America’s most popular sports, boxing was brought low by its inveterate corruption—and the National Football League was a primary beneficiary of its downfall. Professional football’s violence is somewhat more sanitized, and it replaces the ethnic tribalism of boxing with slightly less corrosive regional loyalties. It’s therefore more appealing to the affluent audiences that advertisers want to reach. We’re discovering, though, that football is nearly as dangerous as boxing, and in the same way. Head trauma, it turns out, doesn’t discriminate.

A number of writers have recently suggested that moral disgust should cause spectators to turn their backs on football, or more saliently, to turn off their televisions. Though I disagree with this argument, we should take it seriously. A sports spectator is implicated in the violence of the games he watches, if only because the games wouldn’t be played without him. The NFL’s enormous television audience—now 45 percent female—creates the incentive structure that induces players to take risks with their health. The categorical imperative works like this: if no one watched football, the games wouldn’t be played, at least not on the same scale; less football would mean less head trauma; head trauma can generate debilitating chronic conditions in later life; therefore, we should not watch football.

Harvard professor Steven Pinker claims in his book The Better Angels of Our Nature that, notwithstanding what we see on the evening news, the broad arc of human history is moving away from violence and toward cooperation and community. Evolutionary biology increasingly rewards those who channel their aggression into productive work. This argument has enormous intuitive appeal, but in the end, it’s as much narrative as science, and the story may seem to take one shape in Cambridge and quite another in Camden or Kabul. For now, it remains true that violence is endemic to human life. The terms on which we engage it may be more or less in our control, depending upon our environment, but it finds us all eventually.

The ethical implications of spectatorship have inspired many books, though it’s doubtful as to how practically useful that literature is for people deciding whether they can permit themselves to watch a college football game or a Mixed Martial Arts fight. A leading work in the field is Susan Sontag’s Regarding the Pain of Others, a book-length essay that explores war photography and its effect on the viewer, securely away from the front lines. Sontag doesn’t speak directly where circumlocution is possible, and what she believes about the ethics of spectatorship is not always evident, but she gave the subject a vocabulary. She notes that war photography often recapitulates an insidious process of war itself—the depersonalization of the individual. Wars kill indiscriminately and in large numbers, and the images captured by war photographers are often of soldiers destined to remain anonymous.

Surely the opposite is true in sports, however. A fighter who dies in the ring retains his personhood. Indeed, he gains a kind of unsought immortality. Any serious fight fan can toll the names: Benny Paret, Davey Moore, Duk-Koo Kim—or 27-year-old Patrick Day, who died on Wednesday from injuries suffered in a bout last weekend. (...)

Each of us has his own ethical economy. Often that economy stems from how we were raised: with religion or without; with or without respect for the law; with our eyes trained to local or universal concerns. That some retired athletes suffer from terrible injuries incurred from competition is a fact that we should not turn away from. But what is the precise ethical principle being invoked by the claim that we may not watch contests in which adults voluntarily compete, and with the usual social goods as their quarry—money, fame, sexual opportunity—along with psychic benefits impossible to quantify? We know that athletes derive enormous satisfaction from pursuing excellence in sports, in addition to significant material benefits. Most of those walking around on artificial knees say that they would do it all again, even knowing the costs. We should not lightly assume that athletes don’t know what is good for them—or that we are responsible for protecting them from themselves.

by Jonathan Clarke, City Journal | Read more:
Image: Dustin Bradford/Getty Images

Friday, October 18, 2019

Flacks and Figures

I'm getting paid $1,000 for this article. Last year, I made roughly $50,000 between a 7:30 a.m. to 3:30 p.m. freelance gig writing celebrity news and publishing some one-off articles. I grew up middle class, though my divorced father eventually worked his way well into the upper-middle class. Financially speaking, I’m fine, though I live alone in Toronto, and I likely won’t be able to afford a house unless my parents die or my dad provides the cash for a down payment. You probably don’t need to know these details, but it may color what I say next: it is my opinion that wealthy journalists should disclose their wealth when matters of finance, taxation, or any public policy they report on will affect their bottom line.

Back in January, Anderson Cooper, scion of the Vanderbilt family, conducted a one-on-one 60 Minutes interview with the newly sworn-in congressional representative from New York’s 14th District, Alexandria Ocasio-Cortez. The splashy interview generated its biggest moment when Cooper suggested that Ocasio-Cortez’s policy agenda of Medicare for All and the Green New Deal was “radical,” asking her, “Do you call yourself a radical?” “Yeah. You know, if that’s what radical means, call me a radical,” she responded, defiantly.

Less viral but more telling was the exchange leading up to that moment, with Cooper pressing Ocasio-Cortez about the revenue needed to pay for her programs. “This would require, though, raising taxes,” he said, as though the very notion were absurd. When Ocasio-Cortez agreed that “people are going to have to start paying their fair share in taxes,” Cooper pressed her again, almost annoyed: “Do you have a specific on the tax rate?” This gave the first-year congresswoman space to explain top marginal tax rates because Cooper and the 60 Minutes producers evidently had no interest in doing so themselves. Which gets to what was so clarifying about the back-and-forth: not Cooper’s questions about how a politician intended to pay for her agenda, but his disbelief verging on indignation at the prospect of a tax increase for the wealthiest Americans. It’s an idea with broad popular support, though perhaps not among the Vanderbilts.

Imagine, for a moment, if, at the top of the segment, Cooper had told his audience—reminded them—that he is a multimillionaire. That he is the primetime anchor at one of the country’s biggest cable news outlets. Though CNN and CBS don’t disclose the value of their contracts with on-air talent, pegging Cooper’s earnings in the tens of millions isn’t a stretch. Take a look at Megyn Kelly’s $30 million exit package from NBC News—after being fired for being racist, no less!—and you’ll get a good sense of the exorbitant salaries networks pay their top anchors. So, imagine it. Cooper, before launching into a loaded line of questioning about Ocasio-Cortez’s tax policy, openly states to the audience, “In the interest of full-disclosure: I, Anderson Cooper, heir to a vast fortune, currently make more money per year than you plebs at home could dream of, and I would be directly affected by Ocasio-Cortez’s proposed 70 percent marginal tax on incomes over $10 million.” Would he then have had the gall to highlight the tax increase? And would any reasonable viewer have bought into his bullshit?
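
Since the segment never got around to explaining the mechanics: a top marginal rate applies only to the slice of income above its threshold, not to the whole amount. A toy two-bracket calculation makes the point (the 37 percent lower rate here is a simplifying placeholder, not part of the actual proposal):

```python
def tax_owed(income, threshold=10_000_000, base_rate=0.37, top_rate=0.70):
    """Toy two-bracket schedule: a flat placeholder rate below the threshold,
    and 70% only on the dollars above it. Real schedules have many brackets."""
    below = min(income, threshold)
    above = max(income - threshold, 0)
    return below * base_rate + above * top_rate

# A $12M earner pays 70% only on the last $2M, not on all $12M:
print(tax_owed(12_000_000))  # 10M*0.37 + 2M*0.70 = 5.1M, an effective rate near 42.5%
```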

Avoiding conflicts of interest is basic ethical practice for journalists. Check any news organization or journalism school’s handbook on ethics, and you’ll find the concept is central to maintaining credibility in journalism. “Any personal or professional interests that conflict with [our allegiance to the public], whether in appearance or in reality, risk compromising our credibility,” explains NPR’s Ethics Handbook. “We are vigilant in disclosing to both our supervisors and the public any circumstances where our loyalties may be divided—extending to the interests of spouses and other family members—and when necessary, we recuse ourselves from related coverage.”

Watching for potential conflicts, understanding them, acknowledging and disclosing them, publicly where necessary, are among the core jobs of any journalist with a shred of self-respect. Consumers of journalism, meanwhile, are already accustomed to such disclosures, which often come in the form of “so-and-so company is owned by our parent company.” When spouses or family members are involved, a recusal is usually in order, but it’s not unheard of for a journalist or news anchor to state that one of the subjects in a story is a friend. This is all a matter of simple honesty, though it’s not always adhered to in the strictest terms. Still, the prejudicial effects of a journalist’s net worth never enter into the equation at all.

Searching through various publications’ codes of ethics, from the Washington Post to the New York Times, directly named conflicts of interest tend to fall into categories of familial relation, partisan work, direct financial entanglements, work outside the organization, and the accepting of gifts, travel, or direct payment. Listed nowhere is the matter of salary or wealth. Given a few moments’ thought, it’s staggering to consider all of the effort that went into the New York Times’ eleven-thousand-word “Ethical Journalism” handbook without its writers ever considering, at least on the page, their salaries or inherited wealth as potential conflicts. Then again, the paper that employs Bari Weiss to garner hate-clicks may not be the ideal place to search for structural critiques of capitalism.

by Corey Atad, The Baffler |  Read more:
Image: Zoë van Dijk

Pentagon Budget Could Pay for Medicare for All While Creating Progressive Foreign Policy Americans Want

The Institute for Policy Studies on Thursday shared the results of extensive research into how the $750 billion U.S. military budget could be significantly slashed, freeing up annual funding to cover the cost of Medicare for All—calling into question the notion that the program needs to create any tax burden whatsoever for working families.

Lindsay Koshgarian, director of the National Priorities Project at the Institute for Policy Studies (IPS), took aim in a New York Times op-ed at a "chorus of scolds" from both sides of the aisle who say that raising middle class taxes is the only way to pay for Medicare for All. The pervasive claim was a primary focus of Tuesday night's debate, while Medicare for All proponents Sens. Bernie Sanders (I-Vt.) and Elizabeth Warren (D-Mass.) attempted to focus on the dire need for a universal healthcare program.

At the Democratic presidential primary debate on CNN Tuesday night, Sen. Elizabeth Warren (D-Mass.) was criticized by some opponents for saying that "costs will go down for hardworking, middle-class families" under Medicare for All, without using the word "taxes." Sen. Bernie Sanders (I-Vt.), on the other hand, clearly stated that taxes may go up for some middle class families but pointed out that the increase would be more than offset by the fact that they'll no longer have to pay monthly premiums, deductibles, and other medical costs.

"All these ambitious policies of course will come with a hefty price tag," wrote Koshgarian. "Proposals to fund Medicare for All have focused on raising taxes. But what if we could imagine another way entirely?"

"Over 18 years, the United States has spent $4.9 trillion on wars, with only more intractable violence in the Middle East and beyond to show for it," she added. "That's nearly the $300 billion per year over the current system that is estimated to cover Medicare for All (though estimates vary)."

"While we can't un-spend that $4.9 trillion," Koshgarian continued, "imagine if we could make different choices for the next 20 years."

Koshgarian outlined a multitude of areas in which the U.S. government could shift more than $300 billion per year, currently used for military spending, to pay for a government-run healthcare program. Closing just half of U.S. military bases, for example, would immediately free up $90 billion.

"What are we doing with that base in Aruba, anyway?" Koshgarian asked.

by Julia Conley, Common Dreams |  Read more:
Image: David B. Gleason/Flickr/cc

My Adventures in Psychedelia

It all began with a book review. Last year, I read an article by David Aaronovitch in The Times of London about Michael Pollan’s How to Change Your Mind. The book concerns a resurgence of interest in psychedelic drugs, which were widely banned after Timothy Leary’s antics with LSD, starting in the late 1960s, in which he encouraged American youth to “turn on, tune in, and drop out.” In recent years, though, scientists have started to test therapeutic uses of psychedelics for an extraordinary range of ailments, including depression, addiction, and end-of-life angst.

Aaronovitch mentioned in passing that he had been intrigued enough to book a “psychedelic retreat” in the Netherlands run by the British Psychedelic Society, though, in the event, his wife put her foot down and he canceled. To try psychedelics was something I’d secretly hankered after doing ever since I was a teenager, but I was always too cautious and risk-averse. As I got older, the moment seemed to pass. Today I am a middle-aged journalist working in London, the finance editor of The Economist, a wife, mother, and, to all appearances, a person totally devoid of countercultural tendencies.

And yet… on impulse, I arranged to go. Only after I booked did I tell my husband. He was bemused, but said it was fine by him, as long as I didn’t decide while I was under the influence that I didn’t love him anymore. My eighteen-year-old son thought the whole thing was hilarious (it turns out that your mother tripping is a good way to make drugs seem less cool).
***
One day, after closing that week’s finance and economics section of The Economist, I boarded a Eurostar train to Amsterdam. The next day, I met my fellow travelers—ten of them in all, from various parts of Europe and the United States—in a headshop in Amsterdam. Per the instructions we’d received, we each bought two one-ounce bags of “High Hawaiian” truffles—squishy, light brown fungi in a vacuum pack—at a discounted price of 40 euros, and headed off for four days in a converted barn in the countryside.

I had a foreboding that, besides whatever psychedelic experience I might have, there would also be a lot of chanting and holding strangers’ hands. I’m an atheist and devout skeptic: I don’t believe in chi or acupuncture, and have no time for crystals and chimes. But, mindful that it’s arrogant to remain aloof in such circumstances, I decided I would throw myself into whatever was asked of me.

And so, I not only did yoga and meditation, but also engaged in lengthy periods of shaking my whole body with my eyes closed and “vocal toning”—letting a sound, any sound, escape on every out-breath. I looked into the eyes of someone I had just met and asked, again and again, as instructed: “What does freedom mean to you?” I joined “sharing circles.” All this was intended to prepare us for the trip. The facilitators talked of the importance of your “set” (or state of mind) and of feeling safe and comfortable in your “setting” (where you are and who you’re with).

One of my fellow trippers had taken part in a psilocybin trial at King’s College London. He and three others received at random either a placebo or a low, normal, or high dose of the drug in pill form. It was obvious, he said, that he was the only one given the placebo. To make bad trips less likely, the researchers had advised the participants not to resist anything that happened: “If you see a dragon, go toward it.” The misery of sitting, stone sober, in a room with three people who were evidently having a fascinating time was why he had come on this retreat. “They all had dragons,” he told me. “I wanted a dragon, too.”

People who have taken psychedelics commonly rank the experience as among the most profound of their lives. For my part, I wasn’t searching for myself, or God, or transcendence; nor, with a happy, fulfilling life, was I looking for relief from depression or grief. But I was struck by something Pollan discusses in his book: studies in which therapists used trips to treat addiction.

I’ve never smoked and have no dramatic vices, but the habits of drinking coffee through the morning and a glass of wine or two most evenings had crept up on me in recent years. Neither seemed serious but both had come to feel like necessities—part of a larger pattern of a rushed, undeliberative life with too much done out of compulsion, rather than desire or pleasure. It is the middle-aged rather than the young who could most benefit from an “experience of the numinous,” said Carl Jung, quoted by Pollan.

by Helen Joyce, NYRB |  Read more:
Image: United Archives/Carl Simon/Bridgeman Images

Thursday, October 17, 2019

Bill Kirchen


[ed. Telecaster master and rockabilly legend. See: this clip where he talks about unique guitar modifications and techniques; and this one, demonstrating how he gets a "pop" out of guitar riffs.]

Alexander Kanoldt, Still Life With Guitar (Still Life VI), 1926

Eileen Williams, Whale Watching Alaska

Crash Course

How Boeing's Managerial Revolution Created the 737 MAX Disaster.

Nearly two decades before Boeing’s MCAS system crashed two of the plane-maker’s brand-new 737 MAX jets, Stan Sorscher knew his company’s increasingly toxic mode of operating would create a disaster of some kind. A long and proud “safety culture” was rapidly being replaced, he argued, with “a culture of financial bullshit, a culture of groupthink.”


Sorscher, a physicist who’d worked at Boeing more than two decades and had led negotiations there for the engineers’ union, had become obsessed with management culture. He said he didn’t previously imagine Boeing’s brave new managerial caste creating a problem as dumb and glaringly obvious as MCAS (or the Maneuvering Characteristics Augmentation System, as a handful of software wizards had dubbed it). Mostly he worried about shriveling market share driving sales and head count into the ground, the things that keep post-industrial American labor leaders up at night. On some level, though, he saw it all coming; he even demonstrated how the costs of a grounded plane would dwarf the short-term savings achieved from the latest outsourcing binge in one of his reports that no one read back in 2002.

Sorscher had spent the early aughts campaigning to preserve the company’s estimable engineering legacy. He had mountains of evidence to support his position, mostly acquired via Boeing’s 1997 acquisition of McDonnell Douglas, a dysfunctional firm with a dilapidated aircraft plant in Long Beach and a CEO who liked to use what he called the “Hollywood model” for dealing with engineers: Hire them for a few months when project deadlines are nigh, fire them when you need to make numbers. In 2000, Boeing’s engineers staged a 40-day strike over the McDonnell deal’s fallout; while they won major material concessions from management, they lost the culture war. They also inherited a notoriously dysfunctional product line from the corner-cutting market gurus at McDonnell.


And while Boeing’s engineers toiled to get McDonnell’s lemon planes into the sky, their own hopes of designing a new plane to compete with Airbus, Boeing’s only global market rival, were shriveling. Under the sway of all the naysayers who had called out the folly of the McDonnell deal, the board had adopted a hard-line “never again” posture toward ambitious new planes. Boeing’s leaders began crying “crocodile tears,” Sorscher claimed, about the development costs of 1995’s 777, even though some industry insiders estimate that it became the most profitable plane of all time. The premise behind this complaining was silly, Sorscher contended in PowerPoint presentations and a Harvard Business School-style case study on the topic. A return to the “problem-solving” culture and managerial structure of yore, he explained over and over again to anyone who would listen, was the only sensible way to generate shareholder value. But when he brought that message on the road, he rarely elicited much more than an eye roll. “I’m not buying it,” was a common response. Occasionally, though, someone in the audience was outright mean, like the Wall Street analyst who cut him off mid-sentence:


“Look, I get it. What you’re telling me is that your business is different. That you’re special. Well, listen: Everybody thinks his business is different, because everybody is the same. Nobody. Is. Different.”


And indeed, that would appear to be the real moral of this story: Airplane manufacturing is no different from mortgage lending or insulin distribution or make-believe blood analyzing software—another cash cow for the one percent, bound inexorably for the slaughterhouse. In the now infamous debacle of the Boeing 737 MAX, the company produced a plane outfitted with a half-assed bit of software programmed to override all pilot input and nosedive when a little vane on the side of the fuselage told it the nose was pitching up. The vane was also not terribly reliable, possibly due to assembly line lapses reported by a whistle-blower, and when the plane processed the bad data it received, it promptly dove into the sea.


It is understood, now more than ever, that capitalism does half-assed things like that, especially in concert with computer software and oblivious regulators: AIG famously told investors it was hard for management to contemplate “a scenario within any kind of realm of reason that would see us losing one dollar in any of those transactions” that would, a few months later, lose the firm well over $100 billion—but hey, the risk management algorithms had been wrong. A couple of years later, a single JP Morgan trader lost $6 billion because someone had programmed one of the cells in the bank’s risk management spreadsheet to divide two numbers by their sum instead of their average. Boeing was not, of course, a hedge fund: It was way better, a stock that had more than doubled since the Trump inauguration, outperforming the Dow in the 22 months before Lion Air 610 plunged into the Java Sea.
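
That spreadsheet bug is worth a moment, because it shows how little distance separates a formula from a billion-dollar error. With hypothetical inputs, dividing by the sum of two numbers instead of their average understates the result by exactly half:

```python
new_spread, old_spread = 102.0, 98.0  # hypothetical daily rate spreads

correct = abs(new_spread - old_spread) / ((new_spread + old_spread) / 2)  # divide by average
buggy   = abs(new_spread - old_spread) / (new_spread + old_spread)        # divide by sum

print(correct, buggy, buggy / correct)  # 0.04 0.02 0.5: the bug halves measured volatility
```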


And so there was something unsettlingly familiar when the world first learned of MCAS in November, about two weeks after the system’s unthinkable stupidity drove the two-month-old plane and all 189 people on it to a horrific death. It smacked of the sort of screwup a 23-year-old intern might have made—and indeed, much of the software on the MAX had been engineered by recent grads of Indian software-coding academies making as little as $9 an hour, part of Boeing management’s endless war on the unions that once represented more than half its employees. Down in South Carolina, a nonunion Boeing assembly line that opened in 2011 had for years churned out scores of whistle-blower complaints and wrongful termination lawsuits packed with scenes wherein quality-control documents were regularly forged, employees who enforced standards were sabotaged, and planes were routinely delivered to airlines with loose screws, scratched windows, and random debris everywhere. The MCAS crash was just the latest installment in a broader pattern so thoroughly ingrained in the business news cycle that the muckraking finance blog Naked Capitalism titled its first post about MCAS “Boeing, Crapification and the Lion Air Crash.”


But not everyone viewed the crash with such a jaundiced eye—it was, after all, the world’s first self-hijacking plane. Pilots were particularly stunned, because MCAS had been a big secret, largely kept from Boeing’s own test pilots, mentioned only once in the glossary of the plane’s 1,600-page manual, left entirely out of the 56-minute iPad refresher course that some 737-certified pilots took for MAX certification, and—in a last-minute edit—removed from the November 7 emergency airworthiness directive the Federal Aviation Administration had issued two weeks after the Lion Air crash, ostensibly to “remind” pilots of the protocol for responding to a “runaway stabilizer.” Most pilots first heard about MCAS from their unions, which had in turn gotten wind of the software from a supplementary bulletin Boeing sent airlines to accompany the airworthiness directive. Outraged, they took to message boards, and a few called veteran aerospace reporters like The Seattle Times’ Dominic Gates, The Wall Street Journal’s Andy Pasztor, and Sean Broderick at Aviation Week—who in turn interviewed engineers who seemed equally shocked. Other pilots, like Ethiopian Airlines instructor Bernd Kai von Hoesslin, vented to their own corporate management, pleading for more resources to train people on the scary new planes—just weeks before von Hoesslin’s carrier would suffer its own MAX-engineered mass tragedy.
 (...)

Simulator training for Southwest’s 9,000 pilots would have been a pain, but hardly ruinous; aviation industry analyst Kit Darby said it would cost about $2,000 a head. It was also unlikely: The FAA had three levels of “differences” training that wouldn’t have necessarily required simulators. But the No Sim Edict would haunt the program; it basically required any change significant enough for designers to worry about to be concealed, suppressed, or relegated to a footnote that would then be redacted from the final version of the MAX. And that was a predicament, because for every other airline buying the MAX, the selling point was a major difference from the last generation of 737: unprecedented fuel efficiency in line with the new Airbus A320neo.


The MAX and the Neo derived their fuel efficiency from the same source: massive “LEAP” engines manufactured by CFM, a 50-50 joint venture of GE and the French conglomerate Safran. The engines’ fans were 20 inches—or just over 40 percent—larger in diameter than the original 737 Pratt & Whitneys, and the engines themselves weighed in at approximately 6,120 pounds, about twice the weight of the original engines. The planes were also considerably longer, heavier, and wider of wingspan. What they couldn’t be, without redesigning the landing gear and really jeopardizing the grandfathered FAA certification, was taller, and that was a problem. The engines were too big to tuck into their original spot underneath the wings, so engineers mounted them slightly forward, just in front of the wings.


This alteration created a shift in the plane’s center of gravity pronounced enough that it raised a red flag when the MAX was still just a model plane about the size of an eagle, running tests in a wind tunnel. The model kept botching certain extreme maneuvers, because the plane’s new aerodynamic profile was dragging its tail down and causing its nose to pitch up. So the engineers devised a software fix called MCAS, which pushed the nose down in response to an obscure set of circumstances in conjunction with the “speed trim system,” which Boeing had devised in the 1980s to smooth takeoffs. Once the 737 MAX materialized as a real-life plane about four years later, however, test pilots discovered new realms in which the plane was more stall-prone than its predecessors. So Boeing modified MCAS to turn down the nose of the plane whenever an angle-of-attack (AOA) sensor detected a stall, regardless of the speed. That involved giving the system more power and removing a safeguard, but not, in any formal or genuine way, running its modifications by the FAA, which might have had reservations with two critical traits of the revamped system: Firstly, that there are two AOA sensors on a 737, but only one, fatefully, was programmed to trigger MCAS. The former Boeing engineer Rick Ludtke and an anonymous whistle-blower interviewed by 60 Minutes Australia both have a simple explanation for this: Any program coded to take data from both sensors would have had to account for the possibility the sensors might disagree with each other and devise a contingency for reconciling the mixed signals. Whatever that contingency, it would have involved some kind of cockpit alert, which would in turn have required additional training—probably not level-D training, but no one wanted to risk that. So the system was programmed to turn the nose down at the feedback of a single (and somewhat flimsy) sensor. And, for still unknown and truly mysterious reasons, it was programmed to nosedive again five seconds later, and again five seconds after that, over and over ad literal nauseam.
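
To make those two traits concrete, here is a schematic sketch of the logic as reported (not Boeing's actual code; the stall threshold and disagreement limit are invented placeholders, though the single-sensor trigger and five-second repetition are as described above), alongside the cross-checked version that would have forced a cockpit alert and extra training:

```python
AOA_STALL_THRESHOLD = 15.0  # degrees; illustrative number, not the real limit

def mcas_as_shipped(left_aoa):
    """Schematic of the shipped logic: one sensor, no cross-check,
    and the nose-down command repeats every five seconds while triggered."""
    if left_aoa > AOA_STALL_THRESHOLD:
        return "nose down (repeats every 5 s)"
    return "no action"

def mcas_with_cross_check(left_aoa, right_aoa, max_disagreement=5.0):
    """What two-sensor logic would have required: a contingency for disagreement,
    and therefore a cockpit alert (and the training nobody wanted to risk)."""
    if abs(left_aoa - right_aoa) > max_disagreement:
        return "sensors disagree: alert crew, disable MCAS"
    if min(left_aoa, right_aoa) > AOA_STALL_THRESHOLD:
        return "nose down"
    return "no action"

# A single failed vane reading 22.5 degrees while the other reads 5:
print(mcas_as_shipped(22.5))             # dives on the bad sensor
print(mcas_with_cross_check(22.5, 5.0))  # flags the fault instead
```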


And then, just for good measure, a Boeing technical pilot emailed the FAA and casually asked that the reference to the software be deleted from the pilot manual.


So no more than a handful of people in the world knew MCAS even existed before it became infamous. Here, a generation after Boeing’s initial lurch into financialization, was the entirely predictable outcome of the byzantine process by which investment capital becomes completely abstracted from basic protocols of production and oversight: a flight-correction system that was essentially jerry-built to crash a plane. “If you’re looking for an example of late stage capitalism or whatever you want to call it,” said longtime aerospace consultant Richard Aboulafia, “it’s a pretty good one.”


by Maureen Tkacik, The New Republic |  Read more:
Image: Getty

Diplomacy for Third Graders


via: here (The Guardian) and here (New Yorker).
[ed. See also: Erdoğan Threw Trump's Insane Letter Right in the Trash (Vanity Fair); and The Madman Has No Clothes (TNR).]

Make Physics Real Again

Why have so many physicists shrugged off the paradoxes of quantum mechanics?

No other scientific theory can match the depth, range, and accuracy of quantum mechanics. It sheds light on deep theoretical questions — such as why matter doesn’t collapse — and abounds with practical applications — transistors, lasers, MRI scans. It has been validated by empirical tests with astonishing precision, comparable to predicting the distance between Los Angeles and New York to within the width of a human hair.
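
That hair's-width comparison is easy to check with round numbers (both figures are order-of-magnitude assumptions): it implies a relative precision of a few parts in a hundred billion, roughly the level to which quantum electrodynamics' prediction of the electron's magnetic moment has been confirmed.

```python
distance_m = 3.94e6  # Los Angeles to New York, roughly 3,940 km
hair_m = 1e-4        # a human hair is on the order of 100 micrometres (0.1 mm)

print(f"{hair_m / distance_m:.1e}")  # ~2.5e-11: a few parts in a hundred billion
```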

And no other theory is so weird: Light, electrons, and other fundamental constituents of the world sometimes behave as waves, spread out over space, and other times as particles, each localized to a certain place. These models are incompatible, and which one the world seems to reveal will be determined by what question is asked of it. The uncertainty principle says that trying to measure one property of an object more precisely will make measurements of other properties less precise. And the dominant interpretation of quantum mechanics says that those properties don’t even exist until they’re observed — the observation is what brings them about.
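
The uncertainty principle paraphrased above has a precise standard form; for position and momentum it reads:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

Shrinking the position spread Δx forces the momentum spread Δp to grow, and vice versa; since ħ is of order 10⁻³⁴ joule-seconds, the trade-off only bites at atomic scales.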

“I think I can safely say,” wrote Richard Feynman, one of the subject’s masters, “that nobody understands quantum mechanics.” He went on to add, “Do not keep saying to yourself, if you can possibly avoid it, ‘But how can it be like that?’ because you will get ‘down the drain,’ into a blind alley from which nobody has yet escaped.” Understandably, most working scientists would rather apply their highly successful tools than probe the perplexing question of what those tools mean.

The prevailing answer to that question has been the so-called Copenhagen interpretation, developed in the circle led by Niels Bohr, one of the founders of quantum mechanics. About this orthodoxy N. David Mermin, some intellectual generations removed from Bohr, famously complained, “If I were forced to sum up in one sentence what the Copenhagen interpretation says to me, it would be ‘Shut up and calculate!’” It works. Stop kvetching. Why fix what ain’t broke? Mermin later regretted sounding snotty, but re-emphasized that the question of meaning is important and remains open. The physicist Roderich Tumulka, as quoted in a 2016 interview, is more pugnacious: “Ptolemy’s theory” — of an earth-centered universe — “made perfect sense. It just happened not to be right. But Copenhagen quantum mechanics is incoherent, and thus is not even a reasonable theory to begin with.” This, you will not be surprised to learn, has been disputed.

In What Is Real? the physicist and science writer Adam Becker offers a history of what his subtitle calls “the unfinished quest for the meaning of quantum physics.” Although it is certainly unfinished, it is, as quests go, a few knights short of a Round Table. After the generation of pioneers, foundational work in quantum mechanics became stigmatized as a fringe pursuit, a career killer. So Becker’s well-written book is part science, part sociology (a study of the extrascientific forces that helped solidify the orthodoxy), and part drama (a story of the ideas and often vivid personalities of some dissenters and the shabby treatment they have often received).

The publisher’s blurb breathlessly promises “the untold story of the heretical thinkers who dared to question the nature of our quantum universe” and a “gripping story of this battle of ideas and the courageous scientists who dared to stand up for truth.” But What Is Real? doesn’t live down to that lurid black-and-white logline. It does make a heartfelt and persuasive case that serious problems with the foundations of quantum mechanics have been persistently, even disgracefully, swept under the carpet. (...)

At the end of the nineteenth century, fundamental physics modeled the constituents of the world as particles (discrete lumps of stuff localized in space) and fields (gravity and electromagnetism, continuous and spread throughout space). Particles traveled through the fields, interacting with them and with each other. Light was a wave rippling through the electromagnetic field.

Quantum mechanics arose when certain puzzling phenomena seemed explicable only by supposing that light, firmly established by Maxwell’s theory of electromagnetism as a wave, was acting as if composed of particles. French physicist Louis de Broglie then postulated that all the things believed to be particles could at times behave like waves.

Consider the famous “double-slit” experiment. The experimental apparatus consists of a device that sends electrons, one at a time, toward a barrier with a slit in it and, at some distance behind the barrier, a screen that glows wherever an electron strikes it. The journey of each electron can be usefully thought of in two parts. In the first, the electron either hits the barrier and stops, or it passes through the slit. In the second, if the electron does pass through the slit, it continues on to the screen. The flashes seen on the screen line up with the gun and slit, just as we’d expect from a particle fired like a bullet from the electron gun.

But if we now cut another slit in the barrier, it turns out that its mere existence somehow affects the second part of an electron’s journey. The screen lights up in unexpected places, not always lined up with either of the slits — as if, on reaching one slit, an electron checks whether it had the option of going through the other one and, if so, acquires permission to go anywhere it likes. Well, not quite anywhere: Although we can’t predict where any particular shot will strike the screen, we can statistically predict the overall results of many shots. Their accumulation produces a pattern that looks like the pattern formed by two waves meeting on the surface of a pond. Waves interfere with one another: When two crests or two troughs meet, they reinforce by making a taller crest or deeper trough; when a crest meets a trough, they cancel and leave the surface undisturbed. In the pattern that accumulates on the screen, bright places correspond to reinforcement, dim places to cancellation.
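For two idealized point slits, that pattern has a simple closed form (a standard textbook result, not derived in the essay): with slit separation $d$, wavelength $\lambda$, and barrier-to-screen distance $L$, the intensity at screen position $x$ goes as

$$I(x) \;\propto\; \bigl|1 + e^{i\delta}\bigr|^2 \;=\; 4\cos^2\!\frac{\delta}{2}, \qquad \delta = \frac{2\pi d x}{\lambda L},$$

bright where the two paths arrive in phase and dark where they arrive exactly out of phase.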

We rethink. Perhaps, taking the pattern as a clue, an electron is really like a wave, a ripple in some field. When the electron wave reaches the barrier, part of it passes through one slit, part through the other, and the pattern we see results from their interference.

There’s an obvious problem: Maybe a stream of electrons can act like a wave (as a stream of water molecules makes up a water wave), but our apparatus sends electrons one at a time. The electron-as-wave model thus requires that firing a single electron causes something to pass through both slits. To check that, we place beside each slit a monitor that will signal when it sees something pass. What we find on firing the gun is that one monitor or the other may signal, but never both; a single electron doesn’t go through both slits. Even worse, when the monitors are in place, no interference pattern forms on the screen. This attempt to observe directly how the pattern arose eliminates what we’re trying to explain. We have to rethink again.
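The whole puzzle can be restated as two lines of arithmetic: amplitudes that add before squaring interfere; probabilities that add after squaring do not. Here is a minimal numerical sketch of that contrast (the wavelength, slit spacing, and screen distance are invented for illustration; this is a toy calculation, not the experiment):

```python
import numpy as np

# Toy numbers, invented for illustration (not from the essay):
wavelength = 50e-12   # ~50 pm, a plausible electron de Broglie wavelength
d = 1e-6              # slit separation: 1 micrometre
L = 1.0               # barrier-to-screen distance: 1 metre

x = np.linspace(-100e-6, 100e-6, 9)   # nine sample points on the screen

# Phase difference between the two paths (far-field approximation)
delta = 2 * np.pi * d * x / (wavelength * L)

# No monitors: add the two amplitudes, THEN square -> interference fringes
psi1 = np.ones_like(x, dtype=complex)
psi2 = np.exp(1j * delta)
interference = np.abs(psi1 + psi2) ** 2            # swings between 0 and 4

# Monitors at the slits: add the two probabilities -> the cross term drops out
monitored = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # flat, always 2

for xi, a, b in zip(x * 1e6, interference, monitored):
    print(f"x = {xi:+6.0f} um   both slits open: {a:4.2f}   monitored: {b:4.2f}")
```

With both slits open, the cross term in $|\psi_1 + \psi_2|^2$ makes the value swing between 0 and 4 across the screen; once the monitors settle which slit the electron took, the cross term vanishes and every position gets the same flat value, just as the fringes vanish in the lab.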

At which point Copenhagen says: Stop! This is puzzling enough without creating unnecessary difficulties. All we actually observe is where an electron strikes the screen — or, if the monitors have been installed, which slit it passes through. If we insist on a theory that accounts for the electron’s journey — the purely hypothetical track of locations it passes through on the way to where it’s actually seen — that theory will be forced to account for where it is when we’re not looking. Pascual Jordan, an important member of Bohr’s circle, cut the Gordian knot: An electron does not have a position until it is observed; the observation is what compels it to assume one. Quantum mechanics makes statistical predictions about where it is more or less likely to be observed.

That move eliminates some awkward questions but sounds uncomfortably like an old joke: The patient lifts his arm and says, “Doc, it hurts when I do this.” The doctor responds, “So don’t do that.” But Jordan’s assertion was not gratuitous. The best available theory did not make it possible to refer to the current location of an unobserved electron, yet that did not prevent it from explaining experimental data or making accurate and testable predictions. Further, there seemed to be no obvious way to incorporate such references, and it was widely believed that it would be impossible to do so (about which more later). It seemed natural, if not quite logically obligatory, to take the leap of asserting that there is no such thing as the location of an electron that is not being observed. For many, this hardened into dogma — that quantum mechanics was a complete and final theory, and attempts to incorporate allegedly missing information were dangerously wrongheaded.

But what is an observation, and what gives it such magical power that it can force a particle to have a location? Is there something special about an observation that distinguishes it from any other physical interaction? Does an observation require an observer? (If so, what was the universe doing before we showed up to observe it?) This constellation of puzzles has come to be called “the measurement problem.”

Bohr postulated a distinction between the quantum world and the world of everyday objects. A “classical” object is an object of everyday experience. It has, for example, a definite position and momentum, whether observed or not. A “quantum” object, such as an electron, has a different status; it’s an abstraction. Some properties, such as electrical charge, belong to the electron abstraction intrinsically, but others can be said to exist only when they are measured or observed. An observation is an event that occurs when the two worlds interact: A quantum-mechanical measurement takes place at the boundary, when a (very small) quantum object interacts with a (much larger) classical object such as a measuring device in a lab.

Experiments have steadily pushed the boundary outward: the double-slit experiment has been demonstrated not only with photons and electrons but also with atoms and even with large molecules of hundreds of atoms, millions of times more massive than electrons. Why shouldn’t the same laws of physics apply even to large, classical objects?

Enter Schrödinger’s cat...

by David Guaspari, The New Atlantis | Read more:
Image: Shutterstock