Saturday, August 17, 2019

The American Aristotle

[I intend] to make a philosophy like that of Aristotle, that is to say, to outline a theory so comprehensive that, for a long time to come, the entire work of human reason, in philosophy of every school and kind, in mathematics, in psychology, in physical science, in history, in sociology and in whatever other department there may be, shall appear as the filling up of its details.
C S Peirce, Collected Papers (1931-58)
The roll of scientists born in the 19th century is as impressive as that of any century in history. Names such as Albert Einstein, Nikola Tesla, George Washington Carver, Alfred North Whitehead, Louis Agassiz, Benjamin Peirce, Leo Szilard, Edwin Hubble, Katharine Blodgett, Thomas Edison, Gerty Cori, Maria Mitchell, Annie Jump Cannon and Norbert Wiener created a legacy of knowledge and scientific method that fuels our modern lives. Which of these, though, was ‘the best’?

Remarkably, in the brilliant light of these names, there was in fact a scientist who surpassed all others in sheer intellectual virtuosity. Charles Sanders Peirce (1839-1914), pronounced ‘purse’, was a solitary eccentric working in the town of Milford, Pennsylvania, isolated from any intellectual centre. Although many of his contemporaries shared the view that Peirce was a genius of historic proportions, he is little-known today. His current obscurity belies the prediction of the German mathematician Ernst Schröder, who said that Peirce’s ‘fame [will] shine like that of Leibniz or Aristotle into all the thousands of years to come’.

Some might doubt this lofty view of Peirce. Others might admire him for this or that contribution yet, overall, hold an opinion of his oeuvre similar to that expressed by the psychologist William James on one of his lectures, that it was like ‘flashes of brilliant light relieved against Cimmerian darkness’. Peirce might have good things to say, so this reasoning goes, but they are too abstruse for the nonspecialist to understand. I think that a great deal of Peirce’s reputation for obscurity is due, not to Peirce per se, but to the poor organisation and editing of his papers during their early storage at and control by Harvard University (for more on this, see André de Tienne’s insightful history of those papers).

Such skepticism, however incorrect, becomes self-reinforcing. Because relatively few people have heard of Peirce, at least compared with the names above, and because he has therefore had a negligible influence on popular culture, some assume that he merits nothing more than minor fame. But there are excellent reasons why it is worth getting to know more about him. Max Fisch, the foremost of Peirce scholars, described Peirce’s intellectual significance in this fecund paragraph from 1981:
Who is the most original and the most versatile intellect that the Americas have so far produced? The answer ‘Charles S Peirce’ is uncontested, because any second would be so far behind as not to be worth nominating. Mathematician, astronomer, chemist, geodesist, surveyor, cartographer, metrologist, spectroscopist, engineer, inventor; psychologist, philologist, lexicographer, historian of science, mathematical economist, lifelong student of medicine; book reviewer, dramatist, actor, short-story writer; phenomenologist, semiotician, logician, rhetorician [and] metaphysician … He was, for a few examples, … the first metrologist to use a wave-length of light as a unit of measure, the inventor of the quincuncial projection of the sphere, the first known conceiver of the design and theory of an electric switching-circuit computer, and the founder of ‘the economy of research’. He is the only system-building philosopher in the Americas who has been both competent and productive in logic, in mathematics, and in a wide range of sciences. If he has had any equals in that respect in the entire history of philosophy, they do not number more than two.
Peirce came from a well-to-do, prominent family of senators, businessmen and mathematicians. His father, Benjamin Peirce, was considered the greatest US mathematician of his generation, teaching mathematics and astronomy at Harvard for some 50 years. Charles’s brother, James, also taught mathematics at Harvard, eventually becoming a dean there. C S Peirce was, on the other hand, despised by the presidents of Harvard (Charles Eliot; where Peirce studied) and Johns Hopkins University (Daniel Gilman; where Peirce initially taught). Eliot and Gilman, among others, actively opposed Peirce’s employment at any US institution of higher education and thus kept him in penury for the latter years of his life. They falsely accused him of immorality and underestimated his brilliance due to input from jealous rivals, such as Simon Newcomb.

Though the story of Peirce’s life and thinking processes is inspiring and informative, this story is not told here. (I recommend Joseph Brent’s 1998 biography of Peirce as an excellent beginning. My own planned intellectual biography of Peirce intends to trace his life from his Pers family roots in Belgium in the 17th century to the history of the influence of his work on modern philosophy and science.) The objective here is rather to highlight some portions of Peirce’s thought to explain why his theories are so important and relevant to contemporary thinking across a wide range of topics.

The importance and range of Peirce’s contributions to science, mathematics and philosophy can be appreciated partially by recognising that many of the most important advances in philosophy and science over the past 150 years originated with Peirce: the development of mathematical logic (before, and arguably eventually better than, Gottlob Frege); the development of semiotics (before and arguably better than Ferdinand de Saussure); the philosophical school of pragmatism (before and arguably better than William James); the modern development of phenomenology (independently of and arguably superior to Edmund Husserl); and the invention of universal grammar with the property of recursion (before and arguably better than Noam Chomsky; though, for Peirce, universal grammar – a term he first used in 1865 – was the set of constraints on signs, with syntax playing a lesser role).

Beyond these philosophical contributions, Peirce also made fundamental discoveries in science and mathematics. A few of these are: the shape of the Milky Way galaxy; the first precise measurement of the Earth’s gravity and circumference; one of the most accurate and versatile projections of the 3D globe of the Earth onto 2D space; the chemistry of relations and working out the consequences of the discovery of the electron for the periodic table; the axiomatisation of the law of the excluded middle, or Peirce’s Law: ((P→Q)→P)→P; existential graphs and the transformation of mathematics into an (quasi-)empirical component of studies on cognition; one of the first studies of stellar spectra, particularly the spectral properties of argon; the invention of the then most accurate gravimetric pendulum; the first standardisation of the length of the metre by anchoring it to the length of a wavelength of light (which he figured out via his own experiments in multiple stations around Europe and North America). This is by no means an exhaustive list.
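[ed. A quick aside, mine rather than Everett's: with only two propositional variables, Peirce's Law can be machine-checked as a tautology in a few lines. A minimal sketch in Python:]

```python
# Brute-force truth-table check that Peirce's Law, ((P -> Q) -> P) -> P,
# holds under every assignment of truth values to P and Q.
from itertools import product

def implies(a: bool, b: bool) -> bool:
    # Material implication: false only when the antecedent is true
    # and the consequent is false.
    return (not a) or b

def peirces_law(p: bool, q: bool) -> bool:
    return implies(implies(implies(p, q), p), p)

assert all(peirces_law(p, q) for p, q in product([True, False], repeat=2))
print("Peirce's Law is a classical tautology.")
```

[ed. Notably, the law fails in intuitionistic logic, and adding it as an axiom there recovers full classical logic, which is why it stands in for the law of the excluded middle as an emblem of classical reasoning.]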

by Daniel Everett, Aeon | Read more:
Image: Harvard University Archives

Friday, August 16, 2019

Keanu Reeves, Explained


Keanu Reeves, explained (Vox)
[ed. A veritable link-fest. My nephew Tony (the model) reminds me of Keanu a lot and has the same demeanor and kindness of character (longer hair).]

Magazine


Damir Kurta, Fishing
via:

Purple Mountains


David Berman, one of the most incisive and raw voices in indie rock, who brought the acuity of poetry to his songs, took his own life earlier this month. He was 52.

Berman left a flock of devoted listeners. The albums he released under the Silver Jews moniker were wry, scarred, hyperliterate and deeply comedic. This year, he put out his first album under a new recording alias, Purple Mountains, and at the time of his death, was just a few days away from going out on the road.


Arthur Elgort, Wendy Whitelaw, Park Avenue, 1981
via:

Our Caesar: Can the Country Come Back? The Republic Already Looks Like Rome In Ruins.

[ed. Prefaced by a lengthy and interesting history of the Roman empire - how it flourished and how it died.] 

Class conflict — which, in America, has merged with a profound cultural clash — has split the country into two core interests: the largely white lower and middle classes in the middle of the country, roughly equivalent to Rome’s populares and susceptible to populist appeals by powerful men and women; and the multicultural coastal elites, whose wealth has soared as it has stagnated for the rest, and who pride themselves on their openness and meritocracy: the optimates. And just as in late-republican Rome, each side has begun not to complement but to delegitimize the other.

The result, as in Rome, is a form of deepening deadlock, a political conflict in which many on both sides profoundly fear their opponents’ power, and in which compromise through the existing republican institutions, particularly Congress, has become close to impossible. Think of Pompey’s and Caesar’s armies not as actual soldiers but as today’s political-party members and activists, mobilized for nonviolent electoral battle and dissatisfied, especially on the right, with anything less than total victory. The battles in this Cold Civil War take place all the time on the front lines of the two forces: in states where fights over gerrymandering and vote suppression are waged; in swing states in presidential elections; in the courts, where the notion of impartial justice has been recast in the public mind as partisan-bloc voting; in Congress, where regular order is a distant memory, disputes go constantly to the brink, the government is regularly shut down, the entire country’s credit is threatened, and long-established rules designed for republican compromise, like the filibuster, are being junked as fast as any Roman mos maiorum.

And the American system has a vulnerability Rome didn’t. We have always had a one-man executive branch, a head of state, with exclusive and total command of the armed forces. There is no need for an office like Rome’s dictator for when a systemic crisis hits, because we have an existing commander-in-chief vested with emergency powers who can, at any time, invoke them. The two consuls in Rome shared rule and could veto each other; what defines the American presidency is its individual, unitary nature. Over the past century, moreover, as America’s global clout has grown exponentially, and as the challenges of governing a vast and complicated country have spawned a massive administrative state under the president’s ultimate control, what was once designed as an office merely to enforce the laws made by the Congress has changed beyond recognition. (...)

When you think of how the Founders conceived the presidency, the 21st-century version is close to unrecognizable. Their phobia about monarchy placed the presidency beneath the Congress in the pecking order, stripping him of pomp and majesty. No newspaper bothered even to post a reporter at the White House until the 20th century. The “bully pulpit” was anathema, and public speeches vanishingly rare. As George F. Will points out in his new book on conservatism, the president of the United States did not even have an office until 1902, working from his living room until Teddy Roosevelt built the West Wing.

Some presidents rose above this level of modesty. Lincoln temporarily assumed far greater powers in the Civil War, of course, but it was Teddy Roosevelt who added celebrity and imperial aspirations to the office, Woodrow Wilson who began to construct an administrative state through which the executive branch could govern independently, FDR who, as president for what turned out to be life, revolutionized and metastasized the American government and bequeathed a Cold War presidency atop a military-industrial complex that now deploys troops in some 164 foreign countries.

Kennedy — and the Camelot myth that surrounded him — dazzled the elites and the public; Reagan ushered in a movie-star model for a commander-in-chief — telegenic, charismatic, and, in time, something of a cult figure; and then the 9/11 attacks created an atmosphere similar to that of Rome’s temporary, emergency dictatorships, except vast powers of war-making, surveillance, rendition, and even torture were simply transferred to an office for non-emergency times as well, as theorists of the unitary executive — relatively unbound by Congress or the rule of law — formed a tight circle around a wartime boss. And there was no six-month time limit; almost none of these powers has since been revoked.

Some hoped that Barack Obama would wind this presidency-on-steroids down. He didn’t. His presidency began with a flurry of executive orders. He launched a calamitous war on Libya with no congressional authorization; he refused to prosecute those who were involved with Bush’s torture program, who continued to rise through the ranks on his watch; he pushed his executive powers to fix a health-care law that constitutionally only Congress had the right to; and, in his second term, he ignored Congress’s legally mandated deportation of 800,000 Dreamers by refusing to enforce it. He had once ruled such a move out — “I’m president, I’m not king” — and then reinvented the move as a mere shift in priorities. To advance his environmental agenda, he used the EPA to drastically intensify regulations, bypassing Congress altogether. To push his cultural agenda, his Justice Department refused to defend the existing marriage laws and abruptly interpreted Title IX to cover transgender high-school kids without any public debate.

No Democrats regarded these moves as particularly offensive — although partisan Republicans were eager to broadcast their largely phony constitutional objections as soon as the president was not a Republican. And Congress had long since acquiesced to presidentialism anyway, wriggling out of any serious input on the war on terror, dodging the difficult task of amending the health-care law, bobbing and weaving on the environment. And although the worship of Trump is on a whole different level of fanaticism, if you didn’t see some cultish elements in the Obama movement, you weren’t looking very hard. Like Roman commanders slowly acquiring the trappings of gods, presidents have long since slipped the bounds of republican austerity into a world of elected monarchs, flying the world in a massive, airborne chariot, constantly photographed, and now commanding our attention every single day through Twitter.

But Obama was Obama, and Trump is Trump, obliterating most of what mos maiorum remained after his predecessor. Like Pompey, who bypassed all the usual qualifications for the highest office of consul, Trump stormed into party politics by mocking the very idea of political qualifications, violating norms with abandon. He had never been elected to office before; he was a businessman and a brand, not a public servant of any kind; he had no serious grip on the Constitution, liberal-democratic debate, the separation of powers, or limited government. His tangible proposals were slogans. He referred to his peers with crude nicknames, and his instincts were those of a mob boss. But he offered himself, rather like the populares in Rome, as a riposte and antidote to the political and cultural elite, the optimates. A brilliant if dangerous demagogue, he became the first presidential candidate to run not as the leader of a political party, or as a disciple of conservatism or liberalism, but as a fully fledged strongman who promised unilaterally to “make America great again.” It is hard to equate any kind of republican government with a leader who insists, of any American problem, “I, alone, can fix it.”

No one in the American system at this level has ever behaved like this before, crudely trampling on republican practices, scoffing at the rule of law, targeting individual citizens for calumny, openly demonizing his opponents, calling a free press treasonous, deploying deceit impulsively, skirting the boundaries of mental illness, bragging of sexual assault, delegitimizing his own government when it showed even a flicker of independence — and yet he almost instantly commanded the near-total loyalty of an entire political party, and of 40 percent of the country, and this loyalty has barely wavered.

If republicanism at its core is a suspicion of one-man rule, and that suspicion is the central animating principle of the American experiment in self-government, Trump has effectively suspended it for the past three years and normalized strongman politics in America. Nothing and no one in his administration matters except him, as he constantly reminds us. His Cabinet appeared to rein him in for a while, until most experienced adults left it as his demands for total subservience became more insistent. Vast tracts of the bureaucracy are simply ignored, the State Department all but shut down, foreign policy made by impulse, whim, nepotism, for financial gain, or from strange personal rapport with thugs like Kim Jong-un and Vladimir Putin, rather than by any kind of collective deliberation or policy process. Pliant nobodies fill administrative roles where real expertise matters and pushback against the president could have been effective in the past. Congress has very occasionally objected, but it has either been vetoed, as in the recent attempt to curtail U.S. support for Saudi Arabia’s wars, or, if it succeeded in passing legislation with a veto-proof majority, as in Russian sanctions, been slow-walked by the White House.

Writing honestly about this — and the extraordinary upping of the authoritarian ante this presidency has entailed — comes across at times like a dystopian portrait of a nightmare future, except it is very much the present and greeted either with enthusiastic support from the GOP or growing numbness and acceptance by the broader public. The old-school relative reticence of the republican concept of a president had already been transformed, of course, but Trump ramped up the volume to 11: a propaganda channel broadcasting round the clock, with memes almost instantly retweeted by the president, endless provocations to own the news cycle, and mass rallies to sustain his populist appeal. If the definition of a free society is that you don’t have to think about who governs you every minute of the day, then we no longer live in a free society. The press? Vilified, lied to, ignored, mocked, threatened.

When Trump has collided with the rule of law, moreover, he has had a remarkable string of victories. After a period in which he was amazed that his attorney general would follow legal ethics rather than the boss’s instructions, he has now finally appointed one to protect him personally, pursue his political opponents, and defend an extreme theory of presidential Article 2 power. Checked for the first time this year by a Democratic House, he has responded the way a monarch would — by simply refusing, in an unprecedentedly total fashion, to cooperate with any congressional investigation of anything in his administration. Far from being transparent to prove his lack of corruption, he has actively sued anyone seeking any information on his finances. He has declared a phony emergency to justify seizing and using congressional funds for a purpose specifically opposed by the Congress, building a wall on the southern border, and gotten away with it. He has taken his authority to negotiate tariffs in a national-security emergency and turned it into a routine part of presidential conduct to wage a general trade war. And he has enabled an army of grifters and opportunists to line their pockets or accumulate perks at public expense — as long as they never utter a word of criticism.

He has also definitively shown that a president can accept support from a foreign power to get elected, attempt to shut down any inquiry into his crimes, obstruct justice, suborn perjury from an aide, get caught … and get away with all of it. Asking for his tax returns or a radical distancing from his business interests strikes him as an act of lèse-majesté. He refers to “my military” and “my generals,” and claims they all support him, as if he were Pompey or Caesar. He muses constantly about extending his term of office indefinitely, just as those Roman populists did.

Does he mean it? It almost doesn’t matter. He’s testing those guardrails to see just how numb a public can become to grotesque violations of ethical or rhetorical norms, and he has found them exhilaratingly wanting. And he has an unerring instinct for where the weaknesses of our republican system lie. He has abused the limitless pardon powers of the president that were created for rare occasions of clemency, a concept that to Trump has literally no meaning. He has done so to reward political friends, enthuse his base, and, much more gravely, to corrupt the course of justice in the Mueller investigation. Even the concept of a “self-pardon” has been added to the existing interdiction on prosecuting a sitting president.

He has also abused various laws allowing him to declare national emergencies in order to get his way even when no such emergencies exist.

Congress has passed several of these laws, assuming naïvely that in our system, a president can be relied on not to invent emergencies to seize otherwise unconstitutional powers — like executive control of legislative spending. This, of course, is not a minor matter; it’s an assault on the core principle of separation of powers that makes a republican government possible. But when the Supreme Court recently lifted a stay on the funds on a legal technicality, where was the outcry? The ruling registered as barely a blip.

The whims of one man now determine much of what happens in what we think of as a republic, where power should, in principle, be widely disseminated. And you don’t just see this in what has objectively happened. You can feel the difference in the culture. Every morning, Washington wakes up and needs to ask only one question to figure out what’s going on, as they did in the royal courts of old: What is the president’s mood today? If that isn’t a sign of a fast-eroding republic, what is? (...)

A republican president respects how the system works, treats power as if it is always temporarily held, interacts with other agents with civility, however strained, and feels responsible, for a while, for keeping the system alive. Trump simply has no understanding of any of this. His very psyche — his staggering vanity, narcissism, and selfishness — is far more compatible with monarchical government than a republican one. He takes no responsibility for failures on his watch and every single credit for anything successful, whatever its provenance. The idea that he would put the system’s interest above his own makes no sense to him. It is only ever about him.

And the public has so internalized this fact it can sometimes seem like a natural feature of the political landscape, not the insidiously horrifying turn in American political history and culture that it is. If there is a conflict between his and others’ interests, his must always win decisively. If he doesn’t win, he has to lie to insist that he did. Everything in strongman rule is related to the strongman, as we are all sucked into the vortex of his malignant, clinical narcissism. You never escape him, every news cycle is dominated by him, every conversation is befouled, if you are not careful. And this saps the republican spirit. It engenders a kind of passivity. It makes submission feel like a kind of relief.

by Andrew Sullivan, NY Magazine |  Read more:
Image: Joe Darrow
[ed. We need another Trump story like we need a hole in the head, but this one is particularly interesting (detailed Roman history notwithstanding). The important point being how dysfunction in government and society is becoming normalized.]

Thursday, August 15, 2019

Foreign-Policy Crisis: Hong Kong in the Balance

For two and a half years, the world has wondered how President Donald Trump would cope with a real international crisis. That crisis may have finally arrived in Hong Kong, as Beijing appears poised to execute a massive, violent crackdown against protesters. And how it’s resolved will matter not just for Trump’s political fortunes—it will determine whether the United States and China can find a basis for managing competition with each other, or whether they will be locked in a new and volatile Cold War.

Unrest in Hong Kong would pose a particularly difficult challenge for any American president, who would have to balance support for democracy, human rights, and peaceful protest against the need to avoid interfering in China’s domestic affairs.

The shadow of Budapest in 1956 looms large. Hungarians believed, with good reason, that the United States would support them if they rose up against the Soviet Union. When they did so, President Dwight Eisenhower refused to intervene, believing it could lead to a general war. This tragic episode was a warning to future presidents not to overpromise. That lesson was learned again when President George H. W. Bush encouraged the Kurds to rise up against Saddam Hussein in 1991, only to abandon them.

During Hong Kong’s 2014 umbrella protests, which were not as far-reaching as those taking place now, President Barack Obama struck a cautious note, expressing America’s inherent sympathy for freedom of speech and association and saying his government’s primary message was the avoidance of violence. Republicans, including Senator Marco Rubio, criticized Obama for not being more supportive of the protesters. The White House worried that any support would lend credibility to Beijing’s claim that the protests were orchestrated by the United States—and was careful not to overpromise.

Presidents are constrained in what they can say. We should cut Donald Trump some slack. But even taking those constraints into account, Trump’s response could hardly have been worse. Not only was Trump silent on America’s core values; he also increased the risk of a major miscalculation by China with seismic geopolitical consequences. It may prove to be the greatest mistake of his presidency.

Trump’s folly began with a phone call to China’s president, Xi Jinping, on June 18. According to the Financial Times and Politico, Trump told the Chinese leader that he would not condemn a crackdown in Hong Kong. The commitment was made on the fly, without prior consultation with his national-security team. On August 1, Trump made good on that secret promise when he told the press:

“Something is probably happening with Hong Kong, because when you look at, you know, what’s going on, they’ve had riots for a long period of time. And I don’t know what China’s attitude is. Somebody said that at some point they’re going to want to stop that. But that’s between Hong Kong and that’s between China, because Hong Kong is a part of China. They’ll have to deal with that themselves. They don’t need advice.”

On August 13, Trump called it “a very tough situation,” but added that he hoped “it works out for everybody, including China.” Later he tweeted, “Many are blaming me, and the United States, for the problems going on in Hong Kong. I can’t imagine why?” His secretary of commerce, Wilbur Ross, told CNBC, “What would we do, invade Hong Kong? … It’s a question of what role is there for the U.S. in that manner? This is an internal matter.” Politico reports have quoted administration officials as saying that Trump is singularly focused on a trade deal and does not want human rights to get in the way. After a torrent of criticism, and a deluge of statements from congressional leaders, he tweeted last night, vaguely asking Xi to deal with Hong Kong “humanely” and hinting at a meeting. He could have said no violence, but he chose not to. This morning, he again praised Xi and suggested he meet with protesters. It was marginally better than the unconditional green light he had offered previously, but it is still far short of what is required. (...)

The Hong Kong crisis comes at a particularly sensitive moment in U.S.-China relations. Competition between the two global powers may be inevitable, but its scope and intensity depends on the decisions both countries make. In Hong Kong, Xi faces a crucial choice—a 21st-century version of the Tiananmen Square crackdown would make a new Cold War all but inevitable.

A violent crackdown would make it much more difficult to calibrate competition with China. China will have revealed itself to be a totalitarian dictatorship guilty of the excesses associated with such regimes. Cooperation will become difficult, if not impossible, even on matters of mutual interest. Having crossed the Rubicon and incurred the costs, Xi may be even more willing to flex China’s muscles in the South China Sea and East China Sea, increasing tensions with its neighbors and the United States. If China handles Hong Kong in a heavy-handed way, that would also have repercussions for Taiwan, which would see its suspicions of the mainland confirmed.

A violent crackdown would also accelerate economic decoupling, with Western investors fleeing Hong Kong as it becomes just another Chinese city. More than 1,300 U.S. firms have a presence in Hong Kong, including nearly every major U.S. financial firm. There are 85,000 U.S. citizens in Hong Kong. They would likely leave. A violent crackdown would almost surely lead to the imposition of sanctions by the U.S. Congress, if necessary with a supermajority to overcome a presidential veto. The decoupling would not be confined to Hong Kong. The tariffs and restrictions imposed to generate leverage in trade negotiations would become permanent.

by Thomas Wright, The Atlantic |  Read more:
Image: Bobby Yip, Reuters 

What is the 'Salmon Cannon'?

Earlier this week, a video shot through the Twitter feed fray with the velocity of a fish hurtling through a pneumatic tube.

The short video (set to house music, strangely) is a compilation of clips showing variations of the fish-shooting technology that Washington-based company Whooshh first developed five years ago. Not only has the video given the internet an ideal subject of absurdist fascination to dethrone last week’s 30-50 feral hogs, it’s also raised a lot of questions, like, “Wait, what?”, and “How does the fish feel about this?” and, “Can they potentially do this with humans?” (I can’t be the only person who was wondering this.)


For answers, I got on the phone with Vince Bryan III, CEO of Whooshh Innovations and inventor of the Salmon Canon (purposely spelled with only one “n” to distinguish the eco-friendly invention from a murderous weapon). His company’s name is derived from the sound fish make as they fly over the high dams that otherwise may block their upstream migratory routes, preventing them from spawning (declining salmon stocks are an issue of critical importance in the Pacific north-west).

So, how does the Whooshh Passage Portal work?

The Whooshh Passage Portal is a system that you put into a river that automates the entire process of getting a fish over a dam. In those early videos five years ago you would see people hand-feeding the fish in; today the fish swim into the system on their own. Inside the tubes is a kind of an airlock where we make a small pressure differential to create a force so the fish moves through the tube. And that tube is irrigated, it’s misted on the inside, so the fish is able to breathe, and it’s a frictionless environment.

From the fish’s perspective it’s a completely smooth ride and it actually feels to them like they’re in the water. And that’s why when they come out the exit they just swim away. They swim in, they slide, they glide, and they swim off. There’s no shock to their system. (...)
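[ed. A back-of-envelope sketch of the physics Bryan describes. All numbers here are my own assumptions, not Whooshh specifications; the point is that a fish spanning the tube feels a force of roughly the pressure differential times its cross-sectional area, which is why only a small differential is needed.]

```python
# Illustrative only: force on a salmon from a small pressure differential.
import math

delta_p_pa = 2_000.0    # assumed differential, about 2% of atmospheric pressure
tube_diameter_m = 0.20  # assumed tube bore for an adult salmon

area_m2 = math.pi * (tube_diameter_m / 2) ** 2
force_n = delta_p_pa * area_m2  # F = dP * A for a body sealing the tube
print(f"area {area_m2:.4f} m^2 -> force {force_n:.0f} N")  # ~63 N on a ~5 kg fish
```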

How many fish have gone through the cannon?

I don’t have an exact number, but many, many millions. We’ve been operating a version of the system in Norway for three or four years now, and transporting between 5,000 and 10,000 fish a day. They’ve been using the traditional Salmon Canon in Washington for five years now, and those are hand-fed systems. But they do about 15,000 fish a year.

Is your goal to have your technology integrated into every dam if possible, and what would the costs of implementation be?

We would like to see the Whooshh system everywhere on every dam. In the United States, for example, there are 85,000 dams. And if you did just 11 systems a day, you’d have fish passage on every one of those dams in 20 years.
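[ed. The arithmetic roughly checks out:]

```python
# 85,000 US dams at 11 installations per day, per Bryan's estimate.
dams = 85_000
installs_per_year = 11 * 365
print(f"{dams / installs_per_year:.1f} years")  # ~21.2, close to the quoted 20
```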

Do you have any plans to make a human-sized cannon?

Only to the extent that we’ll move sturgeon at some point. Large, 200-plus pound sturgeon, and that will require us to make a larger tube. And at that point we’ve got a long list of volunteers who have said that they would like to be the first, and if somebody wants to do it, they’re welcome to try.

by Adrienne Matei, The Guardian |  Read more:
Image: YouTube via USDE. Alternative video in the story.

The Truth About Wanting to Die

I grew to love the hospital’s intercom announcements. Code Blue for cardiac arrest; Code White for a violent patient; Code Yellow for a missing person, an elopee, as they’re called, a runaway for whom I’d silently cheer. Go, sixty-eight-year-old Caucasian man with short brown hair last seen wearing hospital pants and a brown wool cardigan and no shoes! Run! (...)

I was sure if I just acted normal enough they would let me go. I tried to be courteous, lucid and calm but not suspiciously upbeat. I didn’t weep or scream at my own frustration or impotence or exhaustion or insomnia or self-loathing. I met, as required, multiple times a day with nurses and social workers. And my I’m-totally-fine, suicide-was-a-one-time-aberration ploy almost worked: I was almost set free by the first psychiatrist I saw within a week of my admission without so much as a follow-up appointment.

This isn’t, incidentally, best practice: it’s a great way to ensure people fall through the cracks and (if they’re lucky) wind up back in hospital in worse shape than before.

Unbeknownst to me, as I paced by my bed and prepared for life outside the windowless ward, my parents had pushed for a second opinion, my dad writing desperate, pleading emails at three o’clock in the morning. The second psychiatrist was smart and sardonic and treated me like someone capable of communicating in multisyllabic sentences. He also had a far better bullshit detector. He did not buy my argument that this whole suicide thing was an anomalous one-off, a mental misunderstanding, never to recur. He decided I had major depression. And that I was fucked up enough to merit more time locked up lest I try to off myself again.

Eight hundred thousand people around the world kill themselves every year, which means about 2,200 a day, or three every two minutes. Statistically, two dozen people killed themselves in the time it took you to get out of bed, showered, and caffeinated. Maybe forty-five during your commute to work, another ninety in the time you spent making dinner. Unless you, like me, take an eternity to do any of those things, if they happen at all. In which case, think of it this way: every time you mull killing yourself and manage to talk yourself down because you have more to do and more to ask of life, a handful of people have lost that internal, wrenching wrestling match and ended it.
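[ed. The rates in that paragraph all follow from the 800,000-a-year figure; a quick check:]

```python
# Recomputing the article's arithmetic from its 800,000-per-year figure.
per_year = 800_000
per_day = per_year / 365                   # ~2,192: "about 2,200 a day"
per_two_minutes = per_day / (24 * 60 / 2)  # 720 two-minute windows per day: ~3.0
print(f"{per_day:.0f} per day, {per_two_minutes:.1f} every two minutes")
```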

In Canada, where eleven people kill themselves daily, you’re almost ten times more likely to kill yourself than you are to be killed by someone else. About 120 Americans kill themselves every day: Americans are more than twice as likely to die by their own hands as someone else’s. Victims of America’s gun epidemic are almost twice as likely to have shot themselves to death as to have been shot to death by someone else. If you die young, suicide’s much more likely to be the cause: in 2016 it was the second-leading cause of death for Americans between ten and thirty-four years old. Many, many more people try to kill themselves than actually do it—about half a million Americans are brought to emergency rooms every year after having tried to end their lives.

The reality is likely even worse: evidence indicates we’re undercounting suicides by a significant amount—by as much as a third, depending on how you guesstimate. For one thing, despite the supposed decrease in shame in having a family member kill themselves, our persistent societal freak-out regarding suicide can make both relatives and authorities hesitant to classify deaths as such. There’s a very high burden of proof required for coroners and medical examiners to classify a death as a suicide. There’s rarely incontrovertible evidence: most people don’t leave suicide notes, and not everyone talks about killing themselves before killing themselves. Even if they had at some point in the past, how do you know this specific incident was a suicide? If someone is depressed, even suicidal, but also misuses drugs, how do you know for sure whether an overdose is purposeful? How do you know for sure whether a single-vehicle crash was careless driving or driven by a need for death? How can you be certain whether someone slipped or jumped?

You’re more likely to find suicides when you look for them. And, much of the time, we don’t. Grieving families would frequently prefer not to touch the issue. “The underreporting of suicide is a recognized concern in Canada and internationally,” reads a 2016 study based on data from the Public Health Agency of Canada. Suicide deaths are also examined a lot less closely, on average: a 2010 report found that about 55 percent of US suicide deaths get autopsied, compared to 92 percent of homicides. (...)

This has been a known issue for a while. The consequences of underreporting extend beyond public-health nerds who get off on accuracy. Undercounting suggests something is less of a problem than it is and therefore less deserving of our attention and our dollars. Which is convenient, given how icky suicide makes us feel in the first place. Finding fewer suicides can make it seem like suicide is less of an issue. “If you think about it, society hasn’t been that invested in suicide prevention,” Rockett points out. “If you more accurately portray the self-injury deaths and say, ‘This is mental health,’ there’s potential for rather more resources to be directed toward the problem.”

Botched suicide attempts also go underreported: many people who try to kill themselves either don’t seek medical help or lie about why they are seeking it. I’ve done both those things. I’d do them again. As I’ve said, telling anyone you’ve tried to kill yourself, let alone someone you don’t know, let alone someone who could suspend your right to freedom of movement, gives one enormous pause. (Not that telling someone you love is any easier.)

by Anna Mehler Paperny, The Walrus |  Read more:
Image: Paul Kim

Wednesday, August 14, 2019

The Mysteries of Menopause


The Mysteries of Menopause (The Stranger)
Image: Lisa Tegtmeier

What the Seas Will Swallow


What the Seas Will Swallow (Hakai Magazine)
Images: Alex MacLean

Tourism Is Eating the World

In 1953, mountaineers Tenzing Norgay and Edmund Hillary made the first confirmed summiting of Mount Everest, the world’s highest peak. Recently, Everest has grown so popular that photos are surfacing showing huge lines of climbers waiting to surmount that same peak. On rarefied ground where once only Norgay and Hillary trod, now climbers are dying because of overcrowding.

A less dramatic version of this scene is being played out around the world — for both good and ill. The number of international tourist arrivals has been increasing more or less exponentially since the mid-20th century, and totaled about 1.4 billion in 2018. Europe has seen the biggest share, but the Asia-Pacific region is growing fast.

This growth has been driven by a confluence of factors. Most obviously, disposable incomes have grown around the world, with China’s gains being especially impressive in recent years. People are living longer and having fewer children, giving them the time and freedom to travel more. Areas that were once off-limits are now accessible, as the world has generally become more peaceful and open since the end of the Cold War.

Technology has also played a key role. Air travel is cheap and ubiquitous. Tickets, hotels, tours and local transportation can now be booked online. The internet has also given the masses information about the world's tourist destinations, from Japanese hot springs to California wine country to Iceland's glaciers. Recently, Google Maps has made it much easier to find one’s way around a strange country, translation apps have made foreign-language communication less daunting, Uber offers easy transportation in many international cities and Airbnb has expanded the range of available accommodations.

Tourism is big business for the countries that manage to attract hordes of visitors. Direct receipts from tourism totaled $1.6 trillion in 2017, or 2% of the entire world economy.

The World Travel and Tourism Council estimates that the amount of economic activity attributable to the sector is much larger, reaching $8.8 trillion in 2018, and supporting as much as 10% of all jobs on the planet.

But tourism has a downside as well. As the Everest example shows, travel to the most popular destinations is subject to what economists call congestion externalities — when you go to a famous place, your presence makes the experience just a little less convenient and comfortable for everyone else. Multiply that effect by the millions, and the world’s tourists are crowding each other out of a good time. I felt this myself when I recently went to Golden Gai, a bar district that used to be one of Tokyo’s hidden gems, and found that it was packed with Western and Chinese tourists.

For cities, the experience can be even more harrowing. Even as tourist dollars flow into the coffers of local businesses, mobs of travelers strain infrastructure that was never built to handle so many human bodies. If a city tries to accommodate the inflow by building large amounts of new infrastructure, those streets and trains will sit empty during the off season, or if the city loses its tourist appeal. Travelers can be accommodated with Airbnb, but this can push up rents for locals. Logistically, it’s simply inefficient for every location in the world to always be prepared to house, feed and transport many more people than actually live there.

Unfortunately, there will come a point where over-tourism makes travel both logistically inconvenient and much less enjoyable for everyone. The problem can be ameliorated by spreading tourists around to less crowded destinations, as Japan is trying to do. Some destinations, like Amsterdam, are cutting back on advertising and self-promotion. But eventually there will be no choice but to start charging tourists a fee.

A few places are already trying this. Venice will soon start charging people to come to the city for day trips. New Zealand has introduced a tourist tax. Various other European countries and cities have implemented or plan to implement taxes on hotels and other overnight accommodation. This is a simple application of congestion pricing, the textbook economics solution to the problem of overcrowding.
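[ed. A toy version of that textbook logic, with invented numbers. Each visitor ignores the small cost their presence imposes on everyone else, so too many show up; a fee equal to the marginal external cost at the optimum shrinks the crowd to the welfare-maximizing size.]

```python
# Hypothetical curves: willingness to pay B(n) = 100 - 0.01n for the n-th
# visitor; congestion cost borne by each visitor c(n) = 0.005n.

def equilibrium_visitors(fee: float) -> float:
    # Visitors keep arriving while B(n) >= c(n) + fee.
    # Solve 100 - 0.01n = 0.005n + fee  =>  n = (100 - fee) / 0.015
    return (100.0 - fee) / 0.015

free_entry = equilibrium_visitors(fee=0.0)  # ~6,667 visitors
# Welfare W(n) = (100n - 0.005n^2) - 0.005n^2 peaks at n* = 5,000, where the
# marginal external cost, n * c'(n) = 5000 * 0.005 = 25, sets the optimal fee.
priced = equilibrium_visitors(fee=25.0)     # exactly 5,000 visitors
print(f"free entry: {free_entry:.0f}, with congestion fee: {priced:.0f}")
```

[ed. With these assumed curves, unpriced entry draws about a third more visitors than the efficient number; the fee, like Venice's day-trip charge, makes each visitor face the cost they impose on the rest.]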

by Noah Smith, Bloomberg |  Read more:
Image: Miguel Medina/AFP/Getty Images

Tuesday, August 13, 2019

Perversity! Futility! Jeopardy!

Suppose you were a person who didn't want to think too hard; and suppose you were a centrist pundit for one of the nation’s leading newspapers. But I repeat myself.

Your assignment at this point in the political cycle—still a year away from next summer’s nominating conventions—is to survey the field of candidates and find them all lacking. The reason? None of them are saying just what you think they should be saying. Your gut tells you that the ideal candidate is one who resembles that paragon of calm reason and moderation: the centrist pundit!

You will see, as early as it is, a distinct threat on the horizon. There are candidates competing to lead the Democratic Party who want to pull the party to the far left. You must sound the alarm. These candidates will commit to stances that will scare away moderate voters. As they pander to the activist wing of their party, they are hurting their chance to win the general election next November. Being a seasoned political observer, you are well aware that this often happens in Democratic primaries: candidates veer left to get the nomination, and then pivot to the center for the general. But you also know this: it’s not going to work. Too many responsible moderates, such as yourself, will worry the candidate is still harboring those leftist plans. You don’t want someone who is going to bankrupt the nation.

The column writes itself.

The Democratic debates in June (and the internecine debates among congressional Democrats) set off a chorus of such columns, all making the same point: we do not like what we’re hearing. On the New York Times op-ed pages alone there were: 1) a Bret Stephens complaint in which he channeled the impressions of “ordinary people” (i.e., white nativists) and concluded that the Dems were off to “a wretched start”; 2) a Maureen Dowd column in defense of House Speaker Nancy Pelosi, featuring former Chicago mayor Rahm Emanuel assailing the left flank of the Democratic Party with the challenge, “Do they want to beat Trump or do they want to clear the moderates and centrists out of the party?”; 3) Thomas Friedman confessing that he was “shocked” by some of the rhetoric he heard in the debates; and 4) David Brooks, under the title “Dems, Please Don’t Drive Me Away,” warning that “the party is moving toward all sorts of positions that drive away moderates and make it more likely the nominee will be unelectable.”

And now, just in time for the next round of Democratic debates, Dowd has returned with a second defense of Pelosi’s “pragmatism,” deriding progressives as “modern Puritans” and blaming Democrats for spending too much time “knifing one another.” Her stated motivation was anger that Left Twitter roasted her recent Washington soiree, which was attended by Pelosi. The Washington Post’s Dave Weigel summarized the column as “the Democrats will lose if the left keeps making fun of my parties,” but it was even worse than that—Dowd was insisting that because Nancy Knows Best, any “puritan” push for impeachment reflected not just the nastiness of the left but its stupidity.

You could gather examples of this kind of standard punditry from the archives, feed them into a computer, and produce this year’s batches through artificial intelligence. You’d have to plug in new names, but not new ideas.

In the 1980s, the candidate who sparked pundit-panic was Jesse Jackson. The Chicago-based civil rights leader ran in the Democratic primary in 1984, and again in 1988. That first race was especially instructive. Imagine: an angry black man running for president, and in a year when the overarching mission for Democrats should have been to deny Ronald Reagan a second term. Obviously, Jackson could not win against Reagan, so what was the point?

From the op-ed page of the New York Times, William Safire surveyed the field in June of 1983 and saw Jackson “marching out with the blacks,” as well as other emerging threats to the Democratic Party establishment’s preferred candidate, former vice-president Walter Mondale. Illinois congressman John Anderson, who had run as an independent in 1980, was considering another run (he decided against it). California senator Alan Cranston was campaigning for a freeze on nuclear weapons production. Cranston was winning the “greens,” according to Safire, “who make nominatable whomever they rally behind and make unelectable whomever they help nominate.” Safire saw a parallel in the landslide 1983 reelection of Margaret Thatcher in Britain: even if Reagan didn’t run for reelection in 1984, he surmised, “any Republican candidate would win, as Mrs. Thatcher did, on the dangerous kookiness of a far-left government.”

As it turned out, of course, Mondale won the nomination and ran as a conventional middle-of-the-road Democrat. He went on to win thirteen electoral votes (Minnesota and the District of Columbia) to Reagan’s 525. Good times. (...)

The foundational fallacy of most mainstream campaign punditry is that presidential elections are decided on some kind of left-right binary axis. It happens to be the belief of most of the operatives and funders of the Democratic Party establishment, as well. Their first principle is: if voters see policy proposals that seem to be coming from the American left, they will choose a conservative like Ronald Reagan or George W. Bush as the safer alternative. It’s remarkable that so many veteran political columnists and political “pros” seem to think it works this way. But that they cling to this article of faith in the third year of the Trump presidency is not just curious—it’s perverse.

Even if it was mostly due to a series of freakish accidents that Trump found a narrow path to the presidency, does anyone believe his policy proposals ensured his success? That somehow American voters considered the details of the immigration issue, for example, and decided “yes, let’s build a wall and require Mexico to pay for it.” Or that perhaps he was correct in saying that Obamacare should be dismantled for some unspecified Republican approach to health care?

You could list any number of factors that are more decisive in a presidential election than what’s in a candidate’s policy papers. It’s unfortunate, but one of the most determinative factors is how well a candidate performs in front of large audiences, especially on television—that is, does the candidate have what are essentially acting skills: looking good, speaking with confident facial expressions, attracting viewers instead of turning them off? (In television infotainment and newscasts, there are attempts to measure this appeal by “Q Scores.”)

In a more general way, being able to move voters emotionally obviously has more relevance than where a candidate comes down on any particular policy proposal. Take the matter of what kind of health care system is best for the United States in the coming decade. Our centrist pundits are gnashing their teeth because several Democrats are willing to discuss the idea of universal health care. Any Medicare for All proposal is going to be too scary once voters realize it means “getting rid of private health insurance.” Supposedly the Republicans will have a field day by rallying people to the cause of corporate insurance. If you get a Democratic candidate who believes that and tries to deflect the charge with a detailed explanation of how the system would gradually evolve—that if you like your employer-sponsored health insurance you can keep it, etc. etc.—you are on the losing side. But if you tap into what many people feel—that is, that big insurance companies are not your friend, and that the business model of most private insurance is to wriggle out of paying for health care and to saddle you with as much cost as possible . . . why not rescue people from the clutches of profit-driven insurance companies? These are businesses that, until it was disallowed, insisted they would not cover people with “pre-existing conditions.”

It would be pretty to think that America’s course is decided by rational voters who closely examine the policy choices in front of them. Who can believe that in the age of Trump? Here’s a counter-theory then: in most presidential elections, the vast majority of voters will choose the Democratic or the Republican candidate based on ideology or partisan loyalty. The remaining small sliver of “persuadable voters” are responding to something that is not necessarily a preference or rejection of conservative or liberal policies. Often it is just a vague sense of which candidate seems more plausible in offering hope for a better politics, or a better economy, or a better country, or a better deal for people like them.

by Dave Denison, The Baffler | Read more:
Image: PBS NewsHour/The Baffler
[ed. I just want a candidate who's sincere and authentic (vs. poll or media or advisor driven), who can articulate a path forward that feels like progress (instead of pandering to everyone).]

Nutrition Science Is Broken

It's been a tortuous path for the humble egg. For much of our history, it was a staple of the American breakfast — as in, bacon and eggs. Then, starting in the late 1970s and early 1980s, it began to be disparaged as a dangerous source of artery-clogging cholesterol, a probable culprit behind Americans’ exceptionally high rates of heart attack and stroke. Then, in the past few years, the chicken egg was redeemed and once again touted as an excellent source of protein, unique antioxidants like lutein and zeaxanthin, and many vitamins and minerals, including riboflavin and selenium, all in a fairly low-calorie package.

This March, a study published in JAMA put the egg back on the hot seat. It found that the amount of cholesterol in a bit less than two large eggs a day was associated with an increase in a person’s risk of cardiovascular disease and death by 17 percent and 18 percent, respectively. The risks grow with every additional half egg. It was a really large study, too — with nearly 30,000 participants — which suggests it should be fairly reliable.

So which is it? Is the egg good or bad? And, while we are on the subject, when so much of what we are told about diet, health, and weight loss is inconsistent and contradictory, can we believe any of it?

Quite frankly, probably not. Nutrition research tends to be unreliable because nearly all of it is based on observational studies, which are imprecise, have no controls, and don’t follow an experimental method. As nutrition-research critics Edward Archer and Carl Lavie have put it, “’Nutrition’ is now a degenerating research paradigm in which scientifically illiterate methods, meaningless data, and consensus-driven censorship dominate the empirical landscape.”

Other nutrition research critics, such as John Ioannidis of Stanford University, have been similarly scathing in their commentary. They point out that observational nutrition studies are essentially just surveys: Researchers ask a group of study participants — a cohort — what they eat and how often, then they track the cohort over time to see what, if any, health conditions the study participants develop.

The trouble with the approach is that no one really remembers what they ate. You might remember today’s breakfast in some detail. But, breakfast three days ago, in precise amounts? Even the unadventurous creature of habit would probably get it wrong. That tends to make these surveys inaccurate, especially when researchers try to drill down to specific foods.

Then, that initial inaccuracy is compounded when scientists use those guesses about eating habits to calculate the precise amounts of specific proteins and nutrients that a person consumed. The errors add up, and they can lead to seriously dubious conclusions.

A good example is the 2005 study that suggested that eating a cup of endive once a week might cut a woman’s risk of ovarian cancer by 76 percent. There was even a possible mechanism to explain the effect: Endive is high in kaempferol, a flavonoid that has shown anticarcinogenic properties in laboratory experiments. It was a big study, based on a cohort of more than 62,000 women. This study was published in the prestigious journal Cancer, and many in the media were convinced. Dr. Mehmet Oz even touted it on his television show.

But, as Maki Inoue-Choi, of the University of Minnesota, and her colleagues pointed out, the survey had asked about many other kaempferol-rich foods — including some that had higher levels of kaempferol than endive does — and not one of those other foods had the same apparent effect on ovarian cancer.
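[ed. A minimal simulation of that multiple-comparisons trap, with invented numbers: give tens of thousands of subjects noisy "recalled" intakes of dozens of foods, none of which truly affects the outcome, and a handful of foods will still clear p < 0.05.]

```python
# No food below has any real effect on disease, yet testing 60 of them
# at the 0.05 level typically yields about 3 spurious "links".
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_foods = 62_000, 60  # cohort size echoing the endive study

true_intake = rng.gamma(2.0, 1.0, size=(n_subjects, n_foods))
recalled = true_intake + rng.normal(0.0, 1.0, size=true_intake.shape)  # recall noise
disease = rng.binomial(1, 0.01, size=n_subjects)  # independent of diet by design

false_hits = sum(
    stats.ttest_ind(recalled[disease == 1, f], recalled[disease == 0, f],
                    equal_var=False).pvalue < 0.05
    for f in range(n_foods)
)
print(f"{false_hits} of {n_foods} foods 'linked' to disease by chance alone")
```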

The new study linking eggs and cardiovascular disease deserves similar scrutiny. Statistically speaking, 30,000 participants makes for a very powerful study. And in fairness, the study’s defenders say that it did a good job accounting for factors that might have influenced the findings, such as overall fat consumption, smoking, and lifestyle.

But on the other hand, the study tracked participants’ health outcomes over periods ranging from 13 to more than 30 years, and participants were queried about their diet only once, at the beginning of the study. Can we assume that the participants gave a reliable depiction of their diet at the outset, and then that they maintained that same diet for the years — in many cases, decades — that followed? Probably not. Who eats the same way for 10 years?

In light of these flaws, Dr. Anthony Pearson, a cardiologist at St. Luke’s Hospital in suburban St. Louis, had this advice: “Rather than drastically cutting egg consumption,” he wrote in a blog for MedPage Today, “I propose that there be a drastic cut in the production of weak observational nutrition studies and a moratorium on inflammatory media coverage of meaningless nutritional studies.” (...)

Unfortunately, it is impractical — and probably impossible — for most researchers to carry out those types of studies on a large scale. Crunching the data from a big observational study is a much easier way to get a publication and some media attention. So we get what we get.

In the meantime, what do the rest of us do with our diets?

Most experts recommend avoiding processed foods as much as possible and sticking with a Mediterranean-like diet because it makes intuitive sense. It is not too restrictive. It is heavy in fruits and vegetables. It has the right kinds of fats and some grains. It includes fish and generally lean proteins.

These experts contend that you should also be wary about foods that are said to have newly revealed healthy, or unhealthy, properties. In other words, don’t buy the notion of superfoods. The evidence is just not there.

by Timothy F. Kirn, Undark | Read more:
Image: kajakiki via Getty Images

Gilda Radner


"I base most of my fashion taste on what doesn't itch."
- Gilda Radner
via:

Rolling Coal

Freedom as sociopathy.

A pickup truck "rolling coal" cruised by a downtown sidewalk crowded with tourists Saturday night here in the Cesspool of Sin. Rolling coal, says Wikipedia, "is the practice of modifying a diesel engine to increase the amount of fuel entering the engine in order to emit large amounts of black or grey sooty exhaust fumes into the air." Vice described it in 2014 as a way to "piss off cops, Prius drivers, and anyone else who happens to get in the way of their big-ass trucks."

It's an in-your-face weapon in the culture wars. Like flying big Confederate and Trump flags from your truck, only no one sees it coming until you flip the switch. (...)

Why bring it up in 2019? Because that grinning, "fuck your feelings" sociopathy behind rolling coal and Trump rally tee shirts has morphed into threatening passersby with a hail of bullets. A flip of the safety switch to "Fire" and watch people scatter. If they are not scattering already.

20-year-old Dmitriy Andreychenko filmed himself strolling into a Springfield, Mo., Walmart on Thursday carrying an AR-style rifle and a handgun, and wearing a tactical vest with over 100 rounds of ammunition. It was just days after mass shootings in El Paso, Texas, and Dayton, Ohio, left 31 dead. His wife and his sister warned him it was a bad idea.

The manager had an employee pull the fire alarm to clear the store. A former member of the military held Andreychenko at gunpoint until police arrived.

But Missouri is an open-carry state, he told police. It was just a “social experiment” to see if his Second Amendment rights were still intact. He didn't see a reason why "people would freak out." Police charged Andreychenko with making a terrorist threat. A Battlefield City officer and another driver went to the emergency room with "severe injuries" after a collision as police rushed to the scene.

Dahlia Lithwick ponders the mentality it takes to practice belligerence as a form of free expression:
Andreychenko didn’t die last week. Instead, officers took the man into custody “without incident.” That’s a tremendous surfeit of good fortune for a man who was apprehended both by an armed bystander and the police. By its very definition, white privilege is the ability to film yourself conducting a “social experiment” with military-grade weapons at the same chain where a mass shooting just happened, without being shot dead in your tracks. Trayvon Martin wasn’t even granted the luxury of being allowed to conduct a “social experiment” with a bag of Skittles.
I am mindful of privilege today more than most days because it is the second anniversary of the Charlottesville Unite the Right rally, and we all know how that ended. I am mindful of what privilege buys you in America: the right to not get shot when you’re armed to the teeth, and the right to not have to explain beyond the fact that you were just “experimenting” with constitutional freedoms. The privilege of violent white men is the privilege of an almost-perfect failure of empathy, imagination, or regard. It buys you the right to ignore your wife and sister, to ignore current events and history and murder statistics, to ignore the fact that reasonable people should reasonably fear being shot in a bloody massacre. It allows you to stagger blindly through the world and not get killed, while you practice the fine art of looking like you can and will shoot hundreds of others, without even wondering why people are fleeing the building with their children clutched tight.
White men with guns, quickly becoming the most lethal cohort of Americans, don’t just benefit, every day, from the presumption of innocence, and eternal boyhood. They benefit twice over—first from that, and then from the presumption that their perfect self-absorption and solipsism are themselves enduringly worthy of constitutional protection.
That is Lithwick's polite way of saying some Americans' idea of freedom is the right to behave like an asshole. Now they have elected one to the White House who tells them every day to go to town. "Perceived grievance, either political or personal" motivates these shooters. Trump, who has built his life around grievance and revenge, validates theirs.

by Tom Sullivan, Hullabaloo |  Read more:
Image: uncredited