Thursday, December 22, 2016

Why Time Management is Ruining Our Lives

Given that the average lifespan consists of only about 4,000 weeks, a certain amount of anxiety about using them well is presumably inevitable: we’ve been granted the mental capacities to make infinitely ambitious plans, yet almost no time at all to put them into practice. The problem of how to manage time, accordingly, goes back at least to the first century AD, when the Roman philosopher Seneca wrote On The Shortness of Life. “This space that has been granted to us rushes by so speedily, and so swiftly that all save a very few find life at an end just when they are getting ready to live,” he said, chiding his fellow citizens for wasting their days on pointless busyness, and “baking their bodies in the sun”.

Clearly, then, the challenge of how to live our lives well is not a new one. Still, it is safe to say that the citizens of first-century Rome didn’t experience the equivalent of today’s productivity panic. (Seneca’s answer to the question of how to live had nothing to do with becoming more productive: it was to give up the pursuit of wealth or high office, and spend your days philosophising instead.) What is uniquely modern about our fate is that we feel obliged to respond to the pressure of time by making ourselves as efficient as possible – even when doing so fails to bring the promised relief from stress.

The time-pressure problem was always supposed to get better as society advanced, not worse. In 1930, John Maynard Keynes famously predicted that within a century, economic growth would mean that we would be working no more than 15 hours per week – whereupon humanity would face its greatest challenge: that of figuring out how to use all those empty hours. Economists still argue about exactly why things turned out so differently, but the simplest answer is “capitalism”. Keynes seems to have assumed that we would naturally throttle down on work once our essential needs, plus a few extra desires, were satisfied. Instead, we just keep finding new things to need. Depending on your rung of the economic ladder, it’s either impossible, or at least usually feels impossible, to cut down on work in exchange for more time.

Arguably the first time management guru – the progenitor of the notion that personal productivity might be the answer to the problem of time pressure – was Frederick Winslow Taylor, an engineer hired in 1898 by the Bethlehem Steel Works, in Pennsylvania, with a mandate to improve the firm’s efficiency. “Staring out over an industrial yard that covered several square miles of the Pennsylvania landscape, he watched as labourers loaded 92lb [iron bars] on to rail cars,” writes Matthew Stewart, in his book The Management Myth. “There were 80,000 tons’ worth of iron bars, which were to be carted off as fast as possible to meet new demand sparked by the Spanish-American war. Taylor narrowed his eyes: there was waste here, he was certain.”

The Bethlehem workers, Taylor calculated, were shifting about 12.5 tons of iron per man per day – but predictably, when he offered a group of “large, powerful Hungarians” some extra cash to work as fast as they could for an hour, he found that they performed much better. Extrapolating to a full work day, and guesstimating time for breaks, Taylor concluded, with his trademark blend of self-confidence and woolly maths, that every man ought to be shifting 50 tons per day – four times their usual amount.

Workers were naturally unhappy at this transparent attempt to pay them the same money for more work, but Taylor was not especially concerned with their happiness; their job was to implement, not understand, his new philosophy of “scientific management”. “One of the very first requirements for a man who is fit to handle pig iron,” wrote Taylor, is “that he shall be so stupid and phlegmatic that he more nearly resembles in his mental makeup the ox than any other type … he is so stupid that the word ‘percentage’ has no meaning for him.”

The idea of efficiency that Taylor sought to impose on Bethlehem Steel was borrowed from the mechanical engineers of the industrial revolution. It was a way of thinking about improving the functioning of machines, now transferred to humans. And it caught on: Taylor enjoyed a high-profile career as a lecturer on the topic, and by 1915, according to the historian Jennifer Alexander, “the word ‘efficiency’ was plastered everywhere – in headlines, advertisements, editorials, business manuals, and church bulletins.” In the first decades of the 20th century, in a Britain panicked by the rise of German power, the National Efficiency movement united politicians on left and right. (“At the present time,” the Spectator noted in 1902, “there is a universal outcry for efficiency in all the departments of society, in all aspects of life.”)

It is not hard to grasp the appeal: efficiency was the promise of doing what you already did, only better, more cheaply, and in less time. What could be wrong with that? Unless you happened to be on the sharp end of attempts to treat humans like machines – like the workers of Bethlehem Steel – there wasn’t an obvious downside.

But as the century progressed, something important changed: we all became Frederick Winslow Taylors, presiding ruthlessly over our own lives. As the doctrine of efficiency grew entrenched – as the ethos of the market spread to more and more aspects of society, and life became more individualistic – we internalised it. In Taylor’s day, efficiency had been primarily a way to persuade (or bully) other people to do more work in the same amount of time; now it is a regimen that we impose on ourselves. (...)

Time management promised a sense of control in a world in which individuals – decreasingly supported by the social bonds of religion or community – seemed to lack it. In an era of insecure employment, we must constantly demonstrate our usefulness through frenetic doing, and time management can give you a valuable edge. Indeed, if you are among the growing ranks of the self-employed, as a freelancer or a worker in the so-called gig economy, increased personal efficiency may be essential to your survival. The only person who suffers financially if you indulge in “loafing” – a workplace vice that Taylor saw as theft – is you.

Above all, time management promises that a meaningful life might still be possible in this profit-driven environment, as Melissa Gregg explains in Counterproductive, a forthcoming history of the field. With the right techniques, the prophets of time management all implied, you could fashion a fulfilling life while simultaneously attending to the ever-increasing demands of your employer. This promise “comes back and back, in force, whenever there’s an economic downturn”, Gregg told me.

Especially at the higher-paid end of the employment spectrum, time management whispers of the possibility of something even more desirable: true peace of mind. “It is possible for a person to have an overwhelming number of things to do and still function productively with a clear head and a positive sense of relaxed control,” the contemporary king of the productivity gurus, David Allen, declared in his 2001 bestseller, Getting Things Done. “You can experience what the martial artists call a ‘mind like water’, and top athletes refer to as ‘the zone’.”

As Gregg points out, it is significant that “personal productivity” puts the burden of reconciling these demands squarely on our shoulders as individuals. Time management gurus rarely stop to ask whether the task of merely staying afloat in the modern economy – holding down a job, paying the mortgage, being a good-enough parent – really ought to require rendering ourselves inhumanly efficient in the first place.

Besides, on closer inspection, even the lesser promises of time management were not all they appeared to be. An awkward truth about Taylor’s celebrated efficiency drives is that they were not very successful: Bethlehem Steel fired him in 1901, having paid him vast sums without any clearly detectable impact on its own profits. (One persistent consequence of his schemes was that they seemed promising at first, but left workers too exhausted to function consistently over the long term.)

Likewise, it remains the frequent experience of those who try to follow the advice of personal productivity gurus – I’m speaking from years of experience here – that a “mind like water” is far from the guaranteed result. As with Inbox Zero, so with work in general: the more efficient you get at ploughing through your tasks, the faster new tasks seem to arrive. (“Work expands to fill the time available for its completion,” as the British historian C Northcote Parkinson realised way back in 1955, when he coined what would come to be known as Parkinson’s law.)

Then there’s the matter of self-consciousness: virtually every time management expert’s first piece of advice is to keep a detailed log of your time use, but doing so just heightens your awareness of the minutes ticking by, then lost for ever. As for focusing on your long-term goals: the more you do that, the more of your daily life you spend feeling vaguely despondent that you have not yet achieved them. Should you manage to achieve one, the satisfaction is strikingly brief – then it’s time to set a new long-term goal. The supposed cure just makes the problem worse.

There is a historical parallel for all this: it’s exactly what happened when the spread of “labour-saving” devices transformed the lives of housewives and domestic servants across Europe and north America from the end of the 19th century. Technology now meant that washing clothes no longer entailed a day bent over a mangle; a vacuum-cleaner could render a carpet spotless in minutes.

Yet as the historian Ruth Cowan demonstrates in her 1983 book More Work for Mother, the result, for much of the 20th century, was not an increase in leisure time among those charged with doing the housework. Instead, as the efficiency of housework increased, so did the standards of cleanliness and domestic order that society came to expect. Now that the living-room carpet could be kept perfectly clean, it had to be; now that clothes never needed to be grubby, grubbiness was all the more taboo. These days, you can answer work emails in bed at midnight. So should that message you got at 5.30pm really wait till morning for a reply? (...)

At the very bottom of our anxious urge to manage time better – the urge driving Frederick Winslow Taylor, Merlin Mann, me and perhaps you – it’s not hard to discern a familiar motive: the fear of death. As the philosopher Thomas Nagel has put it, on any meaningful timescale other than human life itself – that of the planet, say, or the cosmos – “we will all be dead any minute”. No wonder we are so drawn to the problem of how to make better use of our days: if we could solve it, we could avoid the feeling, in Seneca’s words, of finding life at an end just when we were getting ready to live. To die with the sense of nothing left undone: it’s nothing less than the promise of immortality by other means.

But the modern zeal for personal productivity, rooted in Taylor’s philosophy of efficiency, takes things several significant steps further. If only we could find the right techniques and apply enough self-discipline, it suggests, we could know that we were fitting everything important in, and could feel happy at last. It is up to us – indeed, it is our obligation – to maximise our productivity. This is a convenient ideology from the point of view of those who stand to profit from our working harder, and our increased capacity for consumer spending. But it also functions as a form of psychological avoidance. The more you can convince yourself that you need never make difficult choices – because there will be enough time for everything – the less you will feel obliged to ask yourself whether the life you are choosing is the right one.

Personal productivity presents itself as an antidote to busyness when it might better be understood as yet another form of busyness. And as such, it serves the same psychological role that busyness has always served: to keep us sufficiently distracted that we don’t have to ask ourselves potentially terrifying questions about how we are spending our days. “How we labour at our daily work more ardently and thoughtlessly than is necessary to sustain our life because it is even more necessary not to have leisure to stop and think,” wrote Friedrich Nietzsche, in what reads like a foreshadowing of our present circumstances. “Haste is universal because everyone is in flight from himself.”

by Oliver Burkeman, The Guardian |  Read more:
Image: Pete Gamlen

A Telephone Call

[ed. See also: Ladies in Waiting]

Please, God, let him telephone me now. Dear God, let him call me now. I won't ask anything else of You, truly I won't. It isn't very much to ask. It would be so little to You, God, such a little, little thing. Only let him telephone now. Please, God. Please, please, please.

If I didn't think about it, maybe the telephone might ring. Sometimes it does that. If I could think of something else. If I could think of something else. Maybe if I counted five hundred by fives, it might ring by that time. I'll count slowly. I won't cheat. And if it rings when I get to three hundred, I won't stop; I won't answer it until I get to five hundred. Five, ten, fifteen, twenty, twenty-five, thirty, thirty-five, forty, forty-five, fifty.... Oh, please ring. Please.

This is the last time I'll look at the clock. I will not look at it again. It's ten minutes past seven. He said he would telephone at five o'clock. "I'll call you at five, darling." I think that's where he said "darling." I'm almost sure he said it there. I know he called me "darling" twice, and the other time was when he said good-by. "Good-by, darling." He was busy, and he can't say much in the office, but he called me "darling" twice. He couldn't have minded my calling him up. I know you shouldn't keep telephoning them--I know they don't like that. When you do that they know you are thinking about them and wanting them, and that makes them hate you. But I hadn't talked to him in three days--not in three days. And all I did was ask him how he was; it was just the way anybody might have called him up. He couldn't have minded that. He couldn't have thought I was bothering him. "No, of course you're not," he said. And he said he'd telephone me. He didn't have to say that. I didn't ask him to, truly I didn't. I'm sure I didn't. I don't think he would say he'd telephone me, and then just never do it. Please don't let him do that, God. Please don't.

"I'll call you at five, darling." "Good-by, darling." He was busy, and he was in a hurry, and there were people around him, but he called me "darling" twice. That's mine, that's mine. I have that, even if I never see him again. Oh, but that's so little. That isn't enough. Nothing's enough, if I never see him again. Please let me see him again, God. Please, I want him so much. I want him so much. I'll be good, God. I will try to be better, I will, if You will let me see him again. If You will let him telephone me. Oh, let him telephone me now.

Ah, don't let my prayer seem too little to You, God. You sit up there, so white and old, with all the angels about You and the stars slipping by. And I come to You with a prayer about a telephone call. Ah, don't laugh, God. You see, You don't know how it feels. You're so safe, there on Your throne, with the blue swirling under You. Nothing can touch You; no one can twist Your heart in his hands. This is suffering, God, this is bad, bad suffering. Won't You help me? For Your Son's sake, help me. You said You would do whatever was asked of You in His name. Oh, God, in the name of Thine only beloved Son, Jesus Christ, our Lord, let him telephone me now.

I must stop this. I mustn't be this way. Look. Suppose a young man says he'll call a girl up, and then something happens, and he doesn't. That isn't so terrible, is it? Why, it's going on all over the world, right this minute. Oh, what do I care what's going on all over the world? Why can't that telephone ring? Why can't it, why can't it? Couldn't you ring? Ah, please, couldn't you? You damned, ugly, shiny thing. It would hurt you to ring, wouldn't it? Oh, that would hurt you. Damn you, I'll pull your filthy roots out of the wall, I'll smash your smug black face in little bits. Damn you to hell.

No, no, no. I must stop. I must think about something else. This is what I'll do. I'll put the clock in the other room. Then I can't look at it. If I do have to look at it, then I'll have to walk into the bedroom, and that will be something to do. Maybe, before I look at it again, he will call me. I'll be so sweet to him, if he calls me. If he says he can't see me tonight, I'll say, "Why, that's all right, dear. Why, of course it's all right." I'll be the way I was when I first met him. Then maybe he'll like me again. I was always sweet, at first. Oh, it's so easy to be sweet to people before you love them.

by Dorothy Parker, Classic Short Stories |  Read more:
Image: via:

Invasion of the Agency Snatchers

“At first glance, everything looked the same. It wasn’t. Something evil had taken possession of the town.”

Those lines are from the opening voice-over in a great midcentury American movie, “Invasion of the Body Snatchers.” Giant vegetable pods mysteriously arrive in a typical American town. Each takes over the identity of a local inhabitant, becoming an exact likeness except for the absence of emotion and of everything else that makes a person human. The town’s brave doctor tries to sound the alarm, but no one believes him, and it’s too late anyway. Trucks piled high with pods are rolling inexorably across the landscape.

The date was 1956. Many viewed the film as an allegory, although to what remains in dispute 60 years later. Some saw the soulless automatons that the pod people became as a reference to Communism. Others saw the target as McCarthyism. (The director, Don Siegel, denied any political message. “I think the world is populated by pods, and I wanted to show them,” he once explained.)

Personally, I see the Trump cabinet.

Stay with me and picture the first cabinet meeting. The white (almost all) men (almost all) sitting around the table will look like their predecessors, generations of them. But they won’t be the same as their predecessors, not at all. They will have been placed in their positions and handed the reins of power not to govern, but to destroy.

It’s not only Rick Perry, the former Texas governor whom President-elect Donald J. Trump has named to head the Department of Energy. Mr. Perry so disdained that department when he was running for the Republican presidential nomination in 2011 that he blanked on its name when listing the federal agencies he wanted to abolish. It’s also Scott Pruitt, the Oklahoma state attorney general, who has devoted his adult life to fighting environmental regulation in partnership with his financial backers in the oil industry, named to head the Environmental Protection Agency.

It’s Tom Price, the congressman-doctor from Georgia who doesn’t believe the federal government has an affirmative role to play in health care, named as secretary of health and human services. Or another congressman, Mick Mulvaney of South Carolina, a founder of the House Republicans’ Freedom Caucus who would rather shut down the government than pass a budget, named to be the White House budget director. It’s Wilbur Ross, named to head the Commerce Department after having made a fortune as an investor, buying and dismantling distressed industrial corporations. (Explain that to the voters who believed a Trump presidency would save their factory jobs.)

And it’s Ben Carson, at odds with the core mission of the Fair Housing Act insofar as he understands it, chosen as secretary of housing and urban development. And Betsy DeVos, named to head the Department of Education, for whom charter schools are the answer to the problems of public education.

Let’s not forget Senator Jeff Sessions, an Alabama good old boy, whose history of insensitive racial comments kept him from a Federal District Court seat in 1986, now picked to be attorney general. He’s Trent Lott without the charm. (You remember Trent Lott, the Mississippi senator who in 2002 lost his position as Senate majority leader for observing at the 100th birthday party for Strom Thurmond, the longtime Republican senator from South Carolina, that the country could have avoided “all these problems” if only Thurmond’s 1948 presidential bid for a segregationist third party had succeeded. In today’s America, that rhetorical gaffe might have propelled Senator Lott to the White House instead of out the door of the leadership suite.)

Then there is Andrew F. Puzder, the fast-food executive and opponent of raising the minimum wage, chosen as secretary of labor. He is a longtime anti-abortion activist who, as a lawyer defending people charged with blocking access to abortion clinics, has offered a “defense of necessity,” namely that abortion itself is a greater offense than a clinic blockage. I shouldn’t omit Rex Tillerson, chosen as secretary of state after a career spent at Exxon Mobil supporting fossil fuel and cultivating connections with Russia. (Am I the only one to notice that Mr. Tillerson and Darren Woods, named by Exxon Mobil to succeed him as its president, appear in their corporate headshots as eerily exact likenesses?) President-elect Trump’s selection of his bankruptcy lawyer, David M. Friedman, a shill for the Israeli right wing, to be ambassador to Israel is eyebrow raising, to say the least, in that Mr. Friedman’s outspoken support of West Bank settlements and opposition to a two-state solution is at odds with longstanding United States policy.

Maybe there really are giant pods waiting for the moment when simulacra of actual cabinet officers slip into the seats behind those big desks. A smart piece by Michael D. Shear in The Times earlier this week referred to most of the Trump nominees as “disrupters” who “aim to unnerve Washington.” Disrupters, destroyers — the scale of the degradation that will occur is so astonishing that no one word is adequate to encompass it. (Mr. Perry, the has-been politician, as secretary of energy may represent the most head-snapping degradation of all, given that it was not so long ago that President Obama placed a Nobel laureate physicist, Steven Chu, in that highly sensitive position.) A great phrase from Janet Malcolm’s “The Journalist and the Murderer” comes to mind: “the surrealism that is at the heart of journalism.” At such a time as this, words fail and only images remain. That’s why I can’t get the giant pods out of my mind.

by Linda Greenhouse, NY Times |  Read more:
Image: Invasion of the Body Snatchers, Allied Artists/Getty Images

One Problem for Democratic Leaders Is Democratic Voters

[ed. There's a lot to be learned here, including new attitudes toward Vladimir Putin and Wikileaks (and how quickly public opinion changes). More importantly, has anyone ever heard of the Industry Trade Advisory Committees and their roles in negotiating trade agreements? Not me. Editorial emphasis below:]

Leaders on the Democratic left who want to represent the have-nots face an obstacle: their own voters.

Keith Ellison, a congressman from Minnesota and a candidate for the chairmanship of the Democratic National Committee, argues that Democrats “have to stand for a strong, populist economic message.” He warns that “the way the working class is always controlled is that it’s divided.” (...)

Mark Muro, the director of the Metropolitan Policy Program at Brookings, analyzed the differences between those communities that supported Hillary Clinton and those that backed Donald Trump. The findings of Muro and Sifan Liu, a Brookings research assistant, suggest that Democrats who are calling for a return to progressive populism will encounter more hurdles than they expect.

In their Nov. 29 essay, “Another Clinton-Trump divide: High-output America vs low-output America,” Muro and Liu determined that “the less-than-500 counties that Hillary Clinton carried nationwide encompassed a massive 64 percent of America’s economic activity as measured by total output in 2015.”

In other words, the Clinton counties are the ones in which the economy is booming; they are hardly fertile territory for a worker insurrection.

Muro enlarged on his findings in an email: “America’s most important, competitive, and often export-intensive industries — what we call its ‘advanced’ industries — cluster tightly in such metro counties. Some 70 percent of these crown-jewel industries are concentrated in the 100-largest metros — the core of what Hillary won.”

In a separate February 2015 study, “America’s Advanced Industries,” Muro and four colleagues report that the 50 industries in this heavily high-tech sector are crucial to America’s future growth: “These industries encompass the nation’s ‘tech’ sector at its broadest and most consequential. Their dynamism is going to be a central component of any future revitalized U.S. economy. As such, these industries encompass the country’s best shot at supporting innovative, inclusive, and sustainable growth.”

The importance of these industries does not stop there: “At the same time, the sector employs 80 percent of the nation’s engineers; performs 90 percent of private-sector R & D; generates approximately 85 percent of all U.S. patents; and accounts for 60 percent of U.S. exports. Advanced industries also support unusually extensive supply chains and other forms of ancillary economic activity. On a per worker basis, advanced industries purchase $236,000 in goods and services from other businesses annually, compared with $67,000 in purchasing by other industries.” (...)
Democrats addressing trade and globalization concerns face not only a base sharply split over these issues, but also growing difficulties in the party’s traditional responses to employment dislocation. Both job training and education have become increasingly ineffective.

An August 2016 study by Robert G. Valletta, an economist at the Federal Reserve in San Francisco, “Recent Flattening in the Higher Education Wage Premium,” shows that since 2010 the steadily rising economic gains from completing college and, even more so, from a graduate degree, have leveled off.

The swelling number of workers with postsecondary education combined with the worldwide economic slowdown have resulted in a process economists call “skill downsizing.” Those with graduate degrees are forced to take jobs that a college graduate could do, college graduates are forced to take jobs that someone with less education could do, and so on down the line, leaving fewer and fewer good jobs for the newly trained or retrained.

What would a progressive approach to globalization look like? A call for a radical reform of the trade negotiation process to curb the leverage of corporate and special interests is one Democratic alternative.

This leverage has been institutionalized through the creation of Industry Trade Advisory Committees that grant special access to trade negotiations to corporations ranging from pharmaceuticals to aerospace, energy to investment banks, steel to textiles.

Two critics of current trade policy, Jared Bernstein, a former economics adviser to Vice President Biden, and Lori Wallach, the director of Public Citizen’s Global Trade Watch, wrote a September 2016 essay for the American Prospect, “The New Rules of the Road: A Progressive Approach to Globalization,” in which they acknowledge some basic facts: “Despite Trump’s nostalgia for a bygone era when the United States was insulated from global trade, stopping or slowing trade is not at issue. Global trade volumes — imports plus exports — have grown from 25 percent of global GDP in the mid-1960s to 60 percent today. In the United States, that same metric has grown from 10 percent to 30 percent.”

Bernstein and Wallach go on to point out that trade agreements “are not mainly about cutting tariffs to expand trade nor are they about jobs, growth, and incomes here in the United States. Rather, they’re about setting expansive ‘rules of the road’ that determine who wins and who loses.”

The problem is not with trade itself, which the authors recognize is both desirable and inevitable, but lies instead in the design of the negotiation process: “With 500 official U.S. trade advisers representing corporate interests having been given special access to the policy process while the public, press, and largely Congress have been shut out, it is not surprising that corporate interests have thoroughly captured the negotiating process and ensured they are the ‘winners’ under these rules.”
Bernstein and Wallach make a potentially constructive attempt to deal with one aspect of the Democratic Party’s key dilemma: the struggle to prevail in national elections while accommodating the conflicting interests of diverse constituencies — including the conflict between the Sanders-Warren-Ellison wing and the free-trade wing.

Conciliation along these lines has become more difficult as international competition crosses national boundaries, indifferent to domestic regulation and legislation. The 2016 election demonstrates beyond a shadow of a doubt that ducking and weaving around the anguish of displaced workers guarantees sustained minority status.

The nation’s displaced work force includes not only the white working class but millions of Hispanics and African-Americans who are loyal to the Democratic Party. Effective and muscular policies focused on reversing the devastation that globalized trade, automation and competition with foreign workers have inflicted on middle and lower income Americans are essential to encourage defecting whites to return to Democratic ranks — and they are also crucial for reviving Election Day enthusiasm among the nation’s growing population of minority voters. In this regard, the political desires of the two groups are not irreconcilable.

by Thomas B. Edsall, NY Times |  Read more:
Image: Pew Research Center/NY Times

Wednesday, December 21, 2016


Matsuura Shiori 松浦シオリ, Gozen niji 午前二時 (2 AM) - Japan - November 2016
via:

Red Flags Waving

“Happiness isn’t good enough for me! I demand euphoria!”
- Bill Watterson, Calvin & Hobbes

There are several instances across history when valuations have broken well-beyond their historical norms, as the speculative “animal spirits” of investors have scrambled off like greased pigs at a rodeo. Those speculative episodes were typically concluded by one of two events: 1) a combination of overvalued, overbought, overbullish conditions appearing as a joint syndrome, or 2) deteriorating uniformity and widening dispersion of market internals across a broad range of individual securities, industries, sectors, and security-types, indicating a subtle shift among investors toward risk-aversion (when investors are risk-seeking, they tend to be indiscriminate about it).

Indeed, in market cycles across history, those two events were regularly “stuck together,” in the sense that overvalued, overbought, overbullish extremes were typically either accompanied or closely followed by deterioration in market internals. That regularity turned out to be our Achilles Heel in the half-cycle since 2009. After admirably navigating previous complete market cycles, I insisted on stress-testing our methods against Depression-era data in 2009. The resulting methods picked up the fact that overvalued, overbought, overbullish extremes were consistently associated with market losses across history, and we responded by taking a hard-negative market outlook when they appeared. The problem in the half-cycle since 2009 was that zero interest rates - and specifically short-term interest rates below about 10 basis points - acted as a kind of “solvent” that separated the two events, and encouraged yield-seeking speculation by investors long after extreme overvalued, overbought, overbullish conditions had emerged. In the presence of zero-interest rate policy, one had to wait for market internals to deteriorate explicitly before adopting a hard-negative market outlook.

We presently observe the third most overvalued extreme in history based on the most reliable valuation measures we identify, in the presence of 1) the most extreme “overvalued, overbought, overbullish” syndrome we identify, and 2) explicitly deteriorating market internals. Based on a composite of measures best correlated with actual subsequent market returns across history, the other two competing extremes were 1929 and 2000.

After more than three decades as a professional investor, it’s become clear that when investors are euphoric, they are incapable of recognizing euphoria itself. Presently, we hear inexplicable assertions that somehow euphoria hasn’t taken hold. Yet in addition to the third greatest valuation extreme in history for the market, the single greatest valuation extreme for the median stock, and expectations for economic growth that are inconsistent with basic arithmetic, both the 4-week average of advisory bullishness and the bull-bear spread are higher today than at either the 2000 or 2007 market peaks. In the recent half-cycle, extreme bullish sentiment and deteriorating market internals also preceded the near-20% decline in 2011, yet extreme bullish sentiment was also uneventful on a few occasions when interest rates were in the single digits and market internals were intact. That distinction is critical. The zero-rate “solvent” that allowed overvalued, overbought, overbullish extremes to detach from deteriorating market internals and downside risk is now gone, and investors should understand that subtlety.

As a side note, among popular alternatives, Investors Intelligence publishes one of the better surveys of bullish/bearish sentiment, while the AAII survey is far noisier. For our part, we focus on a slightly different balance, between trend-sensitive and value-conscious investor groups. As I detailed in Lessons From the Iron Law of Equilibrium:

“When prices are unusually elevated relative to the norm, it’s almost always because trend-followers (and other price-insensitive buyers) are ‘all in.’ Those positions are - and in fact have to be - offset by equal and opposite underweights by value-conscious investors. A sudden increase in the desired holdings of trend-sensitive traders has to be satisfied by inducing a price increase large enough to give value-conscious investors an incentive to sell. Conversely, a sudden decrease in the desired holdings of trend-sensitive traders has to be satisfied by inducing a price decline large enough to give value-conscious investors an incentive to buy. Any tendency of investors to buy on greed and sell on fear obviously amplifies this process.

“From this perspective, (and one can show this in simulation), what we’re really interested in is not the balance between bulls and bears per se, but the balance of sentiment between trend-sensitive and value-conscious investors. Market tops emerge when trend-followers are beating their chests while value-conscious investors are nursing bruises from their shorts. Market bottoms are formed when trend-followers wouldn’t even touch the market, and value-conscious investors are bleeding from all of the falling knives they’ve accumulated.”

Valuation update


Over a century ago, Charles Dow wrote, “To know values is to comprehend the meaning of movements in the market.” To offer a long-term and full-cycle perspective of current market conditions, I published a chart last week of the ratio of nonfinancial market capitalization to corporate gross value added, including estimated foreign revenues (what I’ve called MarketCap/GVA), and a second chart relating that measure to the actual 12-year S&P 500 total returns that have followed. From present valuation extremes, we expect 12-year S&P 500 total returns averaging just 0.8% annually, with a likely interim market collapse over the completion of this cycle on the order of 50-60%. Valuations are poor tools to gauge near-term market outcomes, but they are both invaluable and brutally honest about potential consequences over the complete market cycle. They also offer a consistent framework to understand market fluctuations. Recall, for example, my April 2007 estimate of a 40% loss to fair value, and then, following that 40% loss, my late-October 2008 comment observing that stocks had become undervalued. Over the complete market cycle, valuation is quite a strong suit for us.
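[ed. The arithmetic behind long-term estimates like this is straightforward: assume valuations mean-revert toward their historical norm over some horizon while fundamentals grow and dividends accrue. A minimal sketch in Python, where the growth rate, dividend yield, and the roughly 2.3x-of-norm valuation multiple are illustrative assumptions, not Hussman's exact published inputs:]

```python
def estimated_annual_return(current_to_norm, years=12, growth=0.06, dividend_yield=0.02):
    """Sketch of a valuation-based long-term return estimate.

    current_to_norm: current valuation as a multiple of its historical norm.
    Assumes valuations revert to the norm over `years` while fundamentals
    grow at `growth`; dividends add roughly `dividend_yield` per year.
    """
    # Annualized drag (or boost) from valuations reverting to the norm
    reversion = (1.0 / current_to_norm) ** (1.0 / years)
    return (1.0 + growth) * reversion - 1.0 + dividend_yield

# At roughly 2.3 times the historical valuation norm, the estimate lands
# near 1% annually over 12 years, and full reversion to the norm alone
# (1 - 1/2.3) would imply an interim loss on the order of 55%.
print(f"{estimated_annual_return(2.3):.1%}")
```

[Note that the near-zero return estimate and the 50-60% interim-loss estimate are two views of the same assumption: prices falling back to the valuation norm.]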

Similarly, as I wrote at the March 2000 bubble peak:

“Investors have turned the market into a carnival, where everybody ‘knows’ that the new rides are the good rides, and the old rides just don’t work. Where the carnival barkers seem to hand out free money for just showing up. Unfortunately, this business is not that kind - it has always been true that in every pyramid, in every easy-money sure-thing, the first ones to get out are the only ones to get out... One of the things that you may have noticed is that our downside targets for the market don’t simply slide up in parallel with the market. Most analysts have an ingrained ‘15% correction’ mentality, such that no matter how high prices advance, the probable maximum downside risk is just 15% or so (and that would be considered bad). Factually speaking, however, that’s not the way it works... The inconvenient fact is that valuation ultimately matters. That has led to the rather peculiar risk projections that have appeared in this letter in recent months. Trend uniformity helps to postpone that reality, but in the end, there it is... Over time, price/revenue ratios come back into line. Currently, that would require an 83% plunge in tech stocks (recall the 1969-70 tech massacre). The plunge may be muted to about 65% given several years of revenue growth. If you understand values and market history, you know we’re not joking.”

As it happened, the S&P 500 lost half of its value by the October 2002 low, while the tech-heavy Nasdaq 100 Index lost an oddly precise 83% of its value.

With regard to the advancing half-cycle since 2009, I can be reasonably criticized for my insistence on stress-testing our methods in response to the global financial crisis (which we anticipated, but that also produced outcomes that were “out of sample” from a post-war perspective). My well-intended fiduciary inclination inadvertently shot us in the foot, because the resulting approach to classifying market return/risk profiles embedded a regularity of both Depression-era and post-war market cycles that, in this cycle, was disrupted by zero-interest rate policy. Our mid-2014 adaptations resolved that issue. Though I’m convinced that our methods have ultimately come out stronger, the criticism is legitimate, as is criticism about the time it took to disentangle and address the underlying issue. That said, investors are entirely misguided if they believe that those challenges in this cycle give them a “free pass” to ignore obscene valuations. If investors rule out the potential for the S&P 500 to lose 50-60% of its value over the completion of this cycle, they’re actually ruling out an outcome that would be wholly run-of-the-mill from a historical perspective, given current valuation extremes. They’re also ignoring that my previous risk estimates in prior cycles were devastatingly correct.

Disciplined investing isn’t easy (and whenever it seems like it is, you’re about to learn a costly lesson). The market has been in a top formation of more than two years, with internals lagging the major indices, with investors chasing high-beta stocks (those with amplified sensitivity to market fluctuations), and with rather shallow corrections from a full-cycle perspective. All of that has been a headwind for value-conscious hedged-equity strategies, but it won’t prevent the completion of this market cycle. Indeed, our impression is that the recent swing by investors from active to passive investment strategies represents nothing but performance-chasing, at a point where valuations imply historically low prospective 12-year returns for a conventional portfolio mix. If history is a guide, nobody will remember the patience, discipline, and tolerance for frustration that were required to avoid or to benefit from the 50-60% market loss that we estimate over the completion of this cycle. A focus on market internals may help, but even the less-extended 2000 and 2007 peaks were frustrating for us. As John Kenneth Galbraith wrote decades ago about the Great Crash, “Only a durable sense of doom could survive such discouragement.” Meanwhile, distinguish full-cycle outcomes from immediate outcomes. They can often be two quite different things. (...)

Emphatically, our pointed concerns about market risk would quickly ease to a neutral outlook if our measures of market internals were to become favorable. Again, the reason is that overvaluation typically gives way to sharp market losses only during segments of the market cycle when investors have subtly shifted toward risk-aversion. Since risk-seeking speculators tend to be indiscriminate about that speculation, the best measure we’ve found to infer those risk-seeking or risk-averse preferences is the uniformity or divergence of market action across a broad range of internals.

So while our long-term (10-12 year) expectations for S&P 500 returns remain near zero, and we now expect a 50-60% market retreat over the completion of this cycle (an outcome that would be only run-of-the-mill from present valuation extremes), our expectations about more immediate market outcomes will remain heavily driven by the quality of market action we observe at each point in time. Presently, those measures are hostile, which is why we’ve got red flags waving, but a shift toward favorable uniformity across our measures of market internals would defer those concerns. Put simply, market conditions don’t forecast or require a near-term market collapse. Rather, they are currently permissive of a market collapse, and on average, market returns under such conditions have historically been quite negative until those conditions have cleared.

by John P. Hussman, Ph.D., Hussman Funds |  Read more:
Image: Bill Watterson

$4,000 for Slicing a Leg of Spanish Cured Ham

[ed. Follow your dream.]

Florencio Sanchidrián has been slicing Iberian ham (jamon) for the last three decades, and today his name is synonymous with the Spanish delicacy. The 55-year-old is regarded as the best ham slicer in the world, and he charges accordingly for his services – a reported $4,000 to slice a leg of ham.

Born in the city of Avila, Spain, Sanchidrián trained as a professional bullfighter in his youth, but eventually put his red cape away and moved to Barcelona to work as a waiter. One day, he started cutting ham and simply fell in love with it. He started taking jamon slicing courses, and before long, he was winning slicing competitions as well as national and international awards. Florencio is now known as an ambassador of Iberian ham around the world, and he tours the five continents “with a leg of ham under his arm” at least once or twice a year.

Floren, as he likes to be called, has sliced ham for a number of celebrities, including President Barack Obama, Robert De Niro and David Beckham, and for His Majesty King Juan Carlos of Spain. He has performed his jamon-slicing art at the Oscars, Hollywood private parties and at casinos in Las Vegas and Macau. Throughout the year, he follows the Formula 1 circuit, cutting ham for VIPs in the paddocks and lounges of the top racing teams.

Slicing machines are apparently out of the question, as far as jamon enthusiasts are concerned, as heat generated by the friction can alter the taste of the ham and melt the fat, thus ruining the whole experience. But while professional ham slicers are present at any decent cocktail party or event in Spain, they usually make around $250 per ham leg. That’s not nearly enough for them to make a living, which is why most of them have multiple jobs. Florencio Sanchidrián, on the other hand, charges around $4,000 for cutting a leg of ham, a process that takes him around an hour and a half to complete.

He considers jamon slicing an art form – part cutting skill, part storytelling. While he masterfully cuts slices of ham thin enough to see through, he entertains his audience by giving them information about the pig’s breeding, the history of Iberian ham and the type of jamon they are about to enjoy. It’s an artistic performance, and people are crazy about it. At least crazy enough to pay him $4,000 for it.

by Spooky, Oddity Central |  Read more:
Image: Escuela Superior de Corte de Jamon
h/t Marginal Revolution

Ernst Maass
At Night, in the Silence of the Bedroom (Nachts, in der Schlafstille), 1944
via:

Celebrity - the Smiling Face of the Corporate Machine

Now that a reality TV star is preparing to become president of the United States, can we agree that celebrity culture is more than just harmless fun – that it might, in fact, be an essential component of the systems that govern our lives?

The rise of celebrity culture did not happen by itself. It has long been cultivated by advertisers, marketers and the media. And it has a function. The more distant and impersonal corporations become, the more they rely on other people’s faces to connect them to their customers.

Corporation means body; capital means head. But corporate capital has neither head nor body. It is hard for people to attach themselves to a homogenised franchise owned by a hedge fund whose corporate identity consists of a filing cabinet in Panama City. So the machine needs a mask. It must wear the face of someone we see as often as we see our next-door neighbours. It is pointless to ask what Kim Kardashian does to earn her living: her role is to exist in our minds. By playing our virtual neighbour, she induces a click of recognition on behalf of whatever grey monolith sits behind her this week.

An obsession with celebrity does not lie quietly beside the other things we value; it takes their place. A study published in the journal Cyberpsychology reveals that an extraordinary shift appears to have taken place between 1997 and 2007 in the US. In 1997, the dominant values (as judged by an adult audience) expressed by the shows most popular among nine- to 11-year-olds were community feeling, followed by benevolence. Fame came 15th out of the 16 values tested. By 2007, when shows such as Hannah Montana prevailed, fame came first, followed by achievement, image, popularity and financial success. Community feeling had fallen to 11th, benevolence to 12th.

A paper in the International Journal of Cultural Studies found that, among the people it surveyed in the UK, those who follow celebrity gossip most closely are three times less likely than people interested in other forms of news to be involved in local organisations, and half as likely to volunteer. Virtual neighbours replace real ones.

The blander and more homogenised the product, the more distinctive the mask it needs to wear. This is why Iggy Pop was used to promote motor insurance and Benicio del Toro is used to sell Heineken. The role of such people is to suggest that there is something more exciting behind the logo than office blocks and spreadsheets. They transfer their edginess to the company they represent. As soon as they take the cheque that buys their identity, they become as processed and meaningless as the item they are promoting. (...)

You don’t have to read or watch many interviews to see that the principal qualities now sought in a celebrity are vapidity, vacuity and physical beauty. They can be used as a blank screen on to which anything can be projected. With a few exceptions, those who have least to say are granted the greatest number of platforms on which to say it.

This helps to explain the mass delusion among young people that they have a reasonable chance of becoming famous. A survey of 16-year-olds in the UK revealed that 54% of them intend to become celebrities. (...)

Celebrity has a second major role: as a weapon of mass distraction. The survey published in the IJCS I mentioned earlier also reveals that people who are the most interested in celebrity are the least engaged in politics, the least likely to protest and the least likely to vote. This appears to shatter the media’s frequent, self-justifying claim that celebrities connect us to public life.

The survey found that people fixated by celebrity watch the news on average as much as others do, but they appear to exist in a state of permanent diversion. If you want people to remain quiescent and unengaged, show them the faces of Taylor Swift, Shia LaBeouf and Cara Delevingne several times a day.

by George Monbiot, The Guardian |  Read more:
Image: Eduardo Munoz/Reuters

$16,255 Dinner Bill Hints at NFL Hazing Culture

Here’s what passes for a good time in the NFL. A group of veteran players gather up an unsuspecting rookie and take him out to dinner. Over the next few hours they order heaping slabs of steak and bottle upon bottle of wine from the restaurant’s secret storage chest. After, they wash it all down with a couple bottles of Dom Perignon.

When the bill arrives they propose a game. How about a little credit card roulette? Everyone throws their card in a hat and the waiter or waitress is asked to close their eyes then pull out a card. This is “the winner,” the one stuck with the check. Of course the game is rigged to be sure the rookie’s card will be selected. The bill makes his heart sink.

Maybe it’s $15,000.

Or $17,000.

Or $22,599.

Everybody laughs. Ha. Ha. Can you believe it? Oh the poor sap. There you go kid! Welcome to the league!

On Monday, that rookie was Houston Texans safety KJ Dillon, who was handed a $16,255.20 bill for dinner with teammates. Included was $349.65 in sea bass and a whopping $7,770 of Hennessy Paradis Impérial. Not included was the gratuity, which should have been about $3,200 if Dillon was being only mildly generous. Buried deep in the check was a $12.95 Caesar salad that was apparently all Dillon ate, according to his Twitter feed, where he posted a photograph of the bill and informed his followers that he didn’t partake of the $7,770 of Hennessy since he doesn’t even drink.

The whole thing made for a good joke around the Internet. Hey, look at the silly rookie! Until former NFL punter Adam Podlesh tweeted a public note to Dillon that read: “For those who don’t think the NFL player bankruptcy epidemic has anything to do with veterans passing down the culture…Exhibit A. [Dillon] is a rookie on IR with a split in his contract. After tip he spent almost 7% of his post-tax paragraph 5 salary this year ... that is the same relative spending as a $50k a year new employee spending almost $3000 on his co-workers.”

Leave it to a punter to shame the meat-and-potato behemoths in the locker room. At some point athletes have to understand there is little redeeming social value in burying a supposedly beloved and trusted teammate with a staggering dinner bill. The legacy of professional athletes squandering their money is extreme. Ten thousand here. Twenty thousand there. Suddenly the whole pile is gone. Everybody shakes their head and mutters about another dumb athlete who couldn’t take care of what he earned.

by Les Carpenter, The Guardian |  Read more:
Image: K.J. Dillon

Tuesday, December 20, 2016

Bud-Sex

A lot of men have sex with other men but don’t identify as gay or bisexual. A subset of these men who have sex with men, or MSM, live lives that are, in all respects other than their occasional homosexual encounters, quite straight and traditionally masculine — they have wives and families, they embrace various masculine norms, and so on. They are able to, in effect, compartmentalize an aspect of their sex lives in a way that prevents it from blurring into or complicating their more public identities. Sociologists are quite interested in this phenomenon because it can tell us a lot about how humans interpret thorny questions of identity and sexual desire and cultural expectations.

Last year, NYU Press published the fascinating book Not Gay: Sex Between Straight White Men by the University of California, Riverside, gender and sexuality professor Jane Ward. In it, Ward explored various subcultures in which what could be called “straight homosexual sex” abounds — not just in the ones you’d expect, like the military and fraternities, but also biker gangs and conservative suburban neighborhoods — to better understand how the participants in these encounters experienced and explained their attractions, identities, and rendezvous. But not all straight MSM have gotten the same level of research attention. One relatively neglected such group, argues the University of Oregon sociology doctoral student Tony Silva in a new paper in Gender & Society, is rural, white, straight men (well, neglected if you set aside Brokeback Mountain).

Silva sought to find out more about these men, so he recruited 19 from men-for-men casual-encounters boards on Craigslist and interviewed them, for about an hour and a half each, about their sexual habits, lives, and senses of identity. All were from rural areas of Missouri, Illinois, Oregon, Washington, or Idaho, places known for their “social conservatism and predominant white populations.” The sample skewed a bit on the older side, with 14 of the 19 men in their 50s or older, and most identified as exclusively or mostly straight, with a few responses along the lines of “Straight but bi, but more straight.”

Since this is a qualitative rather than a quantitative study, it’s important to recognize that the particular men recruited by Silva weren’t necessarily representative of, well, anything. These were just the guys who agreed to participate in an academic’s research project after they saw an ad for it on Craigslist. But the point of Silva’s project was less to draw any sweeping conclusions about either this subset of straight MSM, or the population as a whole, than to listen to their stories and compare them to the narratives uncovered by Ward and various other researchers.

Specifically, Silva was trying to understand better the interplay between “normative rural masculinity” — the set of mores and norms that defines what it means to be a rural man — and these men’s sexual encounters. In doing so, he introduces a really interesting and catchy concept, “bud-sex”:
Ward (2015) examines dudesex, a type of male–male sex that white, masculine, straight men in urban or military contexts frame as a way to bond and build masculinity with other, similar “bros.” Carrillo and Hoffman (2016) refer to their primarily urban participants as heteroflexible, given that they were exclusively or primarily attracted to women. While the participants in this study share overlap with those groups, they also frame their same-sex sex in subtly different ways: not as an opportunity to bond with urban “bros,” and only sometimes—but not always—as a novel sexual pursuit, given that they had sexual attractions all across the spectrum. Instead, as Silva (forthcoming) explores, the participants reinforced their straightness through unconventional interpretations of same-sex sex: as “helpin’ a buddy out,” relieving “urges,” acting on sexual desires for men without sexual attractions to them, relieving general sexual needs, and/or a way to act on sexual attractions. “Bud-sex” captures these interpretations, as well as how the participants had sex and with whom they partnered. The specific type of sex the participants had with other men—bud-sex—cemented their rural masculinity and heterosexuality, and distinguishes them from other MSM.
This idea of homosexual sex cementing heterosexuality and traditional, rural masculinity certainly feels counterintuitive, but it clicks a little once you read some of the specific findings from Silva’s interviews. The most important thing to keep in mind here is that rural masculinity is “[c]entral to the men’s self-understanding.” Quoting another researcher, Silva notes that it guides their “thoughts, tastes, and practices. It provides them with their fundamental sense of self; it structures how they understand the world around them; and it influences how they codify sameness and difference.” As with just about all straight MSM, there’s a tension at work: How can these men do what they’re doing without it threatening parts of their identity that feel vital to who they are?

In some of the subcultures Ward studied, straight MSM were able to reinterpret homosexual sex as actually strengthening their heterosexual identities. So it was with Silva’s subjects as well — they found ways to cast their homosexual liaisons as reaffirming their rural masculinity. One way they did so was by seeking out partners who were similar to them. “This is a key element of bud-sex,” writes Silva. “Partnering with other men similarly privileged on several intersecting axes—gender, race, and sexual identity—allowed the participants to normalize and authenticate their sexual experiences as normatively masculine.” In other words: If you, a straight guy from the country, once in a while have sex with other straight guys from the country, it doesn’t threaten your straight, rural identity as much as it would if instead you, for example, traveled to the nearest major metro area and tried to pick up dudes at a gay bar. You’re not the sort of man who would go to a gay bar — you’re not gay!

It’s difficult here not to slip into the old middle-school joke of “It’s not gay if …” — “It’s not gay” if your eyes are closed, or the lights are off, or you’re best friends — but that’s actually what the men in Silva’s study did, in a sense...

by Jesse Singal, Science of Us |  Read more:
Image: Hero Images Inc./Getty Images/Hero Images

Monday, December 19, 2016

Moderation As a Virtue

[ed. This includes moderation in everything, not just politics: drinking, eating, exercise, pornography, internet, social networking, drugs, work, Netflix... everything. As my doctor says, "moderation in everything, including moderation" (I love that guy). Too often people seem to want to embrace a binary perspective these days... black or white, good or bad, regular or organic.]

The Trump era will be unpredictable in many ways. But there’s one thing that we can reasonably count on. Moderation, an ancient virtue, will be viewed with contempt. After all, the most temperamentally immoderate major party nominee in American history ran for president and won because of it. Victory spawns imitation, and the Trump template is likely to influence our politics for some time to come.

Moderation, then, is out of step with the times, which are characterized by populist anger and widespread anxiety, by cross-partisan animosity and dogmatic certainty. Those with whom we have political disagreements are not only wrong; they are often judged to be evil and irredeemable.

In such a poisonous political culture, when moderation is precisely the treatment we need to cleanse America’s civic toxins, it invariably becomes synonymous with weakness, lack of conviction and timidity. For many, moderation is what the French existentialist Jean-Paul Sartre called a “tender souls philosophy.”

This is quite a serious problem, as Aurelian Craiutu argues in his superb and timely new book, “Faces of Moderation: The Art of Balance in an Age of Extremes,” in which he profiles several prominent 20th-century thinkers, including Raymond Aron, Isaiah Berlin and Michael Oakeshott. Mr. Craiutu, a professor of political science at Indiana University, argues that the success of representative government and its institutions depends on moderation because these cannot properly function without compromise, which is the governing manifestation of moderation.

The case for political moderation requires untangling some misconceptions.

Moderation does not mean truth is always found equidistant between two extreme positions, nor does it mean that bold and at times even radical steps are not necessary to advance moral ends. Moderation takes into account what is needed at any given moment; it allows circumstances to determine action in the way that weather patterns dictate which route a ship will follow.

But there are general characteristics we associate with moderation, including prudence, the humility to recognize limits (including our own), the willingness to balance competing principles and an aversion to fanaticism. Moderation accepts the complexity of life in this world and distrusts utopian visions and simple solutions. The way to think about moderation is as a disposition, not as an ideology. Its antithesis is not conviction but intemperance.

Moderates “do not see the world in Manichaean terms that divide it into forces of good (or light) and agents of evil (or darkness),” according to Professor Craiutu. “They refuse the posture of prophets, champion sobriety in political thinking and action, and endorse an ethics of responsibility as opposed to an ethics of absolute ends.” This allows authentic moderates to remain open to facts that challenge their assumptions and makes them more likely to engage in debate free of invective. The survival of a functioning parliamentary system, Sir William Harcourt said, depends on “constant dining with the opposition.”

The charge that moderates lack courage is easily put to rest by people like the French journalist and philosopher Raymond Aron. He was a man of deep, reasoned convictions who possessed a sense of proportion. A nonconformist, Aron was fearless in taking on the leading intellectuals of his time, including his friend Sartre. (Parisian students in 1968 avowed that it was “better to be wrong with Sartre than right with Aron.”) Aron strongly defended liberal democracy when it was fashionable to denigrate it. Playing off the Marxist claim that religion was the opium of the masses, Aron argued that Marxism was the opium of the intellectuals.

For Aron, political moderation was a fighting creed. Allergic to ideological thinking, he conformed his views to evidence. He retained his intellectual and political independence throughout his life. Aron believed that history teaches us humility, modesty and the limits of our knowledge. He was also skilled at the art of dialogue, engaging those he disagreed with critically but civilly. “As the last great representative of a distinguished tradition of European liberalism,” Professor Craiutu writes, “Aron attempted to disintoxicate minds and calm fanaticism in dark times.” Aron put it this way: “Freedom flourishes in temperate zones; it does not survive the burning faith of prophets and crowds.”

by Peter Wehner, NY Times |  Read more:
Image: via:

The World According to Stanislaw Lem

[ed. I read Lem when I was fresh out of college working in a bookstore (His Master's Voice). Back then employees would trade secret recommendations on favorite books, which is how I heard about him (now you can find those same recommendations on the shelves of any bookstore, titled "Employee Favorites", which seems kind of sad for some reason). Anyway, I thought he was a genius, along the lines of David Foster Wallace. Glad to see him finally getting some recognition. I'll have to revisit him again soon.]

There's a paradox at the heart of science fiction. The most basic aspiration of the genre — its very essence, really — is to transcend time and place. Not just to predict the future, but to imagine things that are totally foreign to human experience. How would an alien life form have evolved, compared with those on Earth? What will human society look like 10,000 years from now? What is artificial intelligence, anyway? SF tries to imagine the unimaginable, to comprehend the incomprehensible, to describe the indescribable, and to do it all in entertaining, accessible prose.

But SF, like everything else, is also a product of its time. Jules Verne’s tales of trips around the globe and voyages to the center of the Earth reflected the scientific optimism of the late 19th century, before World War I blew open technology’s dark side. During its midcentury golden age in the United States, the pulpy genre cheered on the rising economic and military dominance of the United States, forecasting an American empire that stretched to the stars. Not long after, New Wave authors like Philip K. Dick, Samuel R. Delany, and Ursula K. Le Guin wrestled with the social and political upheavals of the 1960s and ’70s, from Cold War paranoia to the Civil Rights Movement, second-wave feminism, and the drug culture. What kind of stories the Trump era might inspire is still unknown, but they probably won’t be cheerful.

Stanisław Lem, the Polish novelist, futurologist, literary theorist, satirist, and philosophical gadfly, tried mightily to free his work from the shackles of the present. In dozens of novels, short stories, essays, metaliterary experiments, and futurological treatises, he attempted to imagine everything from a living ocean that could read human minds (Solaris) to a swarm of nonbiological mechanical insects (The Invincible) to a supercomputer many times more intelligent than its human creators (Golem XIV). In his 1964 book Summa Technologiae, Lem mocked writers whose works were merely historical fiction recast in the future — “corsairs and pirates of the thirtieth century.” It’s easy to find targets for Lem’s criticism; most SF movies are exercises in wish fulfillment, projections of a space-age Columbus in search of a final frontier. For Lem, science fiction meant thinking harder and imagining more.

But even Lem could not transcend his own history. Born in 1921 in Lviv (then called Lwów as part of the Second Polish Republic), he survived World War II, served in the Polish resistance, and lived for most of his life under Polish Communism. In his work, he turned repeatedly to themes reflecting those experiences, including the role of chance in determining fate, the oppressive bureaucracy of authoritarian regimes, and the possibility of a runaway arms race that escapes human control. Ironically, Lem’s effort to think outside of history often provides the best descriptions of the period he lived through.

Lem died in 2006, having lived to see many of his ideas come true. Yet today he has fallen into quasi-obscurity, at least in the English-speaking world. Not even in his heyday did he have the cachet in the United States of writers like Isaac Asimov or Robert A. Heinlein. But Lem was phenomenally popular in Eastern and Central Europe. According to a recent estimate, his books have been translated into more than 40 languages and have sold almost 40 million copies, and he was repeatedly nominated for the Nobel Prize. By all measures he was one of the most successful writers of the 20th century. (...)

Compared with that of most science fiction writers, Lem’s thinking was both disinterested and far-reaching. In works like the nonfictional Summa Technologiae, he explored the possibilities of artificial intelligence, virtual reality, and genetic engineering, comparing technological advancement to biological evolution. Just as evolution had no moral agenda, he argued, technological developments were neither inherently good nor bad, but followed their own internal logic. Unlike most would-be prophets, who predict the future with warnings of dystopia or promises of a better tomorrow, Lem approached the subject without a moralizing tone.

But in his fiction Lem did explore the pitfalls that the future might hold. His experience living under German occupation impressed on him the role of chance in life and the ease with which that life could be snuffed out. The absurdities of authoritarian communism and the perils of the Cold War further illustrated the danger humanity posed to itself. Worst of all, the construction of oppressive ideological systems seemed to occur through processes that their participants were unable to prevent, or even fully understand. (...)

These ideas evoke comparisons to Orwell, and to the British novelist’s famous depiction of Stalinism in 1984 (1949). But in his letters to Kandel, Lem claimed that Orwell had gotten Stalinism wrong. Whereas Orwell described his dystopian regime as “a boot stamping on a human face — forever,” Lem argued that communist oppression was not a sadistic evil pursued for its own sake but a natural result of turning state ideology into dogma. Similarly, Lem critiqued Hannah Arendt’s analysis of totalitarianism, writing that “she made out these systems to be fruit of strictly intentional evil.” Rather, he writes, “Stalin’s times concocted a myth, never concretely or cogently expressed, of the state as a machine that was not only perfect, but also omniscient and omnipotent.” For Lem, the tragic consequences weren’t the result of premeditated cruelty, but the logical outcome of turning politics into faith.

Lem may have been critical of the Soviet Union, but that didn’t mean he had a positive view of the West. “Say, one country permits eating little children right before the eyes of crazed mothers,” he wrote to Kandel in 1977, “and another permits eating absolutely anything, whereupon it turns out that the majority of people in that country eat shit. So what does the fact that most people eat shit demonstrate […] ?” In other words, just because life behind the Iron Curtain was bad, that didn’t make the United States good. For Lem the world wasn’t divided between good and evil, but between bad and even worse.

Starting in the late 1960s, Lem turned away from conventional SF in favor of experimental works of literary and cultural criticism. These included books like The Philosophy of Chance (1968), in which he attempted to produce an empirical form of literary theory, and the Borgesian A Perfect Vacuum (1971) and One Human Minute (1986), in which he reviewed nonexistent books. While Lem’s literary experiments displayed a playful dexterity, his cultural criticisms were often clichéd, focusing on the West’s supposed vulgarity, tastelessness, and excess. In a 1992 interview with Swirski, he commented on the exploding number of TV channels, calling them “simply appalling. It is like having two thousand shirts or pairs of shoes.” While Lem’s main argument was about the unmanageable explosion of media, neither TV channels nor an excessive wardrobe seems like humanity’s greatest crime.

If Lem didn’t think much of American popular culture, neither did he have much esteem for its literature. (...)

Lem’s criticisms may have been curmudgeonly and, as Swirski suggests, rooted in his frustrated desire for greater American recognition. But to Lem the country also represented dangers that we are only now beginning to appreciate. He foresaw dystopia not only in resource-starved wastelands, but also in technological prisons of pleasure and excess. “The idea would be to expand the gamut of pleasurable sensations to the maximum, and perhaps even to bring into being […] other, as yet unknown, kinds of sensual stimulation and gratification,” he wrote in His Master’s Voice. This possibility became the premise for The Futurological Congress, in which humanity becomes trapped in a pharmacologically induced paradise, unaware of its own looming extinction.

Most presciently, Lem understood that even mundane varieties of information could be disastrous in overwhelming quantities. What happens, he asked in His Master’s Voice, when “the technologies of information have led to a situation in which one can receive best the message of him who shouts the loudest, even when the most falsely?” Or, as he wrote in the same novel, “freedom of expression sometimes presents a greater threat to an idea, because forbidden thoughts may circulate in secret, but what can be done when an important fact is lost in a flood of impostors […] ?” Facebook and the deluge of fake news sites didn’t exist when Lem wrote this, but their creation wouldn’t have surprised him. The future of the United States, he wrote to Kandel, is “dark, most likely.” (...)

Lem considered any effort to make accurate predictions a fool’s errand — “Nothing ages as fast as the future,” he once wrote — but he did try to think rigorously about the paths our civilization might take. At first technology is applied toward our environment, he argued, as we enter the Anthropocene era on Earth. But eventually it is turned toward the human organism itself, leading to a stage of existence that is as yet unpredictable. “Man remains the last relic of Nature, the last ‘authentic product of Nature’ for an indefinite period of time,” he writes. But “the invasion of technology created by man into his body is inevitable.”

Most importantly, Lem viewed biological evolution and technological development as part of the same process. Following Norbert Wiener’s formulation that there exist in the universe “islands of locally decreasing entropy” — that is, areas of space-time that tend naturally toward greater complexity and organization — Lem posited that evolution was not just a biological process guiding life on Earth but a phenomenon that could include any form of matter or energy. While these islands might sometimes result in biological life, they might also result in other kinds of complex systems, including our own creations. “Who causes whom?” he asked in Summa Technologiae. “Does technology cause us, or do we cause it?” Or, as he put it more pointedly in His Master’s Voice, “The roles are now reversed: humanity becomes, for technology, a means, an instrument for achieving a goal unknown and unknowable.”

by Ezra Glinter, LARB |  Read more:
Image: Goodreads

Roger Miller


Two broken hearts lonely, looking like
Houses where nobody lives
Two people each having so much pride inside
Neither side forgives

The angry words spoken in haste
Such a waste of two lives
It's my belief
Pride is the chief cause in the decline
In the number of husbands and wives

A woman and a man, a man and a woman
Some can and some can't
And some can

Two broken hearts lonely, looking like
Houses where nobody lives
Two people each having so much pride inside
Neither side forgives

Lyrics: via:

Turning Your Vacation Photos Into Works of Art

It’s the season for family travel and photos — and perhaps enlarging some of those images of snowy landscapes or tropical getaways to decorate your home.

There are, of course, the usual print services and methods. You can choose a glossy or matte finish, print a photo on canvas, or make it into a poster with a few clicks online at photo sites like Snapfish and Shutterfly, professional photo shops like Adorama and Mpix, or drugstores and big-box chains like Walgreens and Costco. But the web is also home to many lesser-known printing services, as well as uncommon surfaces on which to enlarge photos for display, be it burlap, wood boards, acrylic or peel-and-stick fabric. Why not try some fresh sites and methods?

I recently sent some ho-hum-quality iPhone vacation photos to a handful of companies that I’d never used before and had them enlarged to various sizes and printed on different surfaces. I’ve also offered some guidance about bulk digitizing those boxes of old travel photos sitting in your closet or basement so that you can begin the New Year if not with a vacation, then with a clutter-free home.

Engineer Prints

Of all the ways to turn photos into wall art, I was most interested in trying engineer prints, named for the large, lightweight prints used by architects. For less than the cost of a couple of movie tickets, you can make huge enlargements. Mind you, it’s a particular aesthetic, one that’s most likely to appeal to people who are after an industrial, shabby chic or bohemian look. The paper is thin and the lines of the images are softer than a fine art print. And engineer prints need not be formally framed. People stick them to their walls with washi tape, a crafting tape that comes in innumerable colors and prints; or they hang the prints using wood poster rails or skeleton clips. For a while, engineer prints from photos were primarily available in black and white, but now you can find them in color, too.

One of the easiest ways to order them online is through Parabo Press, which is run by Photojojo, an online photography gear shop, and Zoomin, a photo printing service in Asia. As with all printing sites, you upload your image, zoom in closer if you like, and then click to buy.

The site’s engineer prints are 4 feet by 3 feet, and cost $20 in black and white, and $25 in color. I sent out two different photos to be made in black and white, and they came out, to my surprise, beautifully. I was impressed that they were able to be enlarged to such a degree and not look blurry. And the paper (while so thin I was worried about accidentally tearing it) lends the prints an artful, careless look, a departure from the expected framed print over the couch.

Parabo Press is a breeze to use: It’s clean and easy to read, your options are straightforward, and there are no annoying upsells. The site also offers prints on metal, glass, newsprint and Zines (handmade magazines); calendars; photo books; and prints from its Risograph machine, which uses soy-based ink and is described by Parabo as having “a cult following since its invention in 1980s Japan.”

Fabric Prints

A fabric print — not soft like a bedsheet, more like a place mat made of matte woven fabric — is another departure from a traditional photo enlargement. Order one from a site such as SnapBox and instead of framing it, you can peel and stick it on your wall. The site’s fabric posters adhere to (and can be peeled off) smooth surfaces such as untextured walls, glass, ceilings, tile and finished wood surfaces (avoid surfaces like stucco, concrete blocks, brick, unfinished wood, canvas or freshly painted walls). SnapBox offers fabric posters in more than a dozen sizes, from 4-by-4 inches to 36-by-54 inches, for less than $2 to about $80.

I ordered a 24-by-36-inch fabric poster for $34.99, a discounted price thanks to a holiday coupon — not cheap (you can buy fine art prints on other sites for less), but you’re printing on special material. Regardless of the cost, I expected the finished product to look like the sort of cheap thing one might see in a dorm room (it sticks to walls, after all), but I was pleasantly surprised. The fabric was durable and the details in the photo — crevasses in a glacier; onlookers on a bridge — were nicely defined.

SnapBox is a user-friendly site with clear instructions and pricing. In addition to fabric posters, it also offers fine art prints, photo books and prints on canvas and pillows.

by Stephanie Rosenbloom, NY Times |  Read more:
Image: Stephanie Rosenbloom