Thursday, February 25, 2021

A Digital Catch-22

Today’s organisations are facing a digital catch-22. On the one hand, digital transformation is difficult and costly, and short-term investment may be needed elsewhere, where it’s really hurting. On the other hand, today’s organisations cannot afford not to become tomorrow’s digital businesses. In this article I will point up the dimensions and intractability of this digital catch-22, before suggesting some ways forward.

But not so fast; firstly, what is digital transformation? Digital transformation requires digitisation – converting something non-digital (e.g., a health record, an identity card) into a digital format that can then be used by a computer system. Digital transformation also requires digitalisation – enabling, improving, or changing business operations, functions, or activities by utilising digital technologies and using digitised data to create management intelligence and actionable knowledge. All three—digitisation, digitalisation, and digital transformation—are needed to build a digital business, but digitisation and digitalisation alone are necessary yet insufficient. My academic colleague George Westerman put it rather well: when digital transformation is done right, it’s like a caterpillar turning into a butterfly, but when done wrong (or, we might add, incompletely), all you get is a really fast caterpillar. Digital transformation must focus on the whole organisation and on large-scale change. It involves the radical redesign, and then deployment, of business models, activities, processes, and competencies to fully leverage the opportunities provided by digital technologies. I would guess you already have some idea of why it is so difficult.

Let’s find some evidence for the high level of challenge. It is notable, firstly, that organisations have been surprisingly slow to embark on digital transformation, given that this has been on many executive agendas since at least 2010. Many organisations digitise, digitalise even, but this does not add up to digital transformation, though many might think it does. The reasons for the lack of speed are complex, but failure is five times more likely than success. The high failure rate is indicative of the large number of stumbling blocks and can be very discouraging for others. Our work suggests that slow progress reflects how ‘siloed’ many organisations have become. What we call the ‘seven-siloed organisation’ points to the barriers to change inherited from older business models. The siloes include processes, technology, data, culture, structures, skill sets, and managerial mindsets. When it comes to digital transformation, any organisation with all these siloes is severely hamstrung from the start.

Yet there is another side to the digital catch-22. What happens if, putting it colloquially, you fail to sail? There are relatively few best performers on digital transformation. These are reaping disproportionate gains, recording markedly higher profitability and revenues, accelerating away from the others, and may well establish a competitive advantage that becomes irreversible. What are they achieving? According to one study, they had increased the agility of their digital-strategy practices, enabling first-mover opportunities. They had taken advantage of digital platforms to access broader ecosystems and to innovate new digital products and business models. They had used mergers and acquisitions to build new digital capabilities and digital businesses. A significant feature was that they had invested ahead of their competitors in digital talent.

Our own work (here and here) suggests that the best performers on digital transformation amount to only around 20% of organisations, all recording up to a 30% increase in revenues partly as an outcome of their digital technology investments over the previous four years. They come from most sectors and regions of the world and are not limited to the obvious hi-tech US and Chinese firms. To add even more urgency, our evidence, consistent with other studies, shows that being slow to adopt digital technologies may reduce risk in the short term, in terms of cost, talent, and time, but builds growing business risk and reduced competitiveness in the long term. And this trend will be repeated and magnified by the growing adoption of automation and ‘AI’ over the next five years.

So, there is plenty of evidence for a digital catch-22, but is there an unlikely saviour here in the form of the pandemic and economic crisis? This has undoubtedly accelerated corporate moves toward digitisation and digitalisation – primarily to survive in the short term, establish resilience, and maintain competitiveness. But we found motives mixed, capabilities variable, and planning horizons mostly short term. That said, a McKinsey survey suggests that COVID-19 has pushed companies over a technology tipping point. Between January and October 2020, the digitisation of customer and supply-chain interactions and of internal operations accelerated by three to four years. The share of digital or digitally enabled products in corporate portfolios accelerated by seven years. Nearly all respondents had quickly put in place at least temporary solutions to meet many of the new demands on them. Funding for digital initiatives increased more than funding for anything else. Moreover, the largest shifts in the crisis were also the ones most likely to stick – think changing customer needs and expectations, more remote working/collaboration, cloud migration, customer demand for online products and services, and increasing spend on security. Those who had invested heavily in digital technologies over the previous three years also reported a range of facilitating technology-related capabilities that others lacked in the crisis, leaving them far better prepared for it.

Did COVID-19, then, make digital transformation easier? Well, the evidence is that the digital catch-22 has not gone away. Digital technologies are gaining a higher profile amongst the executives who make the key decisions, but the difficulties and complexities of large-scale organisational change on many fronts are not easily circumvented, and there remain many other pressing matters to deal with, distracting executive attention.

by Leslie Willcocks, LSE | Read more:
Image: uncredited

Wednesday, February 24, 2021

So Empty

"Rock and roll, which entered global consciousness in the mid-1950s, has lost the great majority of its founders, and enough time has gone by to amply confirm the thesis, which we have already considered above, that rock and roll is for the young, and aside from a few exceptions a middle-aged balding schlub on a rock-and-roll reunion tour is as painful a sight as a man in his forties who suffers from the knowledge that his life peaked in his brief season as a high-school quarterback."

So Empty (Hinternet, Substack)
Image: dead link
[ed. Beg to differ.]

Top Shot, the New Crypto Highlight Phenomenon, Explained

Take a stroll around social media and you’ll see no shortage of people talking about “NBA Top Shot,” a collectible, blockchain-based highlight repository that has been around since July 2019 but caught fire in the last week, with over $50M in revenue hauled in by people still trying to get in on the ground floor of the pseudo-cryptocurrency.

The scarcity is what is bringing people in to Top Shot, and the system is going wild. Right now the highest-priced Top Shot available for auction is a block by Zion Williamson against the Nuggets from January 2020 — with a ludicrous asking price of $250,000.

It’s addictive and exciting for those involved, and to outsiders the dumbest thing in the world. Why are people paying for trading card-esque “packs” of random highlights, clips with no material value that you can watch for free on YouTube? Is this the future of sports collectibles, or a massive grift? And will early adopters be millionaires in 10 years, or the new generation of Beanie Baby collectors?

What is NBA Top Shot?

NBA Top Shot is an online-only collection of NBA highlights which can be obtained by buying “packs” or purchased via auction. Think of it like buying sports cards, but in video form. You might crack a pack and get a highlight of a Steph Curry three-pointer, which is only being produced 99 times. When those 99 clips are gone, nobody else will ever get that same highlight, and Top Shot claims you’ll own that clip forever.

The clips are created on a blockchain, the same technology that powers Bitcoin, Ethereum, and other cryptocurrencies. I’ll spare you my own extremely poor attempt at explaining complicated blockchain technology, which you can read about in detail here, but the important part is that it’s completely encrypted, impossible to hack, and ensures that these files cannot be duplicated. So, for whatever it’s worth, when you buy an NBA Top Shot it is absolutely yours as a collectible.
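
To make the scarcity mechanic concrete, here is a minimal toy sketch in Python. To be clear, this is not Dapper Labs’ actual smart-contract code (Top Shot runs on the Flow blockchain, and every name and number below is invented for illustration); it only models the rule described above: each highlight is minted in a fixed edition, each serial number has exactly one owner, and ownership changes only by transfer.

# Toy model of Top Shot's scarcity mechanic -- illustrative only, not
# Dapper Labs' real Flow contract code.
class MomentLedger:
    def __init__(self):
        self.edition_size = {}   # moment_id -> max number of serials
        self.owner = {}          # (moment_id, serial) -> current owner

    def create_edition(self, moment_id, size):
        self.edition_size[moment_id] = size

    def mint(self, moment_id, serial, to):
        # A serial can never exceed the edition size and can never be
        # re-minted: this is the enforced scarcity the article describes.
        if not 1 <= serial <= self.edition_size.get(moment_id, 0):
            raise ValueError("edition exhausted or unknown moment")
        if (moment_id, serial) in self.owner:
            raise ValueError("this serial already exists")
        self.owner[(moment_id, serial)] = to

    def transfer(self, moment_id, serial, seller, buyer):
        if self.owner.get((moment_id, serial)) != seller:
            raise ValueError("seller does not own this moment")
        self.owner[(moment_id, serial)] = buyer

ledger = MomentLedger()
ledger.create_edition("curry-three-2021", 99)  # hypothetical 99-copy highlight
ledger.mint("curry-three-2021", 1, "alice")
ledger.transfer("curry-three-2021", 1, "alice", "bob")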

So, for the price of an entire house, you could instead buy a highlight of Zion blocking a shot that exists only on the internet and can only be bought and sold on Top Shot.

Before you say “well, people can ask whatever they want, it doesn’t mean they get it,” understand that Top Shots of Zion Williamson and LeBron James both sold last week on the site for $100,000 each.

So you can make money off NBA Top Shot?

Theoretically yes, but it’s a little more complicated than you might expect. The tech behind Top Shot is a product of Dapper Labs, a blockchain company that boasts on its own website that it “uses the power of play to deliver blockchain-based experiences that are made for you and ready for the real world.” What this means is that the NBA and a blockchain firm teamed up to recreate the sports card market in an online medium, engineering scarcity and rarity to turn these moments into a commodity.

by James Dator, SB Nation |  Read more:
Image: Top Shot
[ed. See also: What is Top Shot? (The Irrelevant Investor).]

Tuesday, February 23, 2021

Thomas Danthony
via:

Tiger Woods Car Accident Eerily Similar to Ben Hogan’s

Golf legend Ben Hogan was once also seriously injured in a violent car wreck eerily similar to Tiger Woods’ crash on Tuesday — and came back to win six more majors.

Hogan, who died in 1997 at the age of 84, nearly lost his life in February 1949 when a Greyhound bus smashed into his black Cadillac while he was driving with his wife in Texas, according to the Golf Channel.

The crash left Hogan with a broken collarbone, fractured ribs, a broken ankle, a double fracture to his pelvis, and deep contusions to his left leg.

It took nearly one hour for emergency personnel to extract him from the wreckage.

His wife, Valerie, escaped with minor injuries, in large part because the golf great shielded her with his body just moments before the impact.

Doctors at the El Paso hospital where Hogan was taken feared he wouldn’t survive his injuries — but Hogan did recover and was back on the green in under a year.

Hogan played in pain for the rest of his life but continued a stellar career that has him ranked as one of golf’s greatest players with 64 total Professional Golfers Association Tour victories, according to his PGA Tour profile.

On Tuesday, Woods suffered injuries to both legs and was rushed into surgery after his SUV rolled over in Rancho Palos Verdes in Los Angeles County.

by Jorge Fitz-Gibbon, NY Post | Read more:
Image: AP Photo/Dennis Lee Royle
[ed. Just home from a golf outing today and heard the news. I'm afraid this is it (for competitive PGA Tour golf). We may see Tiger on the Champions Tour someday, maybe not, but there's always coaching, broadcasting, course design and other pursuits he could still be successful at, too. Still, a very sad end to a phenomenal career.]


Tadeusz Baranowski

The Limbaugh Whisperer

His radio show was once a vital outlet of conservative news—and I was one of his sources. But it became increasingly divorced from reality, like much of right-wing media.
New Republic, February 18, 2021

As most of my readers know, I was a card-carrying conservative for many years. I was working in the Reagan White House when Rush Limbaugh went on the air in 1988 and remember having to go out and buy a desk radio so I could listen to him, which I did almost every day. Even then, however, I didn’t care for his callers—I thought they were ignorant, obsequious fools. But I liked Limbaugh’s monologues at the top of the hour because I learned useful stuff from him.

I know many liberals will disagree with me on this, but in 1988 there really was a liberal media. I found it very hard to get honest-to-God news that interested me as a conservative, even as a White House staffer. It had to be sought out in small-circulation magazines like Human Events and National Review, or from the very few conservative columnists in major newspapers.

I didn’t need validation of my views, as was the case with many grassroots conservatives. I wanted intellectual ammunition I could use to design and promote conservative policies in government. Contrary to popular belief, the Reagan administration took analysis and research seriously. Unlike the Trump White House, which often sent out documents with typos in them (a firing offense when I worked there), the policy development process in the Reagan White House was reasonably competent.

A key reason for making sure that there was proper analysis and documentation for administration proposals is that they would have been picked apart in the media otherwise. Not only was the American press generally skeptical of our philosophy, but it was vastly more powerful in those days and could make or break a policy proposal very easily. Frankly, I think Democrats on Capitol Hill, who controlled the House of Representatives during Reagan’s entire eight-year term, tended to outsource their criticism of Republicans to The New York Times and The Washington Post.

Beat reporters for the major newspapers were gatekeepers, refusing to even mention any proposal or idea that was insufficiently worked out, lacked empirical data or academic support, or just seemed stupid. Back when I worked for Jack Kemp, it took me years to get the Wall Street Journal tax reporter to mention Kemp’s tax cut plan—even after it had been endorsed by the Journal’s editorial page.

And in those days before the internet, politicians were very heavily dependent on the mainstream media to get their message out. About the only other way of doing so was direct mail. But printing and mailing newsletters was very expensive, and it took an enormous amount of effort to build a mailing list. Like it or not, conservatives in the pre–talk radio, pre–Fox News, pre-internet era had to work through the liberal media and play by its rules.

I should add that the rules of the once-dominant mainstream media were mostly good ones. When the established media lost its gatekeeper function, it led to a vast proliferation of crackpot ideas that circulate unimpeded today. Even members of the prestige media have found themselves unable to keep nutty conspiracy theories from affecting their reporting, as they document what is in fact motivating Republican voters and politicians. But in reporting the existence of crackpot ideas and fake news, the mainstream media implicitly validates them and publicizes them.

When Limbaugh first went on the air, he was a breath of fresh air for conservatives—even those working in the White House—and an essential source of news. As all of his listeners know, he had a vast “stack of stuff” consisting of news clippings, press releases, faxes, and whatnot that caught his eye and formed the basis for his monologues. He was as much a news consolidator and reviewer as he was a commentator in those days. And he frequently had an intelligent spin on the news, often picked up from the many politicians and policymakers he talked to off the air.

Of course, Limbaugh was also a blowhard, and his massive hubris was off-putting. But it was part of his schtick and one of the reasons he was popular. Say whatever else you like about him, but Rush was a masterful radio personality. He really understood and loved the medium. His foray into television just didn’t suit his style and was soon abandoned. (...)

Perhaps the most important long-term effect Limbaugh had on the media is that his success helped convince Australian press baron Rupert Murdoch to launch Fox News. Longtime Republican political consultant and television producer Roger Ailes drew up the plans for Fox and helped Limbaugh go national with his radio show. (For almost 20 years before meeting Ailes, Limbaugh had labored in the vineyards of small radio stations in Kansas City, Sacramento, and elsewhere.) Without Ailes’s help, Limbaugh would never have become what he was.

It’s also well known that liberal commentators have never been able to duplicate the success of Limbaugh. Even Al Franken, a skilled entertainer with deep political knowledge, failed to find an audience for a contra-Limbaugh radio show. I think the reason for this failure is simpler than it appears: Progressives already have their own talk radio network with a broad reach—National Public Radio. It’s not as ideological as conservative talk radio, of course, but NPR produces exactly what liberals want radio to do, and it does so very, very well. Moreover, I think liberals are basically content with the mainstream media: The New York Times fulfills their news needs almost perfectly. That’s why they get so upset when it strays from the liberal path by publishing conservative commentary.

In truth, the Times attracts precisely zero conservative subscribers by publishing the likes of Bret Stephens. I know this from many years in the conservative movement. I even remember the first moment when I realized how closed the conservative mind had become.

by Bruce Bartlett, Big Picture |  Read more:
Image: Jim Watson/Getty

The Elephant in the Room

The Top 1% of Americans Have Taken $50 Trillion From the Bottom 90%

Like many of the virus’s hardest hit victims, the United States went into the COVID-19 pandemic wracked by preexisting conditions. A fraying public health infrastructure, inadequate medical supplies, an employer-based health insurance system perversely unsuited to the moment—these and other afflictions are surely contributing to the death toll. But in addressing the causes and consequences of this pandemic—and its cruelly uneven impact—the elephant in the room is extreme income inequality.

How big is this elephant? A staggering $50 trillion. That is how much the upward redistribution of income has cost American workers over the past several decades.

This is not some back-of-the-napkin approximation. According to a groundbreaking new working paper by Carter C. Price and Kathryn Edwards of the RAND Corporation, had the more equitable income distributions of the three decades following World War II (1945 through 1974) merely held steady, the aggregate annual income of Americans earning below the 90th percentile would have been $2.5 trillion higher in the year 2018 alone. That is an amount equal to nearly 12 percent of GDP—enough to more than double median income—enough to pay every single working American in the bottom nine deciles an additional $1,144 a month. Every month. Every single year.
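
As a quick back-of-the-envelope check using only the figures quoted above (our arithmetic, not a number from the RAND paper), the aggregate and per-worker figures are consistent:

# How many earners does $2.5 trillion a year at $1,144/month imply?
annual_gap_usd = 2.5e12       # aggregate annual shortfall, per the article
monthly_per_worker = 1_144    # extra pay per worker per month, per the article
implied_workers = annual_gap_usd / (monthly_per_worker * 12)
print(f"{implied_workers / 1e6:.0f} million workers")   # prints: 182 million

That inferred headcount of roughly 182 million earners below the 90th percentile is our illustration of how the per-worker figure follows from the aggregate one, not a figure RAND reports.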

Price and Edwards calculate that the cumulative tab for our four-decade-long experiment in radical inequality had grown to over $47 trillion from 1975 through 2018. At a recent pace of about $2.5 trillion a year, we estimate that number crossed the $50 trillion mark by early 2020. That’s $50 trillion that would have gone into the paychecks of working Americans had inequality held constant—$50 trillion that would have built a far larger and more prosperous economy—$50 trillion that would have enabled the vast majority of Americans to enter this pandemic far more healthy, resilient, and financially secure.

As the RAND report [whose research was funded by the Fair Work Center, on whose board co-author David Rolf sits] demonstrates, a rising tide most definitely did not lift all boats. It didn’t even lift most of them, as nearly all of the benefits of growth these past 45 years were captured by those at the very top. And as the American economy grows radically unequal, it is holding back economic growth itself.

Even inequality is meted out unequally. Low-wage workers and their families, disproportionately people of color, suffer from far higher rates of asthma, hypertension, diabetes, and other COVID-19 comorbidities; yet they are also far less likely to have health insurance, and far more likely to work in “essential” industries with the highest rates of coronavirus exposure and transmission. It is no surprise then, according to the CDC, that COVID-19 inflicts “a disproportionate burden of illness and death among racial and ethnic minority groups.” But imagine how much safer, healthier, and empowered all American workers might be if that $50 trillion had been paid out in wages instead of being funneled into corporate profits and the offshore accounts of the super-rich. Imagine how much richer and more resilient the American people would be. Imagine how many more lives would have been saved had our people been more resilient. (...)

Of course, America’s chronic case of extreme inequality is old news. Many other studies have documented this trend, chronicled its impact, and analyzed its causes. But where others have painted the picture in terms of aggregate shares of GDP, productivity growth, or other cold, hard statistics, the RAND report brings the inequality price tag directly home by denominating it in dollars—not just the aggregate $50 trillion figure, but in granular demographic detail. For example, are you a typical Black man earning $35,000 a year? You are being paid at least $26,000 a year less than you would have been, had income distributions held constant. Are you a college-educated, prime-aged, full-time worker earning $72,000? Depending on the inflation index used (PCE or CPI, respectively), rising inequality is costing you between $48,000 and $63,000 a year. But whatever your race, gender, educational attainment, urbanicity, or income, the data show that if you earn below the 90th percentile, the relentlessly upward redistribution of income since 1975 is coming out of your pocket.

by Nick Hanauer and David M. Rolf, Time |  Read more:
Image: Spencer Platt—Getty Images

Sunday, February 21, 2021

Liziqi Channel

[ed. Liziqi. She can do everything (even make her own furniture). Some favorites: here and here.]

Beyond Burgers: 3D-Printed Steaks

Plant-based burgers that taste a heck of a lot like the real thing are now at your local Burger King. And you can find realistic meatless ground beef and sausages at grocery stores. As the next big thing in sustainable and cruelty-free meat, some startups are growing meat in labs from animal cells. In December, Singapore became the first country to allow sales of lab-grown chicken, from U.S. startup Eat Just.

But the founders of Barcelona-based Novameat want to take a bigger leap. They plan to go beyond chicken strips and processed “meat” to the chewy, muscle-y, juicy taste of whole meat cuts. “We want to create the Tesla Roadster or iPhone moment for the future of food,” says CEO and founder Giuseppe Scionti. “Alternative meats shouldn’t just be for the environment or animals or health, they should be superior compared to what they’re trying to compete with. The Holy Grail is pork and steak.”

The company is using 3D printing to get there. In what could be a game-changer for the alternative meat industry, they have now made the world’s largest piece of 3D-printed whole-cut meat analog. And they say their 3D-printing process is 150 times faster than their competitors’, allowing them to make 1.5 tons of meat substitute per hour.
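
Taken at face value, those two claims also pin down the competition’s output; a one-line check (our inference from the article’s numbers, not a figure from Novameat):

# If 1.5 tons/hour is 150x faster than competitors, they print ~10 kg/hour.
novameat_kg_per_hour = 1_500      # 1.5 metric tons/hour, per the article
claimed_speedup = 150             # "150 times faster", per the article
print(novameat_kg_per_hour / claimed_speedup, "kg/hour")   # 10.0 kg/hour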

Creating a sirloin steak, with its fibrous protein and marbled fat, from plant-based proteins is a tough recipe to perfect. Novameat’s microextrusion technology, which produces 100–500-micrometer-wide fibers from different ingredients and combines them in precise ratios and organized microstructures, is key to mimicking the mouthfeel, taste, appearance, and nutritional properties of animal meat, says senior food engineer Joan Solomando Martí. The three-year-old startup has been using vegetable fat and non-soy plant proteins to make realistic 3D-printed steaks.

The latest 3D-printed whole-cut prototype was made with the company’s new hybrid meat analog, which they make by adding mammalian fat cells to a biocompatible plant-based scaffold. The cells are grown separately using traditional cell culturing techniques, and then added to the scaffolds, where they produce fatty acids or proteins. “This allows us to create beef muscle cuts, pork muscle cuts, and we are now also exploring fish and seafood.”

by Prachi Patel, IEEE Spectrum | Read more:
Image: Novameat


Images: Douglas Friedman
[ed. Rich people with nice houses. More pictures at the link.]

The Problem With Influencers


The Problem With Influencers (Current Affairs)
Image: Kostsov/Shutterstock
[ed. Hey... I learned two new things today: Sadfishing and Mukbang. And the day's just getting started.]

Saturday, February 20, 2021

On Your Own

[ed. M. Emmet Walsh from the Coen Brother's Blood Simple. See also: this (CNN).]

How to Write About Iran: A Guide for Journalists, Analysts, and Policymakers

1. Always refer to Iran as the “Islamic Republic” and its government as “the regime” or, better yet, “the Mullahs.”

2. Never refer to Iran’s foreign policy. The correct terminology is its “behavior.” When U.S. officials say Iran “must change its behavior” and “behave like a normal country,” write those quotes down word for word. Everyone knows that Iran is a delinquent kid that always instigates trouble and must be disciplined.

3. Omit that Iran has a population of 80 million with half a dozen ethnicities, languages, and religions. Why complicate when you can do simple? Just write “Iranians” or “the Iranians.” They are all the same and consequently think alike – when they get to think, that is.

4. To illustrate your article, pick a photo of brown, bearded men screaming with fists punching the air. An image of brown, bearded men setting a U.S. flag on fire with fists punching the air is also on point. A photo of brown, bearded men sitting crossed-legged on the floor of a mosque harboring their habitual anger just before they explode into raised fists punching the air is perfectly fine too.

5. If your article is about Iran-U.S. relations and even if it is not, include a photo of a woman in a head-to-toe black chador walking past the famous anti-U.S. mural in Tehran. (Note: that go-to mural in downtown Tehran of the Statue of Liberty with a skull face set against the American flag has been painted over, but it can easily be found in online image archives.) Always include a picture of a woman in a black chador walking down the street so it’s clear that this is Iran where women are oppressed, voiceless, and invisible.

6. For a business story, choose a photo of long queues at the gas station and a brown man filling his tank to show Iran is a dysfunctional country with a dysfunctional economy. Or one of Tehran’s busy Haft-e-Tir square to show Iran has roundabouts and shops while still being dysfunctional and chaotic. Remember the random woman walking by in a black chador? Make sure there is one somewhere in the photo. (...)

8. If you travel to Iran, refer to yourself not as being “in” Iran but “inside” Iran. Be transparent about the risks you are taking to spend as many as five consecutive days in the Iranian capital. Start your dispatch with the queasy feeling that you — a white man — have upon landing in Tehran.

9. When inside Iran, write about meeting key sources to shed light on the realities in the Islamic Republic: the exclusive interview with your cab driver, the secret meeting with a female student in a café in northern Tehran, that overwhelming expedition to a mosque in southern Tehran. Wrap up your article with comments from an English-speaking political analyst with loose ties to the regime who can predict the next impulses of the Mullahs in one quote.

10. Never mention that there are theatres, cinemas, art galleries, museums, concert halls, bookshops, gyms, yoga studios, hair salons, or bakeries in Iran. It’s more informative to write about how you experience the Islamic Republic during your short stay rather than how Iranians live every day.

11. Always remind readers that Iran is a dangerous country, more dangerous than any other country in the Middle East. Underline at any chance you get that it poses an imminent threat to the future of the entire world and more particularly to the U.S. and Israel, both of which have nuclear weapons.

by Ladane Nasseri, McSweeney's |  Read more:
Image: Abedin Taherkenareh/EPA/EFE via

Kazuo Ishiguro: Klara and the Sun

For the Ishiguro household, 5 October 2017 was a big day. After weeks of discussion, the author’s wife, Lorna, had finally decided to change her hair colour. She was sitting in a Hampstead salon, not far from Golders Green in London, where they have lived for many years, all gowned up, and glanced at her phone. There was a news flash. “I’m sorry, I’m going to have to stop this,” she said to the waiting hairdresser. “My husband has just won the Nobel prize for literature. I might have to help him out.”

Back home, Kazuo Ishiguro was having a late breakfast when his agent called. “It’s the opposite to the Booker prize, where there’s a longlist and then a shortlist. You hear the rumbling thunder coming towards you, often not striking. With the Nobel it is freak lightning out of the blue – wham!” Within half an hour there was a queue of journalists outside the front door. He called his mother, Shizuko. “I said: ‘I’ve won the Nobel, Shon.’ Oddly, she didn’t seem very surprised,” he recalls. “She said: ‘I thought you’d win it sooner or later.’” She died, aged 92, two years ago. His latest novel Klara and the Sun, in part about maternal devotion and his first since winning the Nobel, is dedicated to her. “My mother had a huge amount to do with my becoming a writer,” he says now. (...)

In Nobel terms, at 62 Ishiguro was a relative whippersnapper. Precocity is part of the Ishiguro myth: at 27 he was the youngest on Granta’s inaugural best of young British novelists list in 1983 (with Martin Amis, Ian McEwan, Julian Barnes et al), appearing again the following decade. In between he won the Booker prize for The Remains of the Day, which was given the full Merchant Ivory treatment in 1993. Indeed, his claim that most great novels were produced by writers in their 20s and 30s has become part of literary legend. “It is Martin Amis who goes round repeating this, not me,” Ishiguro says, laughing. “He became obsessed with the idea.” But he still maintains that your 30s are the crucial years for novel writing: “You do need some of that cerebral power.” (Which is lucky for his daughter Naomi, who at 28 also has her first novel, Common Ground, out this month, much to her father’s delight.) Whenever anybody brought up the question of the Nobel, his standard line used to be: “Writers won their Nobel prizes in their 60s for work they did in their 30s. Now perhaps it applies to me personally,” the 66-year-old notes drily.

He remains the supreme creator of self-enclosed worlds (the country house; the boarding school), his characters often under some form of lockdown; his fastidious attention to everyday details and almost ostentatiously flat style offsetting fantastical plot lines and pent-up emotional intensity. And Klara and the Sun is no exception.

Set in an unspecified America, in an unspecified future, it is – ostensibly at least – about the relationship between an artificial “friend”, Klara, and her teenage owner/charge, Josie. Robots (AFs) have become as commonplace as vacuum cleaners, gene-editing is the norm and biotechnological advances are close to recreating unique human beings. “This isn’t some kind of weird fantasy,” he says. “We just haven’t woken up to what is already possible today.” “Amazon recommends” is just the beginning. “In the era of big data, we might start to be able to rebuild somebody’s character so that after they’ve died they can still carry on, figuring out what they’d order next online, which concert they’d like to go to and what they would have said at the breakfast table if you had read them the latest headlines,” he continues.

He deliberately didn’t read either the recent Ian McEwan novel Machines Like Me or Jeanette Winterson’s Frankissstein, which also take on artificial intelligence, but from very different angles. Klara is a sort of robotic parent, “Terminator-like in her determination to look after Josie”, but she is also a potential surrogate child: when Josie gets sick, Klara is being programmed to take her place. “What happens to things like love in an age when we are changing our views about the human individual and the individual’s uniqueness?” he asks. “There was this question – it always sounds very pompous – about the human soul: do we actually have one or not?”

The book revisits many of the ideas behind Never Let Me Go, his 2005 novel about three teenage clones whose organs will be harvested, leading to certain death before their 30s: “only a slight exaggeration of the human condition, we all have to get ill and die at some point”, he says now. Both novels hold out the possibility that death can be postponed or defeated by true love, which must be tested and proved in some way; a fairytale bargaining that is also made explicit in the boatman’s challenge to Axl and Beatrice in his previous novel The Buried Giant. This hope, even for those who don’t believe in an afterlife, “is one of the things that makes us human,” he reflects. “It perhaps makes us fools as well. Perhaps it is a lot of sentimental hogwash. But it is very powerful in people.”

He is unapologetic about repetition, citing the “continuity” of great film directors (he is a huge cinephile), and likes to claim that each of his first three books was essentially a rewrite of its predecessor. “Literary novelists are slightly defensive about being repetitive,” he says. “I think it is perfectly justified: you keep doing it until it comes closer and closer to what you want to say each time.” He gets away with it, he says, by changing location or genre: “People are so literal they think I’m moving on.” For him, genre is like travel, and it is true that he has enjoyed genre-hopping: When We Were Orphans (detective fiction); Remains of the Day (period drama); The Unconsoled (Kafkaesque fable); Never Let Me Go (dystopian sci-fi) and The Buried Giant (Tolkienish fantasy). Now, as the title Klara and the Sun hints, he visits what he calls “children’s storyland”. But be warned, we are still very much in Ishiguroland. (...)

Each novel takes him around five years: a long build-up of research and thinking, followed by a speedy first draft, a process he compares to a samurai sword fight: “You stare at each other silently for ages, usually with tall grass blowing away and moody sky. You are thinking all the time, and then in a split second it happens. The swords are drawn: Wham! Wham! Wham! And one of them falls,” he explains, wielding an imaginary sword at the screen. “You had to get your mind absolutely right and then when you drew that sword you just did it: Wham! It had to be the perfect cut.” As a child, he was mystified by swashbuckling Errol Flynn films when he first came to the UK, in which the sword fights consisted of actors going “ching, ching, ching, ching, for about 20 minutes while talking to each other,” he says. “Perhaps there’s a way of writing fiction like that, where you work it out in the act, but I tend towards the ‘Don’t do anything, it’s all internal’ approach.”

by Lisa Allardice, The Guardian | Read more:
Image: Howard Sooley
[ed. See also: this extract from Klara and the Sun.]

Friday, February 19, 2021


via:

History of Zork

Zork
a.k.a. Dungeon
by Tim Anderson, Marc Blank, Bruce Daniels, and Dave Lebling
First Appeared: late June 1977
First Commercial Release: December 1980
Language: MDL
Platform: PDP-10

Opening Text: You are in an open field west of a big white house, with a boarded front door. There is a small mailbox here.

[Note: contains spoilery discussion of the jeweled egg, cyclops, and robot puzzles.]
If Adventure had introduced hackers to an intriguing new genre of immersive text game, Zork was what brought it to the public at large. In the early 1980s, as the personal computer revolution reached into more and more homes, a Zork disk was a must-buy for first-time computer owners. By 1982 it had become the industry’s bestselling game. In 1983, it sold even more copies. Playboy covered it; so did Time, and American astronaut Sally Ride was reportedly obsessed with it. In 1984 it was still topping sales charts, beating out much newer games including its own sequels. At the end of 1985 it was still outselling any other game for the Apple II, half a decade after its first release on the platform, and had become the bestselling title of all time on many other systems besides.

Its creation can be traced to a heady Friday in May 1977 on the MIT campus in Cambridge, Massachusetts. It was the last day of finals week, and summer was kicking off with a bang for the school’s cohort of tech-obsessed engineers: a new movie called Star Wars opened that day in theaters, the groundbreaking Apple II had just been released, and Adventure was exploding across the terminals of computer labs nationwide, thousands of students having no further distractions, at last, to keep them from solving it.

Among those obsessive players were four friends at a campus research lab, the Dynamic Modeling Group. Within two weeks they’d solved Adventure, squeezing every last point from it through meticulous play and, eventually, the surgical deployment of a machine-language debugger. Once the game was definitively solved, they immediately hatched plans to make something better. Not just to prove the superiority of their school’s coding prowess over Don Woods at Stanford—though that was undoubtedly part of it—nor simply because many were dragging their feet on graduating or finding jobs, and a challenging new distraction seemed immensely appealing—though that was part of it too. But the most important factor was that Adventure had been so incredibly fun and, regrettably, there wasn’t any more of it. “It was like reading a Sherlock Holmes story,” one player recalled, “and you wanted to read another one of them immediately. Only there wasn’t one, because nobody had written it.”

The four friends were an eclectic group of grad students ranging in age from 22 to 28, united by shared sensibilities and a love of hacking. Dave Lebling had a political science degree and had started programming only because of an accidental hole in his freshman year schedule. A “voracious reader” and “frustrated writer,” he’d helped design Maze in 1973, one of the earliest graphical exploration games and first-person shooters. Marc Blank was young, tall, thin, and technically enrolled in med school, but found messing around with computers an addictive distraction. Bruce Daniels was nearing thirty and increasingly bored with his PhD topic; he’d helped develop the lab’s pet project, the MDL programming language, and was always eager to find new ways of showing it off. And Tim Anderson was close to finishing his master’s but none too excited about leaving the heady intellectual community at MIT. With Adventure solved, the four sat down to hack together a prototype for an improved version, which would run like the earlier game on a PDP-10 mainframe. Needing a placeholder name for the source file, they typed in zork, one of many nonsense words floating around campus that could, among other usages, be substituted for an offensive interjection.

The game they began to create was at first quite similar to Adventure, so much so that historian Jimmy Maher has noted parts of it are more remake than homage. Both games begin in a forest outside a house containing supplies for an underground expedition, including food, water, and a light source with limited power; in both you search for treasures in a vast underground cave system and score points by returning them to the building on the surface; both feature underground volcanoes, locked grates, trolls, and a “maze of twisty little passages, all alike.” Hacker tropes and nods to other early text games abound, like a huge bat who whisks you off to another location like in Hunt the Wumpus. But as Zork expanded, it began to develop its own character: less realistic than the caverns sketched from Will Crowther’s real-life experience, but also more whimsical, more threatening, and driven by an improved parser and world model.

>open trap door
The door reluctantly opens to reveal a rickety staircase descending into darkness.
>down
It is pitch black. You are likely to be eaten by a grue. Your sword is glowing with a faint blue glow.
>what is a grue?
The grue is a sinister, lurking presence in the dark places of the earth. Its favorite diet is adventurers, but its insatiable appetite is tempered by its fear of light. No grue has ever been seen by the light of day, and few have survived its fearsome jaws to tell the tale.

The infamous grues were invented as a solution to a sort of bug: not with the game’s code, but with the player’s suspension of disbelief. In early versions of Zork, as in Adventure, you’d fall into a bottomless pit if you tried to move through a dark room without a portable light source. But someone noticed this could happen in Zork even in the dark attic of the above-ground house. Lebling, stealing the word “grue” from a Jack Vance novel, invented a new and more broadly applicable threat for dark places.
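
As a small illustration of the rule Lebling added, here is how that dark-room logic might look, sketched in Python rather than the MDL the original team actually used; the strike probability below is an invented placeholder, not Zork’s real odds.

import random

def enter_room(room_is_dark, has_lit_lamp):
    # Return the message a player sees on entering a room.
    if not room_is_dark or has_lit_lamp:
        return "You can see your surroundings."
    # In the dark, the lurking grue may strike -- the threat that replaced
    # the less believable bottomless pit of early versions.
    if random.random() < 0.25:   # placeholder odds, not Zork's actual rate
        return "Oh, no! You have walked into the slavering fangs of a lurking grue!"
    return "It is pitch black. You are likely to be eaten by a grue."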

by Aaron A. Reed, Substack | Read more:
Image: via
[ed. A great classic. From the series 50 Years of Text Games.]


via:

Raccoon Trouble

Soon after he became a raccoon trapper, Musa Ramada began having nightmares, waking with the sensation that one of the animals was on his chest. Again and again this happened, upsetting him more and more; eventually he told his boss, an old tough guy named Steve. Steve knew these dreams well and offered advice for escaping them: when you trap a raccoon and its babies, release them together, he said. Ramada started doing so religiously, even when it cost him time and money. His sleep has been undisturbed by raccoons ever since.

No longer consigned to the urban edge, raccoons have infiltrated New York City, occupying homes and generating steady business for people who catch them. The past five years have seen a rise in raccoon trouble—subway lines shut down, brownstones vandalized—that has become even more noticeable during the pandemic, with New Yorkers holed up indoors. Raccoons have been spotted in the West Village, on the Upper East Side, and in groups of more than twenty (the collective noun is a “gaze”) among the trees of Prospect Park. In 2016, the New York Times ran a feature headlined “Raccoons Invade Brooklyn,” with tales of backyard chickens being mauled and baby raccoons, known as kits, tumbling from apartment roofs. When a tourist inquired on Reddit about where one might encounter raccoons in the city, someone responded: “Come to Queens! They’re everywhere! Just had to throw one out of my bathtub.” (...)

A Syrian immigrant in his 40s, Ramada moved to New York in the 1990s to study computer science but dropped out when he couldn’t afford tuition. Since then, he’s passed through a series of marginal jobs, from construction and fixing cars to being a cook in an Italian restaurant. About four years ago he became a trapper entirely by accident, after responding to an online advertisement that he believed was related to house painting. At the interview, Steve, the boss, asked Ramada if he knew how to climb a ladder. “I said sure,” Ramada told me. “He said, high ladder—like 40 feet.” When Ramada found out he’d be catching raccoons, he was taken aback; that Americans would pay you to remove wild animals from their houses was something he’d never imagined. Back home, he said, this would be the duty of young men—a son or a cousin looking to demonstrate his bravery. But home was here now, in this city of money and exclusion that creates its own forms of opportunity. “Sure,” he told Steve, rolling the r, and just like that he was hired.

Ramada has grey stubble, scruffy hair, and tobacco-stained teeth. He rolls stubby joints of cheap pot purchased from a contact in the Bronx and cruises around, lightly stoned, with his traps and the junk that fills his panel van: tools for keeping the engine going; scraps of paper with the scribbled addresses of his customers (which he otherwise forgets); husks of sunflower seeds (he says he’s addicted); and books by Naguib Mahfouz, the Egyptian Nobel laureate who published 34 novels and more than 350 short stories, nearly all of which Ramada says he has read.

I first met him in November 2019, soon after he stopped working for Steve and started his own trapping company. I had been calling raccoon trappers for days, but most of them declined to speak to me; even Steve had fobbed me off, claiming that his bosses were “at their holiday house upstate.” (It was only later, when I met Ramada, that I began piecing the story together: Steve is one of the biggest trappers in New York, and emphatically has no boss.) Another trapper spoke to me on background for 45 minutes and made me promise not to refer to a word of our conversation. Others put the phone down when they heard the word “journalist.” But Ramada, when I reached him, acted as if he’d been waiting for me to call. His accent was very different from those of the gruff New Yorkers I’d been dealing with. First, with conviction, he told me he could communicate with raccoons. Then he asked me if I could help him design a website.

He’d spent hours trying to find customers, printing business cards and creating listings on Google Maps, but his rivals were far ahead of him, often operating multiple businesses, each with their own phone numbers. (I had discovered this on my earlier calls when the deep background man listened to me for a minute and then said, “I already explained this to you, pal.”) So Ramada tried another strategy: undercutting the competition. Initially he charged $600 to remove up to five raccoons—“that’s like a family,” he said—no matter the number of return visits; this was about a third of the going rate. Slowly, business picked up until he was averaging a raccoon a week. He abandoned the idea of a website.

“Best job I ever had,” he told me, splitting open one sunflower seed and then another. Ramada traverses the five boroughs in his van, laying traps in attics and basements; sometimes a raccoon falls through a ceiling and he must chase after it with a sack and noose. He is always on call but seldom busy, and in between jobs he cooks, walks his dog, watches Syrian news on YouTube, or— when he has the money—plays golf; sometimes a customer calls mid-round and he has to leave to deal with a raccoon. He lives in Queens with his wife, who is Italian American, and her young son, who, according to Ramada, is fond of raccoons. This is not surprising, given Ramada’s own rich and unusual affection for them.
* * *
On the internet, there is an outpouring of love for raccoons, with Instagram accounts like @cutest.raccoons and @raccoonfeeds amassing hundreds of thousands of followers. Clips of pet raccoons (#trashpandas) rolling on giant hamster wheels, falling off furniture, and skirmishing with cats rack up hundreds of thousands of likes and shares. Influencer raccoon accounts come replete with merchandise and sponsored content; one of the most successful, an unusually pale raccoon named Uni, who lives in Taiwan, has been featured on BuzzFeed and People.com.

This online adoration is at stark odds with reality, where, for the most part, we treat raccoons as pests. And not without reason: They are destructive visitors, ripping through drywall, gnawing pipes, robbing food, making noises (“whissing,” as Ramada puts it), and leaving droppings that carry a parasite which causes nausea and sometimes blindness. To deter raccoons, you can buy ultrasonic noise machines, wall spikes, and heat-activated sprinklers. You can buy urine (“100 percent pure”) from coyotes held in cages. An animal welfare activist I spoke to abhors such tactics and recommended, instead, blasting “hard rock music” in the attic. When I asked if this ever became irritating, she said: “It doesn’t bother me as much as having a raccoon gassed would bother me.” (...)

It is Ramada’s job to clean up after raccoons and thwart their attempts at havoc—the leaks and ruined ceilings and foul, crusted dens. Unlike some of his rivals, though, he refuses to blame the animals. “They been here in America before us,” he said. “That’s why they’re invading our home, just like we invade theirs.”

He has come to believe, too, that he is able to communicate with them, just as one might communicate with a dog. “They live in the house, so they understand our language,” he said. “So many times the raccoon can see us and we cannot see it. It’s in the trees and houses, just listening.”

And so a raccoon might hear a man speaking tenderly to his wife or children, Ramada said.

“So when I say to them I love you, they understand that.”

by Kimon de Greef, Guernica | Read more:
Image: via