Thursday, July 24, 2014

Failing the Third Machine Age

[ed. See also: And So It Begins.]

A cheerily written op-ed in the New York Times proclaims: “It’s time for robot caregivers”.

Why? Because, the piece argues, we have many elderly people who need care—and children, especially those with disabilities—and not enough caregivers.

Call in the machines, she says:
“We do not have anywhere near enough human caregivers for the growing number of older Americans.”
This is how to fail the third machine age.

This is not just an inhuman policy perspective, it’s economically destructive and rests on accepting current economic policies and realities as if they were immutable.

Let me explain. When people confidently announce that once robots come for our jobs, we’ll find something else to do like we always did, they are drawing from a very short history. The truth is, there have only been one and three-quarters machine ages—we are close to concluding the second one, and we are moving into the third.

And there is probably no fourth one.

Humans have only so many “irreplaceable” skills, and the idea that we’ll just keep outrunning the machines, skill-wise, is a folly. (...)

But wait, you say, there’s a next set of skills, surely?

That has been the historical argument: sure, robots may replace us, but humans have always found a place to go.

As I recounted, there are really only one and maybe two-thirds examples of such shifts so far, so forgive me if I find such induction unconvincing. Manual labor (one), mental labor (still happening); and now that mental skills are being replaced, we are retreating, partially, into emotional labor—i.e. care-giving.

And now machines, we are told, are coming for care-giving.

We are told that this is because there aren't enough humans?

Let’s just start with the obvious: Nonsense.

Of course we have enough human caregivers for the elderly. The country—and the world—is awash in underemployment and unemployment, and many people find caregiving to be a fulfilling and desirable profession. The only problem is that we—as a society—don’t want to pay caregivers well and don’t value their labor. Mildly redistributive policies that would slightly decrease the existing concentration of wealth to provide subsidies for childcare or elder care are, unfortunately, deemed untouchable goals by political parties beholden to a narrow slice of society.

Remember: whenever you hear there’s a shortage of humans (or food), it is almost always a code for shortage of money. (Modern famines are also almost always a shortage of money, not food). Modern shortages of “labor” are almost always a shortage of willingness to pay well, or a desire to avoid hiring the “wrong” kind of people. (...)

Next, consider that emotional labor is all that’s left for human workers to escape to after manual and mental labor have already been mostly taken over by machines.

(Creative labor is sometimes cited as another alternative, but I am discounting this since it is already discounted—it is already very difficult to make a living through creative labor, and it’s getting harder, not easier. But that’s another post).

The US Bureau of Labor Statistics projects the following jobs as the ones with the largest growth in the next decade: personal care aides, registered nurses, retail salespersons, home health aides, fast-food workers, nursing assistants, secretaries, customer service representatives, janitors…

It’s those face-to-face professions, ones in which being in contact with another human being is important, that are growing in numbers—almost every other profession is shrinking, numerically.

(No, there won’t be a shortage of engineers and programmers either—engineers and programmers, better than anyone, should know that machine intelligence is coming for them fairly soon, and will move up the value chain pretty quickly. Also, much of this “shortage,” too, is about controlling workers and not paying them—note how Silicon Valley colluded not to pay its engineers too much, even as the companies in question hoarded billions in cash. In a true shortage under market conditions, companies would pay more for that which is scarce).

Many of these jobs the BLS says will grow, however, are only there by the grace of the generation that still wants to see a cashier while checking out—and besides, they are low-paid jobs. Automation plus natural language processing by machines is going to obliterate those jobs in the next decade or two. (Is anyone ready for the even worse labor crisis that will ensue?) Machines will take your order at the fast-food joint, they will check out your groceries without having to scan them, and it will become even harder to get a human on the customer service line.

What’s left as jobs are those transactions in which the presence of a human is something more than a smiling face that takes your order and enters it into another machine—the cashier and the travel agent that have now been replaced by us, in the “self-serve” economy.

What’s left is deep emotional labor: taking care of each other.

by Zeynep Tufekci, Medium |  Read more:

Wednesday, July 23, 2014

Arctic Man

Wild rides and crazed nights at America's most extreme ski race. 

It's April in Alaska so the traffic on the Glenn Highway can't be blamed on either winter snow or summer tourists. The line of yellowing motorhomes, bulbous camper trailers, jacked-up pickups and shopworn Subarus inching out of Wasilla onto the hairpins and steep climbs of the Glenn is, as the bumper stickers say, "Alaska Grown," the annual migration of the state's Sledneck population to Arctic Man. Once clear of the sprawl of Wasilla, the signs along the way read like pages flying back on a calendar, flipping past the state's prospector and homestead era — "Jackass Creek," "Frost Heave," "Eureka" — to the Native names, from long before there was English to write them down: "Matanuska," "Chickaloon," "Tazlina." Then there's the highway itself, named for Edwin Glenn, a Spanish-American war vet and Army officer who was the first American soldier ever court-martialed for waterboarding. But earlier in his career, in the late 1890s, Glenn led two expeditions into this wilderness.

Maybe that's the lesson: If you put your name in the ground up here, it stays. Your life outside the state is your own concern.

After the Glenn, you head up past Gulkana — Athabascan for "winding river" — and then a final rush out onto the frozen moonscape of Summit Lake, where the peaks of the Alaska Range fill the horizon, all the way to mighty Denali, which might be the best counterexample of Alaskan identity: William McKinley may have been president, but he never set foot in Alaska, so most Alaskans call the nation's largest mountain by its native name, Denali.

You turn off the highway, down a road piled with eight feet of snow on both sides. This is Camp Isabel, once the single biggest work camp along the Trans-Alaska Pipeline, now a forgotten gravel airstrip at the base of the Hoodoo Mountains. Perhaps 1,000 motorhomes, RVs and trailers are already here, strewn like fallen Jenga pieces inside the frozen walls. Snowmachines buzz past your doors, above your head on the snow banks and over the distant peaks like swarming gnats. The temperature is way below freezing, but the air still carries the smell of gasoline, grilled meat and alcohol. A four-wheeler rumbles past pulling a big sled and on the big sled is a couch, a so-called Alaskan Rickshaw. Four people are riding, holding drinks. One of them is wearing a full wolf pelt, snout, eyes, ears and all. He nods and tips his cup "Hello."

Arctic Man is a weeklong, booze- and fossil-fueled Sledneck Revival bookended around the world's craziest ski race. Both the festival and the race at its heart have been firing off every year in these mountains for more than half as long as Alaska has been a state. Over the course of a week, something like 10,000 partiers and their snowmachines disgorge onto Camp Isabel's 300-acre pad to drink, grill, fight, drink and, at least while the sun is out, blast their sleds through the ear-deep powder in the surrounding hills one last time before it all melts away. Then on Friday morning, anyone not hopelessly hungover or already drunk by noon swarms up the valley south of camp to watch the damnedest ski race on earth.

by Matt White, SBNation |  Read more:
Image: Brian Montalbo

Henri Matisse, Bathers by a River
via:

Tuesday, July 22, 2014

Shining Light on Cutoff Culture


Most of us don’t blink when a friend says they’ve cut off an ex. But if you’ve ever been cut off by someone you care deeply for, then you know how distinctly painful an experience it can be. While it may be socially acceptable to cut off communication with our exes, we’re not always cognizant of the impacts on ourselves and our former partners. When we cut someone off, we may do so out of anger, but often we are avoiding feelings of discomfort. Furthermore, if the person being cut off has trauma in their background, the psychological impacts can be devastating.

I’m not talking about distancing ourselves from those we casually date or asking for space after a breakup or simply choosing not to be friends with our exes. I’m talking about breaking off all contact with the most intimate person in our lives without civility — refusing to answer the phone, reply to emails, or acknowledge any aspect of their communication or needs — often without explanation.

Few of my friends know I’ve been nursing a broken heart, for nearly two and a half years. It’s not a typical broken heart but one that combines the end of a romance with the bewilderment and sadness of being cut off by a dear and trusted partner without explanation. It’s also one that echoes painful experiences from my childhood. (...)

Cutting off contact with exes seems to be a common practice. A friend of mine related being told by another friend to break up with her boyfriend via “JSC”: just stop communicating. “Love is a battlefield,” goes the saying.

When personal safety is involved, cutoff is warranted. But most times this isn’t the case. When it’s not, this kind of behavior dehumanizes the other and sends the message “your needs don’t matter, you don’t matter.” University of Chicago neuroscientist John Cacioppo told Psychology Today, “‘The pain of losing a meaningful relationship can be especially searing in the absence of direct social contact.’ With no definitive closure, we’re left wondering what the heck happened, which can lead to the kind of endless rumination that often leads to depression.”

Emma once told me, “You’re the first one to want me for me,” but her abrupt about-face might make you think I ran off with her best friend or boiled her rabbit … I did neither. In fact, to this day, I have only guesses to make sense of her hostility to me.

Because Emma’s withdrawal and eventual cutoff surprised me so much, I had a lot of intense emotions and questions about what she’d experienced and the choices she’d made. Rather than face my need for explanation and desire for resolution, she chose to withdraw.

Our society supports you when a loved one dies, but when someone dumps you and cuts off communication, you’re supposed to just get over it. Friends are often uncomfortable talking with you about these kinds of feelings. They want you to let go, move on, and definitely stop talking about it.

In The Journey from Abandonment to Healing, Susan Anderson writes, “When a loved one dies, the loss is absolutely final…[but] abandonment survivors may remain in denial and postpone closure, sometimes indefinitely.” We’re not comfortable witnessing the process of grief and acceptance when it stems from the loss of romantic attachment, especially when it’s extended.

When there are emotional loose ends — unanswered questions, mistrust, betrayal, disbelief, bewilderment (as it was for me with Emma) — it can be very difficult to heal. Our culture is very hostile to people in this situation. We often judge those who don’t move on right away. Being the one struggling without answers is one of the most difficult human experiences.

by Jeff Reifman, Medium |  Read more:
Image: Jeff Reifman

The Majority Of Today’s App Businesses Are Not Sustainable

Though the app stores continue to fill up with ever more mobile applications, the reality is that most of these are not sustainable businesses. According to a new report out this morning, half (50%) of iOS developers and even more (64%) of Android developers are operating below the “app poverty line” of $500 per app per month.

This detail was one of many released in VisionMobile’s latest Developer Economics report (for Q3 2014), which was based on a large-scale online developer survey and one-to-one interviews with mobile app developers. This report included the responses from over 10,000 developers from 137 countries worldwide, taking place over 5 weeks in April and May.

That mobile app developers are challenged in getting their apps discovered, downloaded and then actually used is a well-known fact. But seeing the figures associated with exactly how tough it is out there is rather revealing. It seems the “1%” is not only applicable to the economy as a whole – the same dynamic is taking place within the app store economy, too.

The report’s authors detail the specifics around the trend where a tiny fraction of developers – actually, it’s 1.6% to be exact – generate most of the app store revenue. Slyly referencing the “disappearing middle class of app developers,” the report’s analysis groups the estimated 2.9 million mobile app developers worldwide into a handful of different categories for easy reference: the “have-nothings,” the “poverty-stricken,” the “strugglers,” and the “haves.” And, as you can tell, most of these categories don’t sound too great.

by Sarah Perez, TechCrunch |  Read more:
Image: uncredited

David Joly, Woods in Winter
via:

Lessons From Late Night

[ed. In advance of installing a paywall for all future New Yorker material, the magazine has opened its archive back to 2007 (for who knows how long?). You can find a good sampling here.]

In 1997, I realized one of my childhood dreams. (Not the one where I’m being chased by Count Chocula.) I flew to New York from Chicago, where I was working as a performer at Second City, to interview for a writing position at “Saturday Night Live.” It seemed promising, because I’d heard that the show was looking to diversify. Only in comedy, by the way, does an obedient white girl from the suburbs count as diversity. I arrived for my job interview in the only decent clothes I had: my “show clothes”—black pants and a lavender chenille sweater from Contempo Casuals. I went up to the security guard at the elevator and I heard myself say, “I’m here to see Lorne Michaels.” I couldn’t believe the words that were coming out of my mouth. This must be how people feel when they really do go to school naked by accident.

I went up to the seventeenth-floor offices, whose walls were lined with archival photographs from the show—Jane Curtin ripping her shirt open on “Weekend Update,” Gilda Radner in a “Beach Blanket Bingo” sketch, Al Franken’s head shot! Then I sat on a couch and waited for my meeting with Lorne. About an hour into the wait, some assistants started making popcorn in a movie-theatre popcorn machine—something that I would later learn signalled Lorne’s imminent arrival. To this day, the smell of fresh popcorn causes me to experience stress, hunger, and sketch ideas for John Goodman.

The only advice anyone had given me about meeting with Lorne was “Whatever you do, don’t finish his sentences.” A Chicago actress I knew had apparently made that mistake, and she believed it had cost her the job. So, when I was finally ushered into his office, I sat down, determined not to blow it.

Lorne said, “So, you’re from . . .”

The words seemed to hang there forever. Why wasn’t he finishing the sentence? If I answered now, would it count as talking over him? I couldn’t remember how normal human speech patterns worked. Another five seconds went by, and still no more sentence from Lorne. Oh, God! When I flew back to Chicago the next day, they were going to say, “How was your meeting with Lorne Michaels?” And I would have to reply, “He said, ‘So, you’re from,’ and then we sat there for an hour and then a girl came in and asked me to leave.”

After what was probably, realistically, ten seconds, I couldn’t take it anymore, and I blurted out, “Pennsylvania. I’m from Pennsylvania, a suburb of Philadelphia,” just as Lorne finally finished his thought—“Chicago.” I was sure I had blown it. I don’t remember anything else that happened in the meeting, because I just kept staring at the nameplate on his desk that said “Lorne Michaels” and thinking, This is the guy with the Beatles check! I couldn’t believe I was in his office. I could never have guessed that in a few years I’d be sitting in that office at two, three, four in the morning, thinking, If this meeting doesn’t end soon, I’m going to kill this Canadian bastard. Somehow, I got the job.

During my nine years at “Saturday Night Live,” my relationship with Lorne transitioned from Terrified Pupil and Reluctant Teacher, to Small-Town Girl and Streetwise Madam Showing Her the Ropes, to Annie and Daddy Warbucks (touring company), to a bond of mutual respect and friendship. Then it transitioned to Sullen Teen-Age Girl and Generous Stepfather, then to Mr. and Mrs. Michael Jackson, then, for a brief period, to Boy Who Doesn’t Believe in Christmas and Reclusive Neighbor Who Proves That Miracles Are Possible, then back to a bond of mutual respect and friendship.

I’ve learned many things from Lorne—in particular, a managerial style that was the opposite of my usual Bossypants mode. Here are some Things I Learned from Lorne Michaels:

(1) Producing is about discouraging creativity.

by Tina Fey, New Yorker |  Read more:
Image: Mary Ellen Mathews

The Fun Stuff

My life as Keith Moon.

[ed. See also: The Ginger Boy.]

I had a traditional musical education, in a provincial English cathedral town. I was sent off to an ancient piano teacher with the requisite halitosis, who lashed with a ruler at my knuckles as if they were wasps; I added the trumpet a few years later, and had lessons with a younger, cheerier man, who told me that the best way to make the instrument “sound” was to imagine spitting paper pellets down the mouthpiece at the school bully. I sang daily in the cathedral choir, an excellent grounding in sight-reading and performance.

But what I really wanted to do, as a little boy, was play the drums, and, of those different ways of making music, only playing the drums still makes me feel like a little boy. A friend’s older brother had a drum kit, and as a twelve-year-old I gawped at the spangled shells of wood and skin, and plotted how I might get to hit them, and make a lot of noise. It wouldn’t be easy. My parents had no time for “all that thumping about,” and the prim world of ecclesiastical and classical music, which meant so much to me, detested rock. But I waited until the drums’ owner was off at school, and sneaked into the attic where they gleamed, fabulously inert, and over the next few years I taught myself how to play them. Sitting behind the drums was like the fantasy of driving (the other great prepubescent ambition), with my feet established on two pedals, bass drum and high hat, and the willing dials staring back at me like a blank dashboard.

Noise, speed, rebellion: everyone secretly wants to play the drums, because hitting things, like yelling, returns us to the innocent violence of childhood. Music makes us want to dance, to register rhythm on and with our bodies. The drummer and the conductor are the luckiest of all musicians, because they are closest to dancing. And in drumming how childishly close the connection is between the dancer and the dance! When you blow down an oboe, or pull a bow across a string, an infinitesimal hesitation—the hesitation of vibration—separates the act and the sound; for trumpeters, the simple voicing of a quiet middle C is more fraught than very complex passages, because that brass tube can be sluggish in its obedience. But when a drummer needs to make a drum sound he just . . . hits it. The stick or the hand comes down, and the skin bellows. The narrator in Thomas Bernhard’s novel “The Loser,” a pianist crazed with dreams of genius and obsessed with Glenn Gould, expresses the impossible longing to become the piano, to be at one with it. When you play the drums, you are the drums. “Tom-tom, c’est moi,” as Wallace Stevens put it.

The drummer who was the drums, when I was a boy, was Keith Moon, though he was dead by the time I first heard him. He was the drums not because he was the most technically accomplished of drummers but because his joyous, semaphoring lunacy suggested a man possessed by the antic spirit of drumming. He was pure, irresponsible, restless childishness. At the end of early Who concerts, as Pete Townshend smashed his guitar, Moon would kick his drums and stand on them and hurl them around the stage, and this seems a logical extension not only of the basic premise of drumming, which is to hit things, but of Moon’s drumming, which was to hit things exuberantly. “For Christ’s sake, play quieter,” the manager of a club once told Moon. To which Moon replied, “I can’t play quiet, I’m a rock drummer.”

The Who had extraordinary rhythmic vitality, and it died when Keith Moon died, thirty-two years ago. I had hardly ever heard any rock music when I first listened to albums like “Quadrophenia” and “Who’s Next.” My notion of musical volume and power was inevitably circumscribed by my fairly sheltered, austerely Christian upbringing—I got off on classical or churchy things like the brassy last bars of William Walton’s First Symphony, or the densely chromatic last movement of the “Hammerklavier” Sonata, or the way the choir bursts in at the start of Handel’s anthem “Zadok the Priest,” or the thundering thirty-two-foot bass pipes of Durham Cathedral’s organ, and the way the echo, at the end of a piece, took seven seconds to dissolve in that huge building. Those things are not to be despised, but nothing had prepared me for the ferocious energy of The Who. The music enacted the mod rebellion of its lyrics: “Hope I die before I get old”; “Meet the new boss, same as the old boss”; “Dressed right, for a beach fight”; “There’s a millionaire above you, / And you’re under his suspicion.” Pete Townshend’s hard, tense suspended chords seemed to scour the air around them; Roger Daltrey’s singing was a young man’s fighting swagger, an incitement to some kind of crime; John Entwistle’s incessantly mobile bass playing was like someone running away from the scene of the crime; and Keith Moon’s drumming, in its inspired vandalism, was the crime itself.

Most rock drummers, even very good and inventive ones, are timekeepers. There is a space for a fill or a roll at the end of a musical phrase, but the beat has primacy over the curlicues. In a regular 4/4 bar, the bass drum sounds the first beat, the snare the second, the bass drum again hits the third (often with two eighth notes at this point), and then the snare hits the bar’s final beat. This results in the familiar “boom-DA, boom-boom-DA” sound of most rock drumming. A standard-issue drummer, playing along, say, to the Beatles’ ”Carry That Weight,” would keep his 4/4 beat steady through the line “Boy, you’re gonna carry that weight, carry that weight, a long time,” until the natural break, which comes at the end of the phrase, where, just after the word “time,” a wordless, two-beat half-bar readies itself for the repeated chorus. In that half-bar, there might be space for a quick roll, or a roll and a triplet, or something fancy with snare and high hat—really, any variety of filler. The filler is the fun stuff, and it could be said, without much exaggeration, that nearly all the fun stuff in drumming takes place in those two empty beats between the end of one phrase and the start of another. Ringo Starr, who interpreted his role modestly, does nothing much in that two-beat space: mostly, he provides eight even, straightforward sixteenth notes (da-da-da-da / da-da-da-da). In a good cover version of the song, Phil Collins, a sophisticated drummer who was never a modest performer with Genesis, does a tight roll that begins with featherlight delicacy on a tomtom and ends more firmly on his snare, before going back to the beat. But the modest and the sophisticated drummer, whatever their stylistic differences, share an understanding that there is a proper space for keeping the beat, and a much smaller space for departing from it, like a time-out area in a classroom. The difference is just that the sophisticated drummer is much more often in time-out, and is always busily showing off to the rest of the class while he is there.

Keith Moon ripped all this up. There is no time-out in his drumming, because there is no time-in. It is all fun stuff. The first principle of Moon’s drumming was that drummers do not exist to keep the beat. He did keep the beat, and very well, but he did it by every method except the traditional one. Drumming is repetition, as is rock music generally, and Moon clearly found repetition dull. So he played the drums like no one else—and not even like himself. No two bars of Moon’s playing ever sound the same; he is in revolt against consistency. Everyone else in the band gets to improvise, so why should the drummer be nothing more than a condemned metronome? He saw himself as a soloist playing with an ensemble of other soloists. It follows from this that the drummer will be playing a line of music, just as, say, the guitarist does, with undulations and crescendos and leaps. It further follows that the snare drum and the bass drum, traditionally the ball-and-chain of rhythmic imprisonment, are no more interesting than any of the other drums in the kit; and that you will need lots of those other drums. By the mid-nineteen-seventies, when Moon’s kit was “the biggest in the world,” he had two bass drums and at least twelve tomtoms, arrayed in stacks like squadrons of spotlights; he looked like a cheerful boy who had built elaborate fortifications for the sole purpose of destroying them. But he needed all those drums, as a flute needs all its stops or a harp its strings, so that his tremendous bubbling cascades, his liquid journeys, could be voiced: he needed not to run out of drums as he ran around them.

by James Wood, New Yorker |  Read more:
Image: Ross Halfin

Daria Petrilli
via:

Katsuro Yoshida, Work “9”, 1970
via:

Monday, July 21, 2014

The Real 10 Algorithms That Dominate Our World


The other day, while browsing Reddit, I found an interesting post called The 10 Algorithms That Dominate Our World, by George Dvorsky, which tries to explain the importance that algorithms have in our world today and which ones are the most important for our civilization.

Now, if you have studied algorithms, the first thing that might come to mind while reading the article is “Does the author know what an algorithm is?” or maybe “Is the Facebook news feed an algorithm?”, because if the Facebook news feed counts as an algorithm, then you could eventually classify almost everything as one. So in this post I’m going to try to explain what an algorithm is and which are the real 10 (or maybe more) algorithms that rule our world.

What is an algorithm?

Informally, an algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output. An algorithm is thus a sequence of computational steps that transform the input into the output. Source: Thomas H. Cormen, Charles E. Leiserson (2009), Introduction to Algorithms, 3rd edition.

In simple terms, it is possible to say that an algorithm is a sequence of steps that allows you to solve a certain task (yes, not just computers use algorithms; humans use them too). Now, an algorithm should have three important characteristics to be considered valid:
  1. It should be finite: if your algorithm never ends trying to solve the problem it was designed to solve, then it is useless.
  2. It should have well-defined instructions: each step of the algorithm has to be precisely defined; the instructions should be unambiguously specified for each case.
  3. It should be effective: the algorithm should solve the problem it was designed to solve, and it should be possible to demonstrate that it converges using just paper and pencil.
Also, it is important to point out that algorithms are not just used in computer science; they are mathematical entities. In fact, the first recorded mathematical algorithms date from around 1600 BC, when the Babylonians developed the earliest known procedures for factorization and finding square roots. So here we have the first problem with the post mentioned above: it treats algorithms as purely computational entities, but if you take the formal meaning of the word, the real top 10 algorithms that rule the world can be found in a book of arithmetic (addition, subtraction, multiplication, etc.).
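To make that concrete, here is a minimal sketch of one of those ancient procedures, the Babylonian (Heron's) method for approximating a square root, written in modern TypeScript. It happens to illustrate the three criteria above: it is finite (it stops once successive guesses are close enough), each step is precisely defined, and it is effective (it converges on the answer). The function name and the tolerance value are my own illustrative choices, not something from the original post.

  // Babylonian (Heron's) method for approximating sqrt(n).
  // Finite: stops when successive guesses differ by less than `tolerance`.
  // Well-defined: each step replaces the guess with the average of guess and n / guess.
  // Effective: converges to sqrt(n) for any n > 0.
  function babylonianSqrt(n: number, tolerance: number = 1e-10): number {
    if (n < 0) throw new Error("n must be non-negative");
    if (n === 0) return 0;
    let guess = n / 2;                      // any positive starting guess works
    let next = (guess + n / guess) / 2;
    while (Math.abs(next - guess) > tolerance) {
      guess = next;
      next = (guess + n / guess) / 2;
    }
    return next;
  }

  console.log(babylonianSqrt(2));           // ≈ 1.4142135623730951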

But let’s take computing algorithms as our definition of algorithm in this post, so the question remains: which are the 10 algorithms that rule the world? Here I’ve put together a little list, in no particular order.

by Marcos Otero, Medium |  Read more:
Image: uncredited

Paul Manship, Diana and a Hound, 1925
via:

James Garner (April,1928 – July, 2014)


American actor James Garner has died at his home in Los Angeles. He was 86.

Garner was perhaps best known for his rakish charm and eye-twinkling good looks. He was the sort of guy you wanted on your side in a jam, because he was the sort of guy who would know how to get out of that jam, whether it meant resorting to his fists or his wits.

Much has been made of how Garner ably hopped from television to film and back again (even if potential big-screen bosses worried he was too associated with his small-screen roles), but just as much could be discussed about how Garner so ably shoehorned his basic persona into just about every genre imaginable. If you want Garner in action mode, there's plenty of room for that in his filmography. Want to see him outsmarting criminals in crime stories? He can handle that, too. And if you just want to see him playing romance or even comedy, he's more than able to. Garner was a star, to be sure, but he was the rare kind of star who could make his essential James Garnerness work in just about any situation. He was versatile, but always somehow himself, a rare blend that many actors strive for but few achieve. (...)

Garner was already slowing down by the time Rockford reached its end in 1980, and he spent most of the last decades of his career starring in smaller films (with the occasional Space Cowboys interspersed for good measure). This means he did yet more romances, and while 2004's The Notebook is probably the best known of these films, check out the 1985 film Murphy's Romance instead. Garner is more central to that film's story (which is about an unlikely relationship that develops between his character and a younger woman played by Sally Field), and he's so charming in it that he managed to score his only Oscar nomination for the role.

by Todd VanDerWerff, Vox | Read more:
Image: YouTube

The 'Fingerprinting' Tracking Tool That's Virtually Impossible to Block


A new, extremely persistent type of online tracking is shadowing visitors to thousands of top websites, from WhiteHouse.gov to YouPorn.com.

The type of tracking, called canvas fingerprinting, works by instructing the visitor’s web browser to draw a hidden image, and was first documented in an upcoming paper by researchers at Princeton University and KU Leuven University in Belgium. Because each computer draws the image slightly differently, the images can be used to assign each user’s device a number that uniquely identifies it.
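For the technically curious, here is a minimal sketch of the idea in TypeScript for the browser, assuming only the standard Canvas API; the drawn text and the toy hash are illustrative stand-ins, not the actual AddThis code:

  // Draw some hidden text to an off-screen canvas, read the rendered pixels back,
  // and reduce them to a short identifier. Small rendering differences between
  // machines (fonts, anti-aliasing, graphics stack) make the result vary by device.
  function canvasFingerprint(): string {
    const canvas = document.createElement("canvas");
    canvas.width = 220;
    canvas.height = 30;
    const ctx = canvas.getContext("2d");
    if (!ctx) return "no-canvas";
    ctx.textBaseline = "top";
    ctx.font = "14px Arial";
    ctx.fillStyle = "#f60";
    ctx.fillRect(0, 0, 220, 30);
    ctx.fillStyle = "#069";
    ctx.fillText("Hello, fingerprint!", 2, 2);   // illustrative text, not AddThis's
    const data = canvas.toDataURL();             // base64 PNG of the rendered pixels
    // Toy 32-bit hash of the image data; a real tracker would use something stronger.
    let hash = 0;
    for (let i = 0; i < data.length; i++) {
      hash = ((hash << 5) - hash + data.charCodeAt(i)) | 0;
    }
    return (hash >>> 0).toString(16);
  }

On a given machine the function returns the same value on every visit, while across machines it usually differs, which is what makes it useful for tracking and why clearing cookies or adjusting standard privacy settings does nothing to stop it.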

Like other tracking tools, canvas fingerprints are used to build profiles of users based on the websites they visit — profiles that shape which ads, news articles or other types of content are displayed to them.

But fingerprints are unusually hard to block: They can’t be prevented by using standard web browser privacy settings or using anti-tracking tools such as AdBlock Plus.

The researchers found canvas fingerprinting computer code, primarily written by a company called AddThis, on 5% of the top 100,000 websites. Most of the code was on websites that use AddThis’ social media sharing tools. Other fingerprinters include the German digital marketer Ligatus and the Canadian dating site Plentyoffish. (A list of all the websites on which researchers found the code is here).

Rich Harris, chief executive of AddThis, said that the company began testing canvas fingerprinting earlier this year as a possible way to replace “cookies,” the traditional way that users are tracked, via text files installed on their computers.

“We’re looking for a cookie alternative,” Harris said in an interview.

by Julia Angwin, Mashable |  Read more:
Image: Mashable composite, Getty Creative, Eyematrix, Derrrek
[ed. Google is putting this post behind some warning screen "because it contains sensitive content as outlined in Blogger’s Community Guidelines." I've gotten several of these warnings lately (which were eventually rescinded) and can only assume it's because they have some insane new algorithm that looks for anything sexual. In this case, a mention of the YouPorn.com website (or maybe the real dysfunctional website WhiteHouse.gov). Who knows? There's no specifics. I wish they would get their act together before they start screwing around with personal websites (repeatedly).]

Team Discovers Achilles' Heel in Antibiotic-Resistant Bacteria

Scientists at the University of East Anglia have made a breakthrough in the race to solve antibiotic resistance.

New research published today in the journal Nature reveals an Achilles' heel in the defensive barrier which surrounds drug-resistant bacterial cells.

The findings pave the way for a new wave of drugs that kill superbugs by bringing down their defensive walls rather than attacking the bacteria itself. It means that in future, bacteria may not develop drug-resistance at all.

The discovery doesn't come a moment too soon. The World Health Organization has warned that antibiotic-resistance in bacteria is spreading globally, causing severe consequences. And even common infections which have been treatable for decades can once again kill.

Researchers investigated a class of bacteria called 'Gram-negative bacteria' which is particularly resistant to antibiotics because of its cells' impermeable lipid-based outer membrane.

This outer membrane acts as a defensive barrier against attacks from the human immune system and antibiotic drugs. It allows the pathogenic bacteria to survive, but removing this barrier causes the bacteria to become more vulnerable and die.

Until now little has been known about exactly how the defensive barrier is built. The new findings reveal how bacterial cells transport the barrier building blocks (called lipopolysaccharides) to the outer surface.

Group leader Prof Changjiang Dong, from UEA's Norwich Medical School, said: "We have identified the path and gate used by the bacteria to transport the barrier building blocks to the outer surface. Importantly, we have demonstrated that the bacteria would die if the gate is locked."

by Phys.org |  Read more:
Image: Diamond Light Source

Dating Startups Don’t Stand a Chance Against This Corporate Matchmaker


Aaron Schildkrout and I were sitting in the bar of the Gansevoort Hotel in New York City when I asked him whether he planned on selling his dating site, HowAboutWe, to the giant of the online match-making business, InterActiveCorp, aka IAC. A deal would make good sense, after all. IAC runs a near monopoly in this market, fueled by popular properties like Match.com, OkCupid, and Tinder. Competing with all of that seemed like a nearly insurmountable task for HowAboutWe, an 85-employee startup Schildkrout founded with his friend Brian Schechter.

Still, Schildkrout rejected the idea. “Our road will be really obvious,” he answered, “and not the one you just described.” He was a little vague at first, but he went on to lay out his vision for how the company would expand far beyond the dating space. It had already launched a media vertical called HowAboutWe Media, a local deal site for people in relationships called HowAboutWe for Couples, and, most recently, a couples messaging app called You&Me. Just as Nike built the all-purpose brand for fitness, Schildkrout said, he wanted HowAboutWe to be the all-purpose brand for love.

It was a deliberate strategy, he explained, that would prevent HowAboutWe from competing exclusively in the dating space, a market that IAC effectively owns. “Dating is going to be capped at something like a $200 million revenue business even if it goes really well,” he said. “We want to build something much bigger.”

But this week, just two months after our conversation, HowAboutWe announced that it is indeed selling off its dating and media properties to IAC’s Match Group for an undisclosed amount, leaving the fate of its fledgling couples platforms and the rest of the founders’ long-term plans in limbo. The deal is further proof of what Schildkrout and Schechter seem to have known all along: for dating startups, resistance against IAC is futile. That’s a big problem for innovation in this space. And, yes, online dating needs innovation—just like any other internet market.

With the notable exception of Tinder, the wildly successful mobile dating app launched out of IAC’s Hatch Labs, the brands under IAC’s Match Group just aren’t in tune with their younger audience. The companies it acquires tend to stay the same. Schildkrout himself described them as “weird” and “not aligned with the millennial spirit.” “It’s like early 2000s, even late 90s design, and getting trapped in an endless upgrade,” he said this spring, back when IAC was still a competitor. “I was single, and I was like: ‘I would never use this.’”

by Issie Lapowsky, Wired |  Read more:
Image: Getty 

Sunday, July 20, 2014

The Lights Are On but Nobody’s Home

Who needs the Internet of Things? Not you, but corporations who want to imprison you in their technological ecosystem.

Prepare yourself. The Internet of Things is coming, whether we like it or not apparently. Though if the news coverage — the press releases repurposed as service journalism, the breathless tech-blog posts — is to be believed, it’s what we’ve always wanted, even if we didn’t know it. Smart devices, sensors, cameras, and Internet connectivity will be everywhere, seamlessly and invisibly integrated into our lives, and it will make society more harmonious through the gain of a million small efficiencies. In this vision, the smart city isn’t plagued by deteriorating infrastructure and underfunded social services but is instead augmented with a dizzying collection of systems that ensure that nothing goes wrong. Resources will be apportioned automatically, mechanics and repair people summoned by the system’s own command. We will return to what Lewis Mumford described as a central feature of the Industrial Revolution: “the transfer of order from God to the Machine.” Now, however, the machines will be thinking for themselves, setting society’s order based on the false objectivity of computation.

According to one industry survey, 73 percent of Americans have not heard of the Internet of Things. Another consultancy forecasts $7.1 trillion in annual sales by the end of the decade. Both might be true, yet the reality is that this surveillance-rich environment will continue to be built up around us. Enterprise and government contracts have floated the industry to this point: To encourage us to buy in, sensor-laden devices will be subsidized, just as smartphones have been for years, since companies can make up the cost difference in data collection. (...)

In advertising from AT&T and others, the new image of the responsible homeowner is an informationally aware one. His house is always accessible and transparent to him (and to the corporations, backed by law enforcement, providing these services). The smart home, in turn, has its own particular hierarchy, in which the manager of the home’s smart surveillance system exercises dominance over children, spouses, domestic workers, and others who don’t have control of these tools and don’t know when they are being watched. This is being pushed despite the fact that violent crime has been declining in the United States for years, and those who do suffer most from crime — the poor — aren’t offered many options in the Internet of Things marketplace, except to submit to networked CCTV and police data-mining to determine their risk level.

But for gun-averse liberals, ensconced in low-crime neighborhoods, smart-home and digitized home-security platforms allow them to act out their own kind of security theater. Each home becomes a techno-castle, secured by the surveillance net.

The surveillance-laden house may rob children of essential opportunities for privacy and personal development. One AT&T video, for instance, shows a middle-aged father woken up in bed by an alert from his security system. He grabs his tablet computer and, sotto voce, tells his wife that someone’s outside. But it’s not an intruder, he says wryly. The camera cuts to show a teenage girl, on the tail end of a date, talking to a boy outside the home. Will they or won’t they kiss? Suddenly, a garish bloom of light: the father has activated the home’s outdoor lights. The teens realize they are being monitored. Back in the master bedroom, the parents cackle. To be unmonitored is to be free — free to be oneself and to make mistakes. A home ringed with motion-activated lights, sensors, and cameras, all overseen by imperious parents, would allow for little of that.

In the conventional libertarian style, the Internet of Things offloads responsibilities to individuals, claiming to empower them with data, while neglecting to address collective, social issues. And meanwhile, corporations benefit from the increased knowledge of consumers’ habits, proclivities, and needs, even learning information that device owners don’t know themselves. (...)

As the Internet of Things expands, we may witness an uncomfortable feature creep. When the iPhone was introduced, few thought its gyroscopes would be used to track a user’s steps, sleep patterns, or heartbeat. Software upgrades or novel apps can be used to exploit hardware’s hidden capacities, not unlike the way hackers have used vending machines and HVAC systems to gain access to corporate computer networks. To that end, many smart thermostats use “geofencing” or motion sensors to detect when people are at home, which allows the device to adjust the temperature accordingly. A company, particularly a conglomerate like Google with its fingers in many networked pies, could use that information to serve up ads on other screens or nudge users towards desired behaviors. As Jathan Sadowski has pointed out here, the relatively trivial benefit of a fridge alerting you when you’ve run out of a product could be used to encourage you to buy specially advertised items. Will you buy the ice cream for which your freezer is offering a coupon? Or will you consult your health-insurance app and decide that it’s not worth the temporary spike in your premiums?

This combination of interconnectivity and feature creep makes Apple’s decision to introduce platforms for home automation and health-monitoring seem rather cunning. Cupertino is delegating much of the work to third-party device makers and programmers — just as it did with its music and app stores — while retaining control of the infrastructure and the data passing through it. (Transit fees will be assessed accordingly.) The writer and editor Matt Buchanan, lately of The Awl, has pointed out that, in shopping for devices, we are increasingly choosing among competing digital ecosystems in which we want to live. Apple seems to have apprehended this trend, but so have two other large industry groups — the Open Interconnect Consortium and the AllSeen alliance — with each offering its own open standard for connecting many disparate devices. Market competition, then, may be one of the main barriers to fulfilling the prophetic promise of the Internet of Things: to make this ecosystem seamless, intelligent, self-directed, and mostly invisible to those within it. For this vision to come true, you would have to give one company full dominion over the infrastructure of your life.

by Jacob Silverman, TNI |  Read more:
Image: uncredited

Banksy
via: