Wednesday, October 23, 2013

To Paul, With Lox and Squalor

I’m leaving New York this week. I don’t know how long I’ll be gone, but I do know that there is no Russ & Daughters where I’m going, and there is no Paul.

Since moving to New York City twelve years ago, I have lived in nine apartments. The first was university housing on the Upper West Side. The next six formed a Brooklyn daisy chain, fragrant with interpersonal drama. The eighth, which brought me back across the East River, was chosen more or less because of a sandwich.

I discovered this sandwich during my ten-month tenure at the seventh apartment, a place I landed by necessity after a breakup. The apartment had a sloping floor, a ceiling pregnant with water damage, and a space heater that emitted the smell of burning hair, but not much in the way of heat. But it also had a panoramic view of Manhattan, and it was this view that pulled me from my slowly decomposing residence to the Lower East Side, where I found Russ & Daughters, a nearly century-old smoked-fish emporium. Between its neon sign, glorious aroma, and sense of history so palpable it could almost be sliced as surely as a side of lox, it was, in my eyes, a halfway house between a grandmother’s embrace and a place of worship. I ordered a sandwich involving whitefish salad, horseradish cream cheese, and an everything bagel, took a bite, and knew I was home. A few months later, I moved into my eighth apartment, four blocks away from the shop.

I can’t remember exactly when I met Paul, though I do remember being moderately afraid of him. Among the chorus of white-coated Russ & Daughters countermen, he stood out as a crank par excellence, a silver-haired middle finger pointed in the direction of the city’s capitulation to customer service with a smile. He could smite indecisive tourists with a single sneer, and dispatch more high-maintenance specimens with an eye-roll so world-weary it almost required its own cane. He was a throwback to a New York that predated frozen-yogurt chains and nail salons, a New York of daylight muggings and nighttime shooting galleries. He was old-school, full stop.

Over time, we developed what one might call a rapport. I’d ask him about his day. He’d ask me about my boyfriend. We’d bump fists over the counter. One afternoon, I brought in an old Polaroid Land camera, prompting Paul to reminisce about photos he took of various lady friends in, shall we say, compromising positions. “What do you want?” he asked, noting the look I had failed to suppress. “I was a red-blooded young man.”

I came to think of Paul as the dirty uncle I never had. He could work blue, but the only time he made me blush was when he told me, with a disarming degree of sincerity, that I was his “precious little angel.”

by Rebecca Flint Marx, Medium |  Read more:

Houses in Greenland
via:

Cass McCombs

The Decline of Wikipedia


The sixth most widely used website in the world is not run anything like the others in the top 10. It is not operated by a sophisticated corporation but by a leaderless collection of volunteers who generally work under pseudonyms and habitually bicker with each other. It rarely tries new things in the hope of luring visitors; in fact, it has changed little in a decade. And yet every month 10 billion pages are viewed on the English version of Wikipedia alone. When a major news event takes place, such as the Boston Marathon bombings, complex, widely sourced entries spring up within hours and evolve by the minute. Because there is no other free information source like it, many online services rely on Wikipedia. Look something up on Google or ask Siri a question on your iPhone, and you’ll often get back tidbits of information pulled from the encyclopedia and delivered as straight-up facts.

Yet Wikipedia and its stated ambition to “compile the sum of all human knowledge” are in trouble. The volunteer workforce that built the project’s flagship, the English-language Wikipedia—and must defend it against vandalism, hoaxes, and manipulation—has shrunk by more than a third since 2007 and is still shrinking. Those participants left seem incapable of fixing the flaws that keep Wikipedia from becoming a high-quality encyclopedia by any standard, including the project’s own. Among the significant problems that aren’t getting resolved is the site’s skewed coverage: its entries on Pokémon and female porn stars are comprehensive, but its pages on female novelists or places in sub-Saharan Africa are sketchy. Authoritative entries remain elusive. Of the 1,000 articles that the project’s own volunteers have tagged as forming the core of a good encyclopedia, most don’t earn even Wikipedia’s own middle-ranking quality scores.

The main source of those problems is not mysterious. The loose collective running the site today, estimated to be 90 percent male, operates a crushing bureaucracy with an often abrasive atmosphere that deters newcomers who might increase participation in Wikipedia and broaden its coverage.

In response, the Wikimedia Foundation, the 187-person nonprofit that pays for the legal and technical infrastructure supporting Wikipedia, is staging a kind of rescue mission. The foundation can’t order the volunteer community to change the way it operates. But by tweaking Wikipedia’s website and software, it hopes to steer the encyclopedia onto a more sustainable path.

The foundation’s campaign will bring the first major changes in years to a site that is a time capsule from the Web’s earlier, clunkier days, far removed from the easy-to-use social and commercial sites that dominate today. “Everything that Wikipedia is was utterly appropriate in 2001 and it’s become increasingly out of date since,” says Sue Gardner, executive director of the foundation, which is housed on two drab floors of a downtown San Francisco building with a faulty elevator. “This is very much our attempt to get caught up.” She and Wikipedia’s founder, Jimmy Wales, say the project needs to attract a new crowd to make progress. “The biggest issue is editor diversity,” says Wales. He hopes to “grow the number of editors in topics that need work.”

Whether that can happen depends on whether enough people still believe in the notion of online collaboration for the greater good—the ideal that propelled Wikipedia in the beginning. But the attempt is crucial; Wikipedia matters to many more people than its editors and students who didn’t make time to read their assigned books. More of us than ever use the information found there, both directly and via other services. Meanwhile, Wikipedia has either killed off the alternatives or pushed them down the Google search results.

by Tom Simonite, MIT Technology Review | Read more:
Image: Wikipedia

Edison’s Revenge


Fiddly cables, incompatible plugs and sockets, and the many adaptors needed to fit them all together used to be the travellers’ bane. But the USB (Universal Serial Bus) has simplified their lives. Most phones and other small gadgets can charge from a simple USB cable plugged into a computer or an adaptor. Some 10 billion of them are already in use. Hotel rooms, aircraft seats, cars and new buildings increasingly come with USB sockets as a standard electrical fitting.

Now a much bigger change is looming. From 2014, a USB cable will be able to provide power to bigger electronic devices. In the long term this could change the way homes and offices use electricity, cutting costs and improving efficiency.

The man who invented the USB, Ajay Bhatt of Intel, a chipmaker, barely thought about power. His main aim was to cut the clutter and time-wasting involved in plugging things into a computer. The keyboard, mouse, speakers and so forth all required different cables, and often drivers (special bits of software) as well. The USB connection’s chief role was to help computers and devices negotiate and communicate.

Mr Bhatt did not think he was creating a new charging system. Indeed, the trickle of electricity (up to ten watts on the existing standard) is still barely enough for devices such as an iPad. Yet USB charging is now the default for phones, e-readers and other small gadgets. Some mobile-phone manufacturers are already shipping their products without a power adaptor. Ingenious inventors have eked out the slender USB power supply to run fans, tiny fridges and toy rocket-launchers.

The big change next year will be a new USB PD (Power Delivery) standard, which brings much more flexibility and ten times as much oomph: up to 100 watts. (...)

Current affairs

That could presage a much bigger shift, reviving the cause of direct current (DC) as the preferred way to power the growing number of low-voltage devices in homes and offices. DC has been something of a poor relation in the electrical world since it lost out to alternating current (AC) in a long-ago battle in which AC’s champion, Nikola Tesla (pictured, left), trounced Thomas Edison (right). Tesla won, among other reasons, because it was (in those days) easier to shift AC power between different voltages. It was therefore a better system for transmitting and distributing electricity.

But the tide may be turning. Turning AC into the direct current required to power transistors (the heart of all electronic equipment) is a nuisance. The usual way is through a mains adaptor. These ubiquitous little black boxes are now cheap and light. But they are often inefficient, turning power into heat. And they are dumb: they run night and day, regardless of whether the price of electricity is high or low. It would be better to have a DC network, of the kind Mr Daniel has rigged up, for all electronic devices in a home or office.

This is where USB cables come in. They carry direct current and also data. That means they can help set priorities between devices that are providing power and those that are consuming it: for example, a laptop that is charging a mobile phone. “The computer can say ‘I need to start the hard disk now, so no charging for the next ten seconds’,” says Mr Bhatt. The new standard, with variable voltage and greater power, enlarges the possibilities. So does another new feature: that power can flow in any direction.
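To make the negotiation idea concrete, here is a minimal Python sketch of a power source and a power sink agreeing on a charging level under the new 100-watt ceiling, with the source able to suspend charging when it needs the power itself. It is a toy model under stated assumptions, not the actual USB Power Delivery protocol: every class, method and wattage figure below is invented purely for illustration.

```python
# Toy model of negotiated, reversible power sharing over a cable that also
# carries data. This is NOT the real USB PD message set; all names here are
# hypothetical and chosen only to mirror the example quoted above.

class PowerPort:
    """One end of the link in this toy model; it can source power up to a cap."""

    def __init__(self, name, max_source_watts):
        self.name = name
        self.max_source_watts = max_source_watts
        self.reserved_watts = 0.0  # power the device currently needs for itself

    def offer(self):
        """Advertise how much power this end can spare right now."""
        return max(self.max_source_watts - self.reserved_watts, 0.0)

    def reserve(self, watts):
        """Claim power for a local task, e.g. spinning up a hard disk."""
        self.reserved_watts += watts

    def release(self, watts):
        self.reserved_watts = max(self.reserved_watts - watts, 0.0)


def negotiate(source, sink, requested_watts, pd_ceiling=100.0):
    """Grant the smallest of the request, what the source can spare,
    and the 100 W ceiling of the new Power Delivery standard."""
    granted = min(requested_watts, source.offer(), pd_ceiling)
    print(f"{source.name} -> {sink.name}: {granted:.0f} W granted "
          f"(requested {requested_watts:.0f} W)")
    return granted


laptop = PowerPort("laptop", max_source_watts=60.0)
phone = PowerPort("phone", max_source_watts=5.0)

negotiate(laptop, phone, requested_watts=15.0)  # normal charging
laptop.reserve(55.0)                            # "I need the disk now"
negotiate(laptop, phone, requested_watts=15.0)  # charging throttled meanwhile
laptop.release(55.0)
negotiate(phone, laptop, requested_watts=10.0)  # power can also flow the other way
```

The point of the sketch is simply that because the cable carries data as well as current, the two ends can renegotiate the power contract on the fly and even swap roles, which is what distinguishes the new standard from a dumb charger.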

This chimes with another advantage. A low-voltage DC network works well with solar panels. These produce DC power at variable times and in variable amounts. They are increasingly cheap, and can fit in windows or on roofs. Though solar power is tricky to feed into the AC mains grid, it is ideally suited to a low-voltage local DC network. When the sun is shining, it can help charge all your laptops, phones and other battery-powered devices.

by The Economist |  Read more:
Image: Matt Herring

$10 Smartphone to Digital Microscope Conversion


The world is an interesting place, but it's fascinating up close. Through the lens of a microscope you can find details that would otherwise go unnoticed. Now you can see them for yourself.

This instructable will show you how to build a stand for about $10 that will transform your smartphone into a powerful digital microscope. This DIY conversion stand is more than capable of functioning in an actual laboratory setting. With magnification levels as high as 175x, plant cells and their nuclei are easily observed! In addition to allowing the observation of cells, this setup also produces stunning macro photography.

The photos in this instructable were taken with an iPhone 4S.
 

by Yoshinok, Instructables |  Read more:
Image: Yoshinok

Emil Nolde, Alps Mountain Landscape 1930.
via:

Tuesday, October 22, 2013

Are We Puppets in a Wired World?

Internet activities like online banking, social media, web browsing, shopping, e-mailing, and music and movie streaming generate tremendous amounts of data, while the Internet itself, through digitization and cloud computing, enables the storage and manipulation of complex and extensive data sets. Data—especially personal data of the kind shared on Facebook and the kind sold by the state of Florida, harvested from its Department of Motor Vehicles records, and the kind generated by online retailers and credit card companies—is sometimes referred to as “the new oil,” not because its value derives from extraction, which it does, but because it promises to be both lucrative and economically transformative.

In a report issued in 2011, the World Economic Forum called for personal data to be considered “a new asset class,” declaring that it is “a new type of raw material that’s on par with capital and labour.” Morozov quotes an executive from Bain and Company, which coauthored the Davos study, explaining that “we are trying to shift the focus from purely privacy to what we call property rights.” It’s not much of a stretch to imagine who stands to gain from such “rights.”

Individually, data points are typically small and inconsequential, which is why, day to day, most people are content to give them up without much thought. They only come alive in aggregate and in combination and in ways that might never occur to their “owner.” For instance, records of music downloads and magazine subscriptions might allow financial institutions to infer race and deny a mortgage. Or search terms plus book and pharmacy purchases can be used to infer a pregnancy, as the big-box store Target has done in the past. (...)
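A toy calculation shows how such aggregation works in principle. This is only a sketch: the signals and weights below are invented for illustration and do not describe Target's actual model, or any real retailer's or lender's scoring system.

```python
# Toy illustration of how individually innocuous data points combine into a
# sensitive inference. All signals and weights are hypothetical.

# Purchase and search events collected about one (imaginary) shopper.
events = {
    "unscented_lotion",
    "prenatal_vitamin_search",
    "oversized_handbag",
    "cotton_balls",
}

# Each signal alone says very little; a weighted combination says far more.
pregnancy_signals = {
    "unscented_lotion": 0.2,
    "prenatal_vitamin_search": 0.5,
    "oversized_handbag": 0.1,
    "cotton_balls": 0.1,
}

score = sum(weight for signal, weight in pregnancy_signals.items()
            if signal in events)

print(f"pregnancy score: {score:.1f}")
if score >= 0.7:
    print("flagged: targeted maternity marketing")
```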

This brings us back to DARPA and its quest for an algorithm that will sift through all manner of seemingly disconnected Internet data to smoke out future political unrest and acts of terror. Diagnosis is one thing, correlation something else, prediction yet another order of magnitude, and for better and worse, this is where we are taking the Internet. Police departments around the United States are using Google maps, together with crime statistics and social media, to determine where to patrol, and half of all states use some kind of predictive data analysis when making parole decisions. More than that, gush the authors of Big Data:
In the future—and sooner than we may think—many aspects of our world will be augmented or replaced by computer systems that today are the sole purview of human judgment…perhaps even identifying “criminals” before one actually commits a crime.
The assumption that decisions made by machines that have assessed reams of real-world information are more accurate than those made by people, with their foibles and prejudices, may be correct generally and wrong in the particular; and for those unfortunate souls who might never commit another crime even if the algorithm says they will, there is little recourse. In any case, computers are not “neutral”; algorithms reflect the biases of their creators, which is to say that prediction cedes an awful lot of power to the algorithm creators, who are human after all. Some of the time, too, proprietary algorithms, like the ones used by Google and Twitter and Facebook, are intentionally biased to produce results that benefit the company, not the user, and some of the time algorithms can be gamed. (There is an entire industry devoted to “optimizing” Google searches, for example.)

But the real bias inherent in algorithms is that they are, by nature, reductive. They are intended to sift through complicated, seemingly discrete information and make some sort of sense of it, which is the definition of reductive. But it goes further: the infiltration of algorithms into everyday life has brought us to a place where metrics tend to rule. This is true for education, medicine, finance, retailing, employment, and the creative arts. There are websites that will analyze new songs to determine if they have the right stuff to be hits, the right stuff being the kinds of riffs and bridges found in previous hit songs.

Amazon, which collects information on what readers do with the electronic books they buy—what they highlight and bookmark, if they finish the book, and if not, where they bail out—not only knows what readers like, but what they don’t, at a nearly cellular level. This is likely to matter as the company expands its business as a publisher. (Amazon already found that its book recommendation algorithm was more likely than the company’s human editors to convert a suggestion into a sale, so it eliminated the humans.)

Meanwhile, a company called Narrative Science has an algorithm that produces articles for newspapers and websites by wrapping current events into established journalistic tropes—with no pesky unions, benefits, or sick days required. Call me old-fashioned, but in each case, idiosyncrasy, experimentation, innovation, and thoughtfulness—the very stuff that makes us human—is lost. A culture that values only what has succeeded before, where the first rule of success is that there must be something to be “measured” and counted, is not a culture that will sustain alternatives to market-driven “creativity.”

by Sue Halpern, NY Review of Books |  Read more:
Image: Eric Edelman

Katsushika Hokusai - Kanagawa oki nami ura (1830-31) (variation)

Healthcare.gov: It Could Be Worse

On October 1st, the first day of the government shutdown, the U.S. Centers for Medicare & Medicaid Services launched Healthcare.gov, a four-hundred-million-dollar online marketplace designed to help Americans research and purchase health insurance. In its first days, only a small fraction of users could create an account or log in. The problems were initially attributed to high demand. But as days turned into weeks, Healthcare.gov’s troubles only seemed to multiply. Reports appeared of applications freezing half-completed and of the system “putting users in inescapable loops, and miscalculating healthcare subsidies.” Politico reported that “Web brokers … have been unable to connect to the federal system.” Healthcare.gov is the public face of the Obama Administration’s signature policy achievement, and its launch has been widely derided as a disaster. But it could have been worse.

On September 11, 2001, the F.B.I. was still using a computer system that couldn’t store or display pictures; entering data was time-consuming and awkward, and retrieving it even more so. A 9/11 Commission staff report concluded that “the FBI’s primary information management system, designed using 1980s technology already obsolete when installed in 1995, limited the Bureau’s ability to share its information internally and externally.” But an overhaul of that system had already begun in the months leading up to 9/11. In June, 2001, the F.B.I. awarded the contractor Science Applications International Corp. (S.A.I.C.) a fourteen-million-dollar contract to upgrade the F.B.I.’s computer systems. The project was called Virtual Case File, or V.C.F., and it would ultimately cost over six hundred million dollars before finally being abandoned, in early 2005, unfinished and never deployed. V.C.F. was then replaced with a project called Sentinel, expected to launch in 2009, which was “designed to be everything V.C.F. was not, with specific requirements, regular milestones and aggressive oversight,” according to F.B.I. officials who spoke to the Washington Post in 2006. But by 2010, Sentinel was also being described as “troubled,” and only two out of a planned four phases had been completed. Sentinel was finally deployed on July 1, 2012, after the F.B.I. took over the project from the contractor Lockheed-Martin in 2010, bringing it in-house for completion—at an ultimate cost of at least four hundred and fifty-one million dollars. In the end, the upgrade took the F.B.I. more than a decade and over a billion dollars.

Healthcare.gov is not so much a Web site as an interface for accessing a collection of databases and information systems. Behind the nicely designed Web forms are systems to create accounts, manage user logins, and collect insurance-application data. There’s a part that determines subsidy eligibility, a part that sends applications to the right insurance company, and other parts that glue these things together. Picture the dashboard of your car, which has a few knobs and buttons, some switches, and a big wheel—simple controls for a lot of complex machinery under the hood. All of these systems, whether in your car or on Healthcare.gov, have to communicate the right information at the right time for any of it to work properly. In the case of Healthcare.gov, we don’t know what precisely has gone wrong, because the system isn’t open-source—meaning the code used to build it isn’t available for anyone to see—and nobody involved has released technical information. But the multiple databases and subsystems are probably distributed all over the country, written in a variety of computer languages, and handle data in very different ways. Some are brand new, others are old.
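Since the real code is not public, the following is only a rough sketch of the kind of orchestration described above: a few stand-in subsystems (account creation, subsidy eligibility, insurer routing) held together by glue code. Every function, rule, threshold, and insurer name here is hypothetical; the eligibility check uses rough 2013 poverty-guideline figures purely for illustration.

```python
# Deliberately simplified sketch of several backend systems that must all
# answer correctly for one enrollment to succeed. Nothing here reflects the
# actual Healthcare.gov architecture, which has not been published.

from dataclasses import dataclass

@dataclass
class Application:
    name: str
    state: str
    household_income: float
    household_size: int

def create_account(app: Application) -> str:
    # Stand-in for the account-creation and login subsystem.
    return f"account-{app.name.lower()}"

def subsidy_eligible(app: Application) -> bool:
    # Stand-in rule: roughly 400% of the 2013 federal poverty guideline;
    # the real subsidy calculation is far more involved.
    poverty_line = 11_490 + 4_020 * (app.household_size - 1)
    return app.household_income <= 4 * poverty_line

def route_to_insurer(app: Application) -> str:
    # Stand-in routing table with invented plan names.
    plans = {"NY": "Empire Exchange", "TX": "Lone Star Health"}
    return plans.get(app.state, "Federal fallback plan")

def enroll(app: Application) -> dict:
    # The "dashboard": simple glue over the machinery under the hood. If any
    # one subsystem fails or responds slowly, the whole enrollment stalls.
    return {
        "account": create_account(app),
        "subsidized": subsidy_eligible(app),
        "insurer": route_to_insurer(app),
    }

print(enroll(Application(name="Ada", state="NY",
                         household_income=28_000, household_size=2)))
```

Even in this tiny sketch the fragility is visible: the pleasant form on top is only as reliable as the slowest or flakiest of the services it stitches together.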

For large software projects, failure is generally determined early in the process, because failures almost exclusively have to do with planning: the failure to create a workable plan, to stick to it, or both. Healthcare.gov reportedly involved over fifty-five contractors, managed by a human-services agency that lacked deep experience in software engineering or project management. The final product had to be powerful enough to navigate any American through a complex array of different insurance offerings, secure enough to hold sensitive private data, and robust enough to withstand peak traffic in the hundreds of thousands, if not millions, of concurrent users. It also had to be simple enough so that anyone who can open a Web browser could use it. In complexity, this is a project on par with the F.B.I.’s V.C.F. or Sentinel. The number and variety of systems to be connected may not be quite as large, but the interface had to be usable by anyone, without special training. And, unlike V.C.F., Healthcare.gov was given only twenty-two months from contract award to launch—less than two years for a project similar to one that took the F.B.I. more than ten years and over twice the budget.

by Rusty Foster, New Yorker |  Read more:
Image: Michael Kupperman

Reflections on a Paris Left Behind


Even Hemingway struggled with this city, working on a memoir of his poor early days, “A Moveable Feast,” off and on for years, before it was finally published after his death. Christopher Hitchens once called it “an ur-text of the American enthrallment with Paris,” identifying an unthinking nostalgia “as we contemplate a Left Bank that has since become a banal tourist enclave in a Paris where the tough and plebeian districts are gone, to be replaced by seething Muslim banlieues all around the periphery.”

Sometimes, reading about Paris in newspapers, magazines and on Web sites devoted to tourism, I feel the clichés piling high enough to touch the Eiffel Tower — or even the still-hideous Tour Montparnasse, which for decades has given skyscrapers a bad name here.

All the clichés are still there, if that’s as far as you’re willing to look, from the supposedly haughty waiters to the baguettes and croissants and the nighttime lights on the Notre-Dame de Paris, shimmering with a faith now largely abandoned. (...)

There are parts of Paris that are “cool,” to be sure, but not the way London is, or Berlin, or even Amsterdam. Paris is a city of the well-to-do, mostly white, and their careful pleasures: museums, restaurants, opera, ballet and bicycle lanes. Bertrand Delanoë, the Paris mayor since 2001, is a Socialist Michael Bloomberg — into bobo virtues like health and the environment and very much down on cars.

Adam Gopnik, a New Yorker writer, finds “the Parisian achievement” to have created, in the 19th century, two concepts of society: “the Haussmannian idea of bourgeois order and comfort, and the avant-garde of ‘la vie de bohème.’ ” While these two societies seemed to be at war, he suggests, in fact they were “deeply dependent on each other.”

Today, however, the balance is gone, and Paris is too ordered, too antiseptic and too tightly policed to have much of a louche life beyond bourgeois adulteries. In that sense, something important has been lost. (...)

Paris is the most beautiful city in the world; to me, only Prague comes close. But Paris is also filthy. While tourists regard Paris with awe and respect, for the most part many Parisians treat it with studied indifference, a high virtue here, or with contempt.

It is the Parisians who leave dog excrement on the sidewalks, who ignore the trash containers. With smoking now supposedly banned inside restaurants, the terraces of cafes become more crowded. But the streets have become ashtrays, and the rubbish defeats the traditional sluicing of the gutters with city water by men with long green nylon brushes. Large parts of Paris remind me of how, in the never quite-so-bad old days, Times Square used to look at 8 a.m. on a Sunday.

France still gets more foreign tourists than any other country: 83 million in 2012, 83 percent of them from Europe, compared with only 29.3 million who visited Britain. Paris alone gets 33 million tourists a year, half of them foreigners, many in search of that mythical place where Charles Aznavour meets Catherine Deneuve meets Zidane meets Dior, all drinking Champagne and nibbling foie gras, truffles, oysters and langouste.

While tourists to Israel sometimes suffer from the Jerusalem syndrome, imagining themselves in direct contact with God, some Japanese tourists suffer from what is called the “Paris Syndrome,” distraught at the difference between what they imagine and what they find. Of course, as Walt Whitman wrote about himself, Paris contains multitudes, and most visitors go away having found just enough of what they craved to develop a lifelong yearning to return.

by Steven Erlanger, NY Times |  Read more:
Image: Kosuke Okahara

New Technique Holds Promise for Hair Growth

Scientists have found a new way to grow hair, one that they say may lead to better treatments for baldness.

So far, the technique has been tested only in mice, but it has managed to grow hairs on human skin grafted onto the animals. If the research pans out, the scientists say, it could produce a treatment for hair loss that would be more effective and useful to more people than current remedies like drugs or hair transplants.

Present methods are not much help to women, but a treatment based on the new technique could be, the researchers reported Monday in Proceedings of the National Academy of Sciences.

Currently, transplants move hair follicles from the back of the head to the front, relocating hair but not increasing the amount. The procedure can take eight hours, and leave a large scar on the back of the head. The new technique would remove a smaller patch of cells involved in hair formation from the scalp, culture them in the laboratory to increase their numbers, and then inject them back into the person’s head to fill in bald or thinning spots. Instead of just shifting hair from one spot to another, the new approach would actually add hair. (...)

In the current study, Dr. Christiano worked with researchers from Durham University in Britain. They focused on dermal papillae, groups of cells at the base of hair follicles that give rise to the follicles. Researchers have known for more than 40 years that papilla cells from rodents could be transplanted and would lead to new hair growth. The cells from the papillae have the ability to reprogram the surrounding skin cells to form hair follicles.

But human papilla cells, grown in culture, mysteriously lose the ability to make hair follicles form. A breakthrough came when the researchers realized they might be growing the cells the wrong way.

One of Dr. Christiano’s partners from Durham University, Dr. Colin Jahoda, noticed that the rodent papilla cells formed clumps in culture, but the human cells did not. Maybe the clumps were important, he reasoned. So, instead of trying to grow the cells the usual way, in a flat, one-cell layer on a petri dish, he turned to an older method called the “hanging drop culture.”

That method involves putting about 3,000 papilla cells — the number in a typical papilla — into a drop of culture medium on the lid of a dish, and then flipping the lid over so that the drops are hanging upside down.

“The droplets aren’t so heavy that they drip off,” Dr. Christiano said. “The force of gravity just takes the 3,000 cells and draws them into an aggregate at the bottom of the drop.”

The technique made all the difference. The cells seem to need to touch one another in three dimensions rather than two to send and receive the signals they need to induce hair formation.

by Denise Grady, NY Times |  Read more:
Image: Ruth Fremson