Monday, February 11, 2013

Before Greed

Speaking in New Haven in 1860, Abraham Lincoln told an audience, “I am not ashamed to confess that 25 years ago I was a hired laborer, mauling rails, at work on a flat-boat—just what might happen to any poor man’s son.” After his death, Lincoln’s personal trajectory from log cabin to White House emerged as the ideal American symbol. Anything was possible for those who strived.

But the goal of this striving was not great wealth. Perhaps the most revealing memorial to Lincoln and his world is found in one of the most mundane of American documents: the census. There he is in the Springfield, Illinois, listing of 1860: Abraham Lincoln, 51 years old, lawyer, owner of a home worth $5,000, with $12,000 in personal property. His neighbor Lotus Niles, a 40-year-old secretary—equivalent to a manager today—had accumulated $7,000 in real estate and $2,500 in personal property. Nearby was Edward Brigg, a 48-year-old teamster from England, with $4,000 in real estate and $300 in personal property. Down the block lived Richard Ives, a bricklayer with $4,000 in real estate and $4,500 in personal property. The highest net worth in the neighborhood belonged to a 50-year-old livery stable owner, Henry Corrigan, with $30,000 in real estate but only $300 in personal property. This was a town and a country where bricklayers, lawyers, stable owners, and managers lived in the same areas and were not much separated by wealth. Lincoln was one of the richer men in Springfield, but he was not very rich.

Not only was great wealth an aberration in Lincoln’s time, but even the idea that the accumulation of great riches was the point of a working life seemed foreign. Whereas today the most well-off frequently argue that riches are the reward of hard work, in the Civil War era, the reward was a “competency,” what the late historian Alan Dawley described as the ability to support a family and have enough in reserve to sustain it through hard times at an accustomed level of prosperity. When, through effort or luck, a person amassed not only a competency but enough to support himself and his family for his lifetime, he very often retired. Philip Scranton, an industrial historian, writes of one representative case: Charles Schofield, a successful textile manufacturer in Philadelphia who, in 1863, sold his interest in his firm for $40,000 and “retired with a competency.” Schofield, who was all of 29 years old, considered himself “opulent enough.” The idea of having enough frequently trumped the ambition for endless accumulation.

As the men and women of Lincoln’s and Schofield’s generations aged, they retained the ideal of progress from poverty to competency. Later in the century, midwestern publishers created county histories that featured images of their subscribers’ homesteading progress, from “first home in the woods” to comfortable farm. The “mug books”—so called because they included images not only of cabins and farms but also of their owners—captured the trajectory of these American lives and the achievement of their economic ambitions: the creation of prosperous homes. They built them, but they could build them because they were citizens of a democratic republic. The opportunity to build secure homes was part of the purpose of the economy.

For a moment at the end of the Civil War, it seemed the liberal ideal of a republican citizenry, in which autonomous individuals build a society based on contracts, would reach fruition in a United States where extremes of wealth and poverty were largely nonexistent. Instead, by 1900, extremes of the sort that hadn’t been seen since the abolition of slavery were de rigueur. In 1860 there was only one Cornelius Vanderbilt, but 40 years later, the concentration of wealth in the corporate form ensured an enlarged class of the super-rich.

by Richard White, Boston Review |  Read more:
Photo: Library of Congress

Never on a Saturday

Earlier this week, the United States Post Office announced that come August, it would be suspending regular home delivery of the mail on Saturdays, except for package service. The USPS is in financial straits, and the budget-cutting move will save about $2 billion in its first year, putting a dent in the $16 billion it lost in 2012 alone.

The Post Office has come under financial pressure from a number of sources over the past decade. Of course the internet has siphoned off traffic. It has also lost market share to private carriers like Federal Express and United Parcel Service, which cut into the lucrative package and overnight delivery markets while leaving the USPS with an unenviable monopoly on the money-losing but vitally important national letter-and-stamp service. Despite regularly increasing rates over the last decade, the United States still offers one of the cheapest such services in the world, with a flat fee of 46 cents to send a 1 oz. envelope first class anywhere in the country.

For less than half a dollar, you can send a birthday card from Maine to Hawai’i, and be confident that it will arrive in 2-3 days. Pretty impressive. Especially when compared to other nations, almost all of which charge more for an ounce of domestic mail, even though most of them are quite a bit smaller in size. The chart below compares rates from 2011.

Another financial constraint comes from the fact that, other than some small subsidies for overseas U.S. electoral ballots, the USPS is a government agency that pays its own way, operating without any taxpayer dollars for about thirty years now.

However, the biggest factor in its recent financial free fall is undoubtedly the Postal Accountability and Enhancement Act of 2006 (PAEA), which Republicans pushed through Congress and President George W. Bush signed into law. The PAEA required the Post Office to fully fund its retiree healthcare costs through the year 2081.

Yes, you read that right. 2081. And it was given only 10 years to find the money to fund 75 years' worth of retirement healthcare benefits. To clarify just how odious this regulation is, think about it like this. In the next three years, the Post Office must finish finding the money to fully fund not only all of its current retirees and current workers, but also decades' worth of future workers it hasn't hired yet. Indeed, some of the future retired workers in question weren't even born yet when PAEA was signed into law.

Needless to say, no other federal, state, or local government agency, much less any private company, has such a mandate, and the USPS is now bleeding money like it was shivved in a prison shower stall, which, metaphorically speaking, it was. Though cloaked in the mantle of fiscal responsibility, the PAEA was really an attack on the postal workers' union and a nod to the USPS's private competitors.

by Akim Reinhardt, 3 Quarks Daily | Read more:
Image: Charles Schulz

Friday, February 8, 2013

The Idealist


At the beginning of every year, Aaron Swartz would post an annotated list of everything he’d read in the last 12 months. His list for 2011 included 70 books, 12 of which he identified as “so great my heart leaps at the chance to tell you about them even now.” One of these was Franz Kafka’s The Trial, about a man caught in the cogs of a vast bureaucracy, facing charges and a system that defy logical explanation. “I read it and found it was precisely accurate—every single detail perfectly mirrored my own experience,” Swartz wrote. “This isn’t fiction, but documentary.”

At the time of his death, the 26-year-old Swartz had been pursued by the Department of Justice for two years. He was charged in July 2011 with accessing MIT's computer network without authorization and using it to download 4.8 million documents from the online database JSTOR. His actions, the government alleged, violated Title 18 of the U.S. Code and carried penalties of up to 50 years in jail and $1 million in fines.

The case had sapped Swartz’s finances, his time, and his mental energy and had fostered a sense of extreme isolation. Though his lawyers were working hard to strike a deal, the government’s position was clear: Any plea bargain would have to include at least a few months of jail time.

A prolonged indictment, a hard-line prosecutor, a dead body—these are the facts of the case. They are outnumbered by the questions that Swartz’s family, friends, and supporters are asking a month after his suicide. Why was MIT so adamant about pressing charges? Why was the DOJ so strict? Why did Swartz hang himself with a belt, choosing to end his own life rather than continue to fight?

When you kill yourself, you forfeit the right to control your own story. At rallies, on message boards, and in media coverage, you will hear that Swartz was felled by depression, or that he got caught in a political battle, or that he was a victim of a vindictive state. A memorial in Washington, D.C., this week turned into a battle over Swartz’s legacy, with mourners shouting in disagreement over what policy changes should be enacted to honor his memory.

Aaron Swartz is a difficult puzzle. He was a programmer who resisted the description, a dot-com millionaire who lived in a rented one-room studio. He could be a troublesome collaborator but an effective troubleshooter. He had a talent for making powerful friends, and for driving them away. He had scores of interests, and he indulged them all. In August 2007, he noted on his blog that he’d “signed up to build a comprehensive catalog of every book, write three books of my own (since largely abandoned), consult on a not-for-profit project, help build an encyclopedia of jobs, get a new weblog off the ground, found a startup, mentor two ambitious Google Summer of Code projects (stay tuned), build a Gmail clone, write a new online bookreader, start a career in journalism, appear in a documentary, and research and co-author a paper.” Also, his productivity had been hampered because he’d fallen in love, which “takes a shockingly huge amount of time!”

He was fascinated by large systems, and how an organization’s culture and values could foster innovation or corruption, collaboration or paranoia. Why does one group accept a 14-year-old as an equal partner among professors and professionals while another spends two years pursuing a court case that’s divorced from any sense of proportionality to the alleged crime? How can one sort of organization develop a young man like Aaron Swartz, and how can another destroy him?

Swartz believed in collaborating to make software and organizations and government work better, and his early experiences online showed him that such things were possible. But he was better at starting things than he was at finishing them. He saw obstacles as clearly as he saw opportunity, and those obstacles often defeated him. Now, in death, his refusal to compromise has taken on a new cast. He was an idealist, and his many projects—finished and unfinished—are a testament to the barriers he broke down and the ones he pushed against. This is Aaron Swartz’s legacy: When he thought something was broken, he tried to fix it. When he failed, he tried to fix something else.

Eight or nine months before he died, Swartz became fixated on Infinite Jest, David Foster Wallace’s massive, byzantine novel. Swartz believed he could unwind the book’s threads and assemble them into a coherent, easily parsed whole. This was a hard problem, but he thought it could be solved. As his friend Seth Schoen wrote after his death, Swartz believed it was possible to “fix the world mainly by carefully explaining it to people.”

It wasn’t that Swartz was smarter than everyone else, says Taren Stinebrickner-Kauffman—he just asked better questions. In project after project, he would probe and tinker until he’d teased out the answers he was looking for. But in the end, he was faced with a problem he couldn’t solve, a system that didn’t make sense.

by Justin Peters, Slate |  Read more:
Photo by Sage Ross/Flickr/Wikimedia Commons

Beat By Dre: The Exclusive Inside Story of How Monster Lost the World


There's never been anything like Beats By Dre. The bulky rainbow headphones are a gaudy staple of malls, planes, clubs, and sidewalks everywhere: as mammoth, beloved, and expensive as their namesake. But Dr. Dre didn't just hatch the flashy lineup from his freight train chest: The venture began as an unlikely partnership between a record-industry powerhouse and a boutique audio company best known for making overpriced HDMI cables.

You might know this; you might own a pair of Beats that still has Monster's tiny, subjugated logo printed on it. But what you don't know is how, in inking the deal, Monster screwed itself out of a fortune. It's the classic David vs. Goliath story—with one minor edit: David gets his ass kicked and is laughed out of the arena. This is the inside story of one of the all-time worst deals in tech.

The route to a rapper-gadget sensation doesn't start in the VIP section of a club over a bottle of Cristal. The idea wasn't hatched in the back of a Maybach or in a boardroom whose walls are decked out in platinum records and shark tanks. Before Dre got paid, and red 'B' logos clamped onto millions of young heads across the globe, the son of Chinese immigrants started toying with audio equipment in California.

Beats begins with Monster, Inc., and Monster begins with Noel Lee. He's a friendly, incredibly smart man with a comic-book hairstyle and a disability that adds to his supervillain stature: Lee is unable to walk. Instead, he glides around on a chrome-plated Segway. Lee has been making things for your ears since 1979, after he took an engineering education and spun it into a components business with one lucrative premise: your music doesn't sound as good as it could.

In true Silicon Valley fashion, Lee started out in his family's basement: taste-testing different varieties of copper wire until he found a type that he thought enhanced audio quality. Then, also in Silicon Valley fashion, he marketed the shit out of it and jacked up its price: Monster Cable. Before it was ever mentioned in the same gasp as Dre, Monster was trying to get music lovers to buy into a superior sound that existed mostly in imaginations and marketing brochures. "We came up with a reinvention of what a speaker cable could be," Noel Lee boasts. His son, Kevin, describes it differently: "a cure for no disease."

Monster expanded into pricey HDMI cables, surge protectors, and... five different kinds of screen-cleaner. Unnecessary, overpriced items like these have earned Monster a reputation over the years as ripoff artists, but that belies the company's ability to make audio products that are actually pretty great. The truth is, audio cable is a lot like expensive basketball shoes: There are a couple hundred people in the world who really need the best, and the rest of us probably can't tell the difference. Doesn't matter: Through a combination of slick persuasion and status-pushing, Noel Lee carved out a small empire.

But you can only sell so many $200 cables. The next step was speakers, but the company started in on speakers too late; the hi-fi era was over. Plenty of people were content with the sound their TVs made, or at most, a soundbar. Monster took a bath.

But speakers for your head? This was the absolute, legit next big thing.

by Sam Biddle, Gizmodo |  Read more:
Image: uncredited

Thursday, February 7, 2013


“Courage is doing what is right; tranquility is courage in repose.” - Inazo Nitobe
via:

Yodamanu Reflets I, Strasbourg 2011.

Caring on Stolen Time: A Nursing Home Diary

I work in a place of death. People come here to die, and my co-workers and I care for them as they make their journeys. Sometimes these transitions take years or months. Other times, they take weeks or some short days. I count the time in shifts, in scheduled state visits, in the sham monthly meetings I never attend, in the announcements of the “Employee of the Month” (code word for best ass-kisser of the month), in the yearly pay increment of 20 cents per hour, and in the number of times I get called into the Human Resources office.

The nursing home residents also have their own rhythms. Their time is tracked by scheduled hospital visits; by the times when loved ones drop by to share a meal, to announce the arrival of a new grandchild, or to wait anxiously at their bedsides for heart-wrenching moments to pass. Their time is measured by transitions from processed food to pureed food, textures that match their increasing susceptibility to dysphagia. Their transitions are also measured by the changes from underwear to pull-ups and then to diapers. Even more than the loss of mobility, the use of diapers is often the most dreaded adaptation. For many people, lack of control over urinary functions and timing is the definitive mark of the loss of independence.

Many of the elderly I have worked with are, at least initially, aware of the transitions and respond with a myriad of emotions from shame and anger to depression, anxiety, and fear. Theirs was the generation that survived the Great Depression and fought the last “good war.” Aging was an anti-climactic twist to the purported grandeur and tumultuousness of their mid-twentieth-century youth.

“I am afraid to die. I don’t know where I will go,” a resident named Lara says to me, fear dilating her eyes.

“Lara, you will go to heaven. You will be happy,” I reply, holding the spoonful of pureed spinach to her lips. “Tell me about your son, Tobias.”

And so Lara begins, the same story of Tobias, of his obedience and intelligence, which I have heard over and over again for the past year. The son whom she loves, whose teenage portrait stands by her bedside. The son who has never visited, but whose name and memory calm Lara.

Lara is always on the lookout, especially for Alba and Mary, the two women with severe dementia who sit on both sides of her in the dining room. To find out if Alba is enjoying her meal, she will look to my co-worker Saskia to ask, “Is she eating? If she doesn’t want to, don’t force her to eat. She will eat when she is hungry.” Alba, always cheerful, smiles. Does she understand? Or is she in her usual upbeat mood? “Lara, Alba’s fine. With you watching out for her, of course she’s OK!” We giggle. These are small moments to be cherished.

In the nursing home, such moments are precious because they are accidental moments.

The residents run on stolen time. Alind, like me, a certified nursing assistant (CNA), comments, “Some of these residents are already dead before they come here.”

By “dead,” he is not referring to the degenerative effects of dementia and Alzheimer’s disease but to the sense of hopelessness and loneliness that many of the residents feel, not just because of physical pain, not just because of old age, but as a result of the isolation, the abandonment by loved ones, the anger of being caged within the walls of this institution. This banishment is hardly the ending they toiled for during their industrious youth.

by Jomo, Dissent |  Read more:
Photo via:

The Marvelous Marie Curie

Marie Curie (1867–1934) is not only the most important woman scientist ever; she is arguably the most important scientist all told since Darwin. Einstein? In theoretical brilliance he outshone her — but her breakthroughs, by Einstein’s own account, made his possible. She took part in the discovery of radioactivity, a term she coined; she identified it as an atomic property of certain elements. When scoffers challenged these discoveries, she meticulously determined the atomic weight of the radioactive element she had revealed to the world, radium, and thereby placed her work beyond serious doubt. Yet many male scientists of her day belittled her achievement, even denied her competence. Her husband, Pierre Curie, did the real work, they insisted, while she just went along for the wifely ride. Chauvinist condescension of this order would seem to qualify Marie Curie as belle idéale of women’s studies, icon for the perennially aggrieved. But such distinction better suits an Aphra Behn or Artemisia Gentileschi than it does a Jane Austen or Marie Curie. Genuine greatness deserves only the most gracious estate, not an academic ghetto, however fashionable and well-appointed.

Yet the fact remains: much of the interest in Madame Curie stems from her having been a woman in the man’s world of physics and chemistry. The interest naturally increases as women claim their place in that world; with this interest comes anger, sometimes righteous, sometimes self-righteous, that difficulties should still stand in the way. A president of Harvard can get it in the neck for suggesting that women don’t have the almost maniacal resolve it takes to become first-rate scientific researchers — that they are prone to distraction by such career-killers as motherhood. So Marie Curie’s singularity cannot but be enveloped in the sociology of science, which is to say these days, feminist politics.

The sociology is important, as long as one remembers the singularity. For Marie Curie did have the almost maniacal resolve to do great scientific work. The work mattered as much to her as it does to most any outstanding scientist; yet can one really say it was everything? She passionately loved her husband and, after his premature death, loved another scientist of immense talent, perhaps of genius; she had the highest patriotic feeling for her native Poland and her adopted France, and risked her life in wartime; she raised two daughters, one, Irène, a Nobel Prize laureate in chemistry, the other, Ève, an accomplished writer, most notably as her mother’s biographer.

Madame Curie’s life reads almost like a comic-book adventure version of feminine heroism: the honest-to-goodness exploits of the original Wonder Woman; the one and only real deal; accept no imitations. Of course, imitation is precisely what such a life tends to inspire in the most zealous and worthy admirers. Madame Curie, however, explicitly warned such aspirants to scientific immortality that the way was unspeakably lonesome and hard, as her daughter Ève Curie records her saying in the 1937 biography Madame Curie. “Madame Curie avoided even that element of vanity that might most easily have been forgiven her: to let herself be cited as an example to other women. ‘It isn’t necessary to lead such an anti-natural existence as mine,’ she sometimes said to calm her overmilitant admirers. ‘I have given a great deal of time to science because I wanted to, because I loved research.... What I want for women and young girls is a simple family life and some work that will interest them.’” Better for gifted women to find some smaller work they enjoy doing and fit it into a life of traditional completeness. But hadn’t Madame Curie herself done it all, and on the titanic scale that launched so many dreamers toward the most earnest fantasies, and in many cases the most heartening achievements? How could she warn others off the path she had traveled? Despite her professions that she had taken the course right for her, did she really regret having traveled it?

One can only say that her intensity was preternatural. She could not have lived otherwise than she did: like a demon’s pitchfork or an angel’s whisper, the need to know, and to be known for knowing — though only among those who mattered, the serious ones like her, for she despised celebrity — drove her on relentlessly. Hardship and ill fortune accompanied her all her days. There seemed to be no ordeal she could not power her way through. Her indomitable will served her voracious intelligence. But for every accomplishment, for every distinction, for every rare joy, she paid and paid. Interludes of happiness brightened the prevailing emotional murk, but the murk did prevail. Episodes of major depression began in childhood and became a fixture. At various times in her life she thought seriously of suicide.

Love could be lost, and forever; children failed to fill the void; only work provided reliable solace and meaning. So she worked. She worked doggedly, devotedly, brilliantly. Scientific work was not simply diversion from the pains of living; it was a way of life, like Socratic philosophy, from which Madame Curie appeared to have acquired the guiding principle: “Nothing in life is to be feared. It is only to be understood.” Whether the unforeseen consequences of her work still sustain that sublime credo is a question as yet unresolved.

by Algis Valiunas, The New Atlantis | Read more:
Photo via:

How the Gun-Control Movement Got Smart


Here is how advocates of gun control used to talk about their cause: They openly disputed that the Second Amendment conferred the right to own a gun. Their major policy goals were to make handguns illegal and enroll all U.S. gun owners in a federal database. The group now known as the Brady Campaign to Prevent Gun Violence was once known as Handgun Control Inc.; a 2001 book by the executive director of the Violence Policy Center was entitled Every Handgun Is Aimed at You: The Case for Banning Handguns.

Contrast that with what you see today: Gun-control groups don't even use the term "gun control," with its big-government implications, favoring "preventing gun violence" instead. Democratic politicians preface every appeal for reform with a paean to the rights enshrined in the Second Amendment and bend over backwards to assure "law-abiding gun owners" they mean them no ill will. Even the president, a Chicago liberal who once derided rural voters' tendency to "cling to guns or religion," seeks to assure gun enthusiasts he's one of them by citing a heretofore-unknown enthusiasm for skeet shooting, adding, "I have a profound respect for the traditions of hunting that trace back in this country for generations. And I think those who dismiss that out of hand make a big mistake."

A frequent question in the current battle over gun control is why anyone should expect reform to succeed now when it's failed repeatedly for the last 20 years. Maybe this is why: Between then and now, advocates of gun control got smarter. They've radically changed their message into one that's more appealing to Middle America and moderate voters.

In the late '90s, "Democrats and gun-control groups had approached the debate consistently in a way that deeply, almost automatically alienated a lot of gun owners," said Jon Cowan, former president of a now-defunct group called Americans for Gun Safety.

The story of the way the gun debate changed is largely the story of AGS. Formed in 2000 by Andrew McKelvey, the CEO of Monster.com, the group sought to reset the terms of the debate and steer the gun-control movement away from its inward-looking, perpetually squabbling, far-left orientation. The various advocacy groups were often more concerned with fighting with each other than with taking the fight to their opponents, and a vocal contingent valued ideological purity over pragmatism. (...)

"There was as much fighting between the groups as with the opposition," David Hantman, a former aide to the bill's sponsor, Senator Dianne Feinstein, recalled. "Some of them insisted that we couldn't just renew [the ban], we had to strengthen it." With Republicans controlling the White House and both houses of Congress, that wasn't politically feasible, and the ban was allowed to lapse. Around the same time, legislation to close the "gun-show loophole" by requiring background checks for non-dealer gun sales was defeated, and Congress passed a bill according gun manufacturers blanket immunity from product-liability lawsuits.

McKelvey, a Yellow Pages ad marketer-turned-tech billionaire, came to the gun issue after being shocked by Columbine. Described by friends as an apolitical businessman who enjoyed hunting (he died of cancer in 2008), McKelvey was frustrated by the tone-deaf approach he saw the gun-control movement taking. He joined the board of Handgun Control Inc. and immediately began pressuring the group to change its name, promising substantial financial support in exchange for such a move; when the group resisted, he quit the board and set out to form his own group -- AGS.

If the NRA today seems fixated on the notion that the left is out to undercut the Second Amendment, confiscate law-abiding Americans' legally acquired firearms, and instigate federal-government monitoring of all gun owners, that's because 15 years ago, gun-control advocates wanted to do all of those things.

by Molly Ball, The Atlantic |  Read more:
Photo: Pete Souza/White House

Wednesday, February 6, 2013


Jennifer Laura Palmer, Number 78 from The Drawing Project, Ink on Paper 9 x 6 in., 2012.
via:

01-04-13 by Lee Kaloidis on Flickr.
via:

Kaunolu



[ed. Pictured are an old heiau site at the summer fishing palace of King Kamehameha I at Kaunolu (top) and Kahekili's Leap near the same location, where warriors would prove their bravery by diving 80 feet over a coral rock ledge into 3-6 m. of water (bottom).]

Photos: markk

Examining the Popularity of Wikipedia Articles: Catalysts, Trends, and Applications

On February 12, 2012, news of Whitney Houston's death brought 425 hits per second to her Wikipedia article, the highest peak traffic on any article since at least January 2010.

It is broadly known that Wikipedia is the sixth most popular website on the Internet, and the English Wikipedia now has over 4 million articles and 29 million total pages. Much less attention has been given to traffic patterns and trends in the content viewed. The Wikimedia Foundation makes available aggregate raw article view data for all of its projects.
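
[ed. For readers who want to poke at these numbers themselves, here is a minimal Python sketch of how one hour of that raw view data can be tallied. It assumes the commonly documented per-line layout of the hourly "pagecounts" dumps (project code, URL-encoded page title, hourly request count, bytes transferred); the filename below is only a hypothetical example.]

```python
import gzip
from collections import Counter

# Hypothetical example filename for one hourly dump; point this at a real file.
DUMP_FILE = "pagecounts-20120212-010000.gz"

def top_articles(path, project="en", limit=10):
    """Tally views per title for one project from a raw hourly dump file.

    Assumes each line reads: project code, URL-encoded title, view count, bytes.
    """
    counts = Counter()
    with gzip.open(path, "rt", encoding="utf-8", errors="replace") as f:
        for line in f:
            parts = line.split()
            if len(parts) != 4 or parts[0] != project:
                continue
            title, views = parts[1], parts[2]
            if views.isdigit():
                counts[title] += int(views)
    return counts.most_common(limit)

if __name__ == "__main__":
    for title, views in top_articles(DUMP_FILE):
        # Dividing an hourly count by 3600 gives views-per-second figures
        # of the kind quoted in Tab. 1.
        print(f"{title}\t{views}\t{views / 3600:.1f}/sec")
```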

This article attempts to convey some of the fascinating phenomena that underlie extremely popular articles, and perhaps more importantly to editors, discusses how this information can be used to improve the project moving forward. While some dismiss view spikes as the manifestation of shallow pop culture interests (e.g., Justin Bieber is the 6th most popular article over the past 3 years, see Tab. 2), these are valuable opportunities to study reader behavior and to shape the public perception of our projects. (...)

The origins of heightened popularity

Articles which are "extremely popular" on Wikipedia fall into the category of either (1) occasional or isolated popularity, or (2) consistent popularity.

Tab. 1. The most viewed pages on Wikipedia in a one-hour period, since January 1, 2010 (excluding duplicate entries and DOS attacks)

Article | Date | Views in one hour | Views per second | Catalyst
Whitney Houston | 12 Feb 2012 | 1,532,302 | 425.6 | Death of subject
Amy Winehouse | 23 Jul 2011 | 1,359,091 | 377.5 | Death of subject
Steve Jobs | 6 Oct 2011 | 1,063,665 | 295.5 | Death of subject
Madonna (entertainer) | 6 Feb 2012 | 993,062 | 275.9 | Super Bowl halftime show
Osama bin Laden | 2 May 2011 | 862,169 | 239.5 | Death of subject
The Who | 7 Feb 2010 | 567,905 | 157.8 | Super Bowl halftime show
Ryan Dunn | 20 Jun 2011 | 522,301 | 145.1 | Death of subject
Jodie Foster | 14 Jan 2013 | 451,270 | 125.4 | Golden Globes speech

The prime sources of occasional or isolated popularity include:

Cultural events and deaths: The best way to reach the highest levels of Wikipedia popularity is to be a celebrity who (a) dies, or (b) plays the Super Bowl halftime show (see Tab. 1). This year's Super Bowl entertainment, Beyoncé Knowles, just missed the chart with 100-110 views/second. Generally, prominent deaths dominate the top-100 traffic events and beyond. However, less morbid events are occasionally on the same scale, such as Jodie Foster following her recent coming out at the 2013 Golden Globes, Bubba Watson upon winning the 2012 Masters Tournament, and Ice hockey at the 2010 Winter Olympics during the final match between the U.S. and Canada (all drew over 250,000 views in a single hour).

Google Doodles: Google often replaces its logo to commemorate anniversaries and other events, and clicking on the logo will usually produce the search results for that topic. With Wikipedia appearing first for many search engine queries, this can be a tremendous source of traffic. When the 110th birthday of Dennis Gabor was celebrated in this fashion on June 5, 2010, his article peaked at over 55 views per second (this for an article that currently sees only about 140 views per day). There are many other examples, including Winsor McCay on October 15, 2012, Gideon Sundback on April 24, 2012, and the London Underground last month.

Non-human views and DOS attacks: Page access data cannot distinguish human viewers from automated traffic. The most dramatic example occurred on March 9, 2010, when the Jyllands-Posten Muhammad cartoons controversy article saw 5.3 million views in a single hour (likely the densest view-hour at any point in Wikipedia's history). Due to the religious controversy/sensitivity surrounding the topic, this is believed to be an attack designed to prevent others from viewing the page and its associated imagery. Ironically, the Denial of Service article also appears to be a frequent target. Often, it can be hard to distinguish between malicious attacks, accidental misconfiguration (e.g. bot testing), and undiscovered catalysts of human traffic. In compiling the WP:5000/Top25Report, some discretion is applied to attempt to remove odd anomalies. For example, Cat anatomy has been a popular article in raw page views for a few months (and not only on Caturdays), after previously being much less popular.

Second screen effect: Though not nearly on the scale of the above spikes, we find that television programs and their content are reflected in page view data. This can be as broad as spikes on the Big Bang Theory article when the program airs on popular networks, but is even seen in small traffic bumps when a quiz show like Jeopardy! or Who Wants to be a Millionaire? asks about a particular topic. This phenomenon has recently been more thoroughly investigated on the German Wikipedia.[1]

Slashdot effect: When extremely popular aggregation sites like Slashdot or Reddit prominently link to Wikipedia, traffic follows. Internally, Wikipedia's Main page can have much the same effect.

Temporal patterns: The Christmas article is popular in December, Easter peaks around that holiday, and Christianity-related articles tend to see unusual amounts of Sunday traffic. This is just the start of patterns which are reflected diurnally, annually, and at other pre-determined intervals.

by Andrew West and Milowent, Wikipedia |  Read more:

[ed. Umm, because it's actually working again...? Thanks, Max -- my internet tech wizard!]

Sunday, February 3, 2013

Drone Home

A few months ago I borrowed a drone from a company called Parrot. Officially the drone is called an AR.Drone 2.0, but for simplicity's sake, we're just going to call it the Parrot. The Parrot went on sale last May and retails for about $300.

It's a quadcopter, meaning it's a miniature helicopter with four rotors; basically it looks like a giant four-leaf clover designed by Darth Vader. It's noisy and a bit fussy: it spits error messages at you from a comprehensive menu of them, and it recovers from catastrophes slowly and sulkily. (Pro tip: quadcopters mix poorly with greenery.) But when it's on its best behavior, the Parrot is a little marvel. You control it with an app on your smart phone, to which it feeds real-time video in return. Mashing the Take Off button causes it to leap up to waist height and hover there, stock still, in the manner of Harry Potter's broomstick. It's so firmly autostabilized that on a hot day small children will gather under it to get the cool downwash from its rotors.

It's a toy, the robotic equivalent of a house pet. But just as cats and dogs are related to tigers and wolves, the Parrot is recognizably genetically related to some very efficient killers.

Flying a drone, even just a Parrot, makes you realize what a radically new and deeply strange technology drones are. A drone isn't just a tool; when you use it you see and act through it — you inhabit it. It expands the reach of your body and senses in much the same way that the Internet expands your mind. The Net extends our virtual presence; drones extend our physical presence. They are, along with smart phones and 3-D printing, one of a handful of genuinely transformative technologies to emerge in the past 10 years.

They've certainly transformed the U.S. military: of late the American government has gotten very good at extending its physical presence for the purpose of killing people. Ten years ago the Pentagon had about 50 drones in its fleet; currently it has some 7,500. More than a third of the aircraft in the Air Force's fleet are now unmanned. The U.S. military reported carrying out 447 drone attacks in Afghanistan in the first 11 months of 2012, up from 294 in all of 2011. Since President Obama took office, the U.S. has executed more than 300 covert drone attacks in Pakistan, a country with which we're not at war. Already this year there are credible reports of five covert attacks in Pakistan and as many as eight in Yemen, including one on Jan. 21, the day of Obama's second Inauguration. The Pentagon is planning to establish a drone base in northwestern Africa.

The military logic couldn't be clearer. Unlike, say, cruise missiles, which have to be laboriously targeted and prepped and launched over a period of hours, drones are a persistent presence over the battlefield, gathering their own intelligence and then providing an instantaneous response. They represent a revolution in the idea of what combat is: with drones the U.S. can exert force not only instantly but undeterred by the risk of incurring American casualties or massive logistical bills, and without the terrestrial baggage of geography; the only relevant geography is that of the global communications grid. In the words of Peter Singer, a senior fellow at the Brookings Institution and the author of Wired for War: The Robotics Revolution and Conflict in the 21st Century, drones change "everything from tactics to doctrine to overall strategy to how leaders, the media and the public all conceptualize and decide upon this thing we call war."

Having transformed war, drones are getting ready to transform peace. A year ago Obama ordered the Federal Aviation Administration (FAA) to expedite the process of integrating "unmanned aerial vehicles," as drones are primly referred to within the trade, into civilian airspace. Police departments will use them to study crime scenes. Farmers will use them to watch their fields. Builders will use them to survey construction sites. Hollywood will use them to make movies. Hobbyists will use them just because they feel like it. Drones are an enormously powerful, disruptive technology that rewrites rules wherever it goes. Now the drones are coming home to roost.

by Lev Grossman, Time |  Read more:
Photo: Gregg Segal

Graciela Iturbide
via:

Saturday, February 2, 2013


John Brunsdon R.E., Etching, Pass Near Coniston
via:

Why Did I Bring a Teenager to Venice?


The guidebook. That’s what I think of when I think of Venice. Sure, there were canals, palazzos, and pigeons in the Piazza San Marco. There were Titians and Tintorettos looming out of niches in cool, dark churches. There was even an extravagant trip on a gondola. But what I’ll never forget is the guidebook.

It was 1991, and I was 13. My family lived in a large, misshapen cottage in the English county of Hertfordshire. My grandma, who lived with us, was dying, and my parents were tending her through her difficult last months. Offering what help she could, my mum’s best friend, Annie, volunteered to take my sister, Katie, and me abroad for a week, a little respite for us all.

A single woman in her 30s of meticulous taste, Annie had (and still has) a particular love of Italy, an irresistible, almost religious feeling for the place, akin to Michelangelo’s passion for marble, or Garfield’s for lasagna. And so we were dispatched to Venice—along with Kate, another family friend of Annie’s who was my sister’s age—as the charges of an untested parent.

Thanks to her sophistication and style, Annie’s idea of a holiday was as close to ours as Camembert is to string cheese. She revered the Renaissance, basked in the baroque. We liked to eat ice cream. On the first day, she produced J.G. Links’s 1973 Venice for Pleasure and began to read aloud. The history of the doges, the origins of the Carnevale—the words of the guidebook became our soundtrack as we roved through churches and climbed campaniles. At mealtimes, Annie quizzed us to discover whether we had absorbed the knowledge so generously bestowed upon us. Via these impromptu exams, the guidebook became the dispenser and withholder of pleasures—a scoop of ice cream, instead of fruit; french fries with dinner, instead of spinach. Could we describe the ceremony of La Sensa, in which the Venetians rowed out into the Adriatic in all their pomp and threw a ring into the waves, to honor their “marriage” to the sea? I can, to this day. My unlucky sister, however, ate a lot of spinach.

I don’t want to give the wrong impression. For all of us, this was one of the most memorable trips of our lives, a heady cultural hit laced with an intoxicating freedom from normal parental controls, aided by some of the most eccentric chaperoning the city had seen. A twist of luck landed us in a 17th-century palazzo in the heart of Venice. The furniture, all antique, was defended against the arrival of a 13-year-old and two 11-year-olds with not-to-be-removed plastic sheeting, and the three of us slept together in an enormous four-poster bed. One night, as we slipped under the duvet, we heard a singing gondolier. With one mind, we leaped from the bed, threw on our shoes, and, led all the way by Annie, chased the sound down the alleyways of the San Marco district. Rushing onto a bridge, we watched the operatic operator glide beneath us. All four in our pajamas.

Twenty years later, and approaching my mid-30s, I have had even less exposure to children than Annie had when she gamely took us on. This isn’t from lack of opportunity, mind you, but by choice. My friends have patiently accepted that I grow bored easily around their offspring, and that I have the maternal instinct of a mollusk. The only regular kid contact I’ve had—and by regular, I mean a couple of encounters a year—is with Annie’s own daughter, Niambh (an old Irish spelling; you pronounce it “Neev”). Niambh was born while I was in college, and last year she turned 13. Ready or not—and I really wasn’t—I sensed that a debt must be paid.

And so, one morning in May, I stand at London Gatwick, accepting a minor into my care and checking in for a flight to Venice’s Marco Polo Airport. It is 6:30 a.m., and Niambh is surprisingly chipper for a teenager forced so early from her bed. Annie, at an even greater pitch of excitement, has brought along that guidebook by J.G. Links as well as a large sketchbook. All we need is a wooden tennis racket, and we’ll be characters in A Room with a View. “You simply must make her speak Italian,” Annie trills as we join the line at Departures. “We told the school it was an educational trip.”

Niambh huffs and makes a face behind her mother’s back. “You really don’t need to do that,” she says as soon as we’ve bid Annie good-bye. “My Italian teacher will never know the difference. I’ll just say everything with more of an accent when I get back.” She seems far more assured than I did at 13. I am intimidated by her already.

by Emma John, Afar |  Read more:
Photo: by Peter Dench

Jacob Schere Branching Out in Winter
via: