Sunday, August 18, 2013

The Wind Rises


20/30 Vision

Early one morning this summer, a gaggle of fresh-faced twentysomethings streams into an elevator at the Puck Building on Houston Street. “I like your shoes,” one girl tells another, smiling with a first-day-of-school shyness. They could be mistaken for NYU students—the university has space on the building’s second and third floors—but they all exit on floor five, at the bright white offices of Warby Parker, where two of the company’s founders, Neil Blumenthal and Dave Gilboa, are preparing to kick off their weekly companywide meeting. Dressed near-identically in the uniform of the New York start-up entrepreneur—tailored button-downs, dark jeans—the pair look as streamlined as a couple of smartphones and operate just as efficiently, waiting for fashionable backpacks to be dropped into chairs and Fage yogurts to be procured until precisely 8:30 a.m., when Blumenthal, tall and dark-haired with a voice that recalls Ira Glass, calls the meeting to order.

First on the agenda is the online retailer’s new stores in New York, Boston, and Los Angeles, the opening of which came as a surprise to the industry, since Warby Parker has fashioned itself as a pioneer in the new wave of e-commerce. The founders take a kind of fatherly pride in maintaining transparency with their employees, and so Blumenthal walks them through the reasoning. “One of the things we’ve learned is that if you really want to be a dominant player, you need to have a presence in both online and brick-and-mortar,” Blumenthal tells the group. “Especially in categories like fashion. Other categories, like toilet paper or diapers or paper towels, those are going to shift online more dramatically.” He goes on to cite an example: “My wife and I have actually never bought diapers in a store, which is kind of amazing,” he says. “There are probably a few other people here that have also never done that.”

There’s a pause, then a wave of laughter as Blumenthal realizes his mistake: No one in this audience is buying diapers. Blumenthal, 33, and Gilboa, 32, are pretty much the oldest people at the company.

After that, various departments offer presentations to the class: The social-media coordinator introduces a new strategy; representatives from marketing show off the company’s recent video collaboration with the designer Chrissie Miller, a soft-focus seventies-style short of a Coney Island dance-off inspired by The Warriors and West Side Story. There is a lot of up-talking? That manner of speech in which you phrase a fact as a question? It seems contagious? Especially among the newer employees? Many of whom still have, stuck to their chairs, balloons with an image of a cow saying “nice to meat you.” “This is actually our biggest number of hires in one week,” Blumenthal tells his employees, whose numbers have swelled to 250. “Come on up,” he says, Ira Glass morphing into Bob Barker, “and give us your fun fact!”

Fun facts are a Warby Parker tradition, a getting-to-know-you exercise that upholds one of Warby Parker’s eight core values, written on the wall of the kitchen: “Inject fun and quirkiness into everything we do.” While no one has managed to top one early hire’s mind-blowing revelation that she once held Michael Jackson’s infant son Blanket, the newest additions to the team are unlikely to disappoint—the company employs a “cultural swat team” that weeds out dullards in interviews with questions like “When was the last time you wore a costume?”

First up is Kate, from product strategy, who describes herself as a rodeo enthusiast. “Actually, a champion barrel racer?” Next is Priyata, who recently returned from a trip to war-torn Syria, “where we heard missiles but survived?” Natalie, from customer experience, was a “fan dancer” for BeyoncĂ© in the Super Bowl halftime show? Julie lost her sense of smell crowd-surfing at 16? Ryan was the vocalist in a metal band? Emily recently rode an elephant named Pancake?

Blumenthal closes the meeting by talking about upcoming opportunities to volunteer with nonprofits like Venture for America and the Network for Teaching Entrepreneurship. (“Do good” being another one of Warby Parker’s core values, not to be confused, Blumenthal will tell you, with “Don’t be evil”: “Doing good is proactive—‘How can we make this world a better place?,’ not ‘How can we prevent doing something bad?’ ”) He announces a new employee happy hour, “for those of you that can drink.” Then Gilboa proffers one final thought. “We’re very happy to announce that this is the first time a sun SKU—the Downing in walnut tortoise—has made it into the top five best-selling glasses for the month,” he says.

Oh, right! The glasses.

by Jessica Pressler, NY Magazine |  Read more:
Image: Daniel Shea/New York Magazine

Why Can't My Computer Understand Me?

Hector Levesque thinks his computer is stupid—and that yours is, too. Siri and Google’s voice searches may be able to understand canned sentences like “What movies are showing near me at seven o’clock?,” but what about questions—“Can an alligator run the hundred-metre hurdles?”—that nobody has heard before? Any ordinary adult can figure that one out. (No. Alligators can’t hurdle.) But if you type the question into Google, you get information about Florida Gators track and field. Other search engines, like Wolfram Alpha, can’t answer the question, either. Watson, the computer system that won “Jeopardy!,” likely wouldn’t do much better.

In a terrific paper just presented at the premier international conference on artificial intelligence, Levesque, a University of Toronto computer scientist who studies these questions, has taken just about everyone in the field of A.I. to task. He argues that his colleagues have forgotten about the “intelligence” part of artificial intelligence.

Levesque starts with a critique of Alan Turing’s famous “Turing test,” in which a human, through a question-and-answer session, tries to distinguish machines from people. You’d think that if a machine could pass the test, we could safely conclude that the machine was intelligent. But Levesque argues that the Turing test is almost meaningless, because it is far too easy to game. Every year, a number of machines compete in the challenge for real, seeking something called the Loebner Prize. But the winners aren’t genuinely intelligent; instead, they tend to be more like parlor tricks, and they’re almost inherently deceitful. If a person asks a machine “How tall are you?” and the machine wants to win the Turing test, it has no choice but to confabulate. It has turned out, in fact, that the winners tend to use bluster and misdirection far more than anything approximating true intelligence. One program worked by pretending to be paranoid; others have done well by tossing off one-liners that distract interlocutors. The fakery involved in most efforts at beating the Turing test is emblematic: the real mission of A.I. ought to be building intelligence, not building software that is specifically tuned toward fixing some sort of arbitrary test.

To try and get the field back on track, Levesque is encouraging artificial-intelligence researchers to consider a different test that is much harder to game, building on work he did with Leora Morgenstern and Ernest Davis (a collaborator of mine). Together, they have created a set of challenges called the Winograd Schemas, named for Terry Winograd, a pioneering artificial-intelligence researcher at Stanford.

by Gary Marcus, New Yorker |  Read more:
Image: Arnold Roth

Saturday, August 17, 2013

Hempfest 2013


[ed. Ahh, the sweet smell of dope in the morning. Smells like...Victory.]

Hempfest 2013 certainly had a reason to celebrate this year with the passage of Initiative 502, legalizing the distribution and personal use of marijuana throughout the State of Washington. And celebrate they did, although I've never seen a more laid-back, courteous, and cheerful victory celebration than this one. The pungent air wafting along the coast and across the beaches carried an easygoing vibe that made everyone seem like a long-lost friend. I also heard snatches of conversation that were truly borderline zen in their cosmic weirdness. Anyway, without further commentary... enjoy the pictures.

by markk

The Magic Bus of Merry Pranksters fame.

The now-infamous Doritos that were handed out by cops to needy fairgoers.

Collector's item

Having a blast

Friday, August 16, 2013


Fuyo [1864-1936] Snowladen Reeds and Kingfisher 1923
via:

[ed. Haven't got to this yet.]
via:

Winslow Homer - The Fog Warning, 1885
via:

The Notorious MSG’s Unlikely Formula For Success

In the last three years, perhaps the boldest thing Chef David Chang has done with food is let it rot. In his tiny Momofuku research and development lab in New York’s East Village, Chang and his head of R&D Dan Felder have obsessed over the many delicious things that happen when molds and fungi are treated like gourmet ingredients rather than evidence that you need to clean out your fridge.

Without fermentation, we would live in a sad world without beer, cheese, miso, kimchi, and hundreds of other delicious things humans have enjoyed for centuries. But in the carefully labeled containers stacked around the cramped confines of their lab, Chang and Felder have been fermenting new things. They’ve turned mashed pistachios, lentils, chickpeas, and other legumes into miso-like pastes Chang calls “hozon” (Japanese for “preserved”). They’ve created variations on Japanese tamari — a by-product of miso production that’s similar to soy sauce — with fermented spelt and rye they call “bonji” (“essence”). They’ve even replicated the Japanese staple katsuobushi (a log of dried, smoked, and fermented bonito that’s shaved into bonito flakes) using fermented pork tenderloin instead of fish.

The flavor Chang and Felder are chasing in creating these new fermented products is umami — the savory “fifth taste” detectable by the human tongue along with salty, sweet, sour, and bitter. When bacteria and fungi break down the glucose in foods that are fermenting, they release waste products. And the waste valued in Momofuku’s lab above all others is glutamic acid, the amino acid that creates the taste of umami on our tongues.

Also on the shelf in Chang’s lab, underneath the jars containing foods in various states of controlled spoilage, is a giant tin of monosodium glutamate, more commonly known as MSG — perhaps the most infamously misunderstood and maligned three letters in the history of food. It just so happens that inside that tin of MSG is the exact molecule Chang and his chefs have worked so hard for the last three years to tease out of pots of fermenting beans and nuts. It’s pure glutamic acid, crystallized with a single sodium ion to stabilize it; five pounds of uncut, un-stepped-on umami, made from fermented corn in a factory in Iowa.

We’ve only known for sure that our tongue has specific taste buds for glutamic acid for 13 years. So for chefs like Chang, the Fat Duck’s Heston Blumenthal, Umami Burger’s Adam Fleischman, and many others around the world, umami’s flavors have become one of cooking’s most exciting new frontiers. The new flavors they’re creating use advanced methods to expand on what millions around the world (but especially in East Asia) have known for centuries — that foods rich in glutamic acid are delicious, and we want to eat them.

For these chefs, the path to understanding umami inevitably leads them to MSG, which is chemically identical to the glutamic acid they’re creating from scratch. And yet Chang wouldn’t think of using MSG in his restaurants today. He told me he doesn’t even use it at home, despite being a professed lover of MSG-laced Japanese Kewpie mayo. After decades of research debunking its reputation as a health hazard, and uninterrupted FDA approval since 1959, MSG remains a food pariah — part of a story that spans a century of history, race, culture, and science and says more about how we eat today than any other.

by John Mahoney, BuzzFeed |  Read more:
Image: uncredited

Ripping Off Young America: The College-Loan Scandal

In the early 2000s, a thirtysomething scientist named Alan Collinge seemed to be going places. He had graduated from USC in 1999 with a degree in aerospace engineering and landed a research job at Caltech. Then he made a mistake: He asked for a raise, didn't get it, lost his job and soon found himself underemployed and with no way to repay the roughly $38,000 in loans he'd taken out to get his degree.

Collinge's creditor, Sallie Mae, which originally had been a quasi-public institution but, in the late Nineties, had begun transforming into a wholly private lender, didn't answer his requests for a forbearance or a restructuring. So in 2001, he went into default. Soon enough, his original $38,000 loan had ballooned to more than $100,000 in debt, thanks to fees, penalties and accrued interest. He had a job as a military contractor, but he lost it when his employer ran a credit check on him. His whole life was now about his student debt.

Collinge became so upset that, while sitting on a buddy's couch in Tacoma, Washington, one night in 2005 and nursing a bottle of Jack Daniel's, he swore that he'd see Sallie Mae on 60 Minutes if it was the last thing he did. In what has to be a first in the history of drunken bullshitting, it actually happened. "Lo and behold, I ended up being featured on 60 Minutes within about a year," he says. In 2006, he got to tell his debt story to Lesley Stahl for a piece on Sallie Mae's draconian lending tactics that, curiously enough, Sallie Mae itself refused to be interviewed for.

From that point forward, Collinge – who founded the website StudentLoanJustice.org – became what he calls "a complaint box for the industry." He heard thousands of horror stories from people like himself, and over the course of many years began to wonder more and more about one particular recurring theme, what he calls "the really significant thing – the sticker price." Why was college so expensive? (...)

But the main question is, how is the idea that the government might make profits on defaulted loans even up for debate? The answer lies in the uniquely blood-draining legal framework in which federal student loans are issued. First of all, a high percentage of student borrowers enter into their loans having no idea that they're signing up for a relationship as unbreakable as herpes. Not only has Congress almost completely stripped students of their right to discharge their debts through bankruptcy (amazing, when one considers that even gamblers can declare bankruptcy!), it has also restricted the students' ability to refinance loans. Even Truth in Lending Act requirements – which normally require lenders to fully disclose future costs to would-be customers – don't cover certain student loans. That student lenders can escape from such requirements is especially pernicious, given that their pool of borrowers is typically one step removed from being children, but the law goes further than that and tacitly permits lenders to deceive their teenage clients.

Not all student borrowers have access to the same information. A 2008 federal education law forced private lenders to disclose the Annual Percentage Rate (APR) to prospective borrowers; APR is a more complex number that often includes fees and other charges. But lenders of federally backed student loans do not have to make the same disclosures.

"Only a small minority of those who've been to college have been told very simple things, like what their interest rate was," says Collinge. "A lot of straight-up lies have been foisted on students."

by Matt Taibbi, Rolling Stone |  Read more:
Image: Victor Juhasz

RECD: The U.S. Behaves Nothing Like a Democracy

According to received doctrine, we live in capitalist democracies, which are the best possible system, despite some flaws. There's been an interesting debate over the years about the relation between capitalism and democracy, for example, are they even compatible? I won't be pursuing this because I'd like to discuss a different system - what we could call the "really existing capitalist democracy", RECD for short, pronounced "wrecked" by accident. To begin with, how does RECD compare with democracy? Well that depends on what we mean by "democracy". (...)

In the past, the United States has sometimes, kind of sardonically, been described as a one-party state: the business party with two factions called Democrats and Republicans. That's no longer true. It's still a one-party state, the business party. But it only has one faction. The faction is moderate Republicans, who are now called Democrats. There are virtually no moderate Republicans in what's called the Republican Party and virtually no liberal Democrats in what's called the Democratic [sic] Party. It's basically a party of what would be moderate Republicans and similarly, Richard Nixon would be way at the left of the political spectrum today. Eisenhower would be in outer space.

There is still something called the Republican Party, but it long ago abandoned any pretence of being a normal parliamentary party. It's in lock-step service to the very rich and the corporate sector and has a catechism that everyone has to chant in unison, kind of like the old Communist Party. The distinguished conservative commentator, one of the most respected - Norman Ornstein - describes today's Republican Party as, in his words, "a radical insurgency - ideologically extreme, scornful of facts and compromise, dismissive of its political opposition" - a serious danger to the society, as he points out.

In short, Really Existing Capitalist Democracy is very remote from the soaring rhetoric about democracy. But there is another version of democracy. Actually it's the standard doctrine of progressive, contemporary democratic theory. So I'll give some illustrative quotes from leading figures - incidentally not figures on the right. These are all good Woodrow Wilson-FDR-Kennedy liberals, mainstream ones in fact. So according to this version of democracy, "the public are ignorant and meddlesome outsiders. They have to be put in their place. Decisions must be in the hands of an intelligent minority of responsible men, who have to be protected from the trampling and roar of the bewildered herd". The herd has a function, as it's called. They're supposed to lend their weight every few years, to a choice among the responsible men. But apart from that, their function is to be "spectators, not participants in action" - and it's for their own good. Because as the founder of liberal political science pointed out, we should not succumb to “democratic dogmatisms about people being the best judges of their own interest". They're not. We're the best judges, so it would be irresponsible to let them make choices just as it would be irresponsible to let a three-year-old run into the street. Attitudes and opinions therefore have to be controlled for the benefit of those you are controlling. It's necessary to "regiment their minds". It's necessary also to discipline the institutions responsible for the "indoctrination of the young." All quotes, incidentally. And if we can do this, we might be able to get back to the good old days when “Truman had been able to govern the country with the cooperation of a relatively small number of Wall Street lawyers and bankers." This is all from icons of the liberal establishment, the leading progressive democratic theorists. Some of you may recognize some of the quotes.

The roots of these attitudes go back quite far. They go back to the first stirrings of modern democracy. The first were in England in the 17th Century. As you know, later in the United States. And they persist in fundamental ways. The first democratic revolution was England in the 1640s. There was a civil war between king and parliament. But the gentry, the people who called themselves "the men of best quality", were appalled by the rising popular forces that were beginning to appear on the public arena. They didn't want to support either king or parliament. To quote their pamphlets, they didn't want to be ruled by "knights and gentlemen, who do but oppress us, but we want to be governed by countrymen like ourselves, who know the people's sores". That's a pretty terrifying sight. The rabble has been a pretty terrifying sight ever since. Actually, it was long before. It remained so a century after the British democratic revolution. The founders of the American republic had pretty much the same view about the rabble. So they determined that "power must be in the hands of the wealth of the nation, the more responsible set of men. Those who have sympathy for property owners and their rights", and of course for slave owners at the time. In general, men who understand that a fundamental task of government is "to protect the minority of the opulent from the majority". Those are quotes from James Madison, the main framer - this was in the Constitutional Convention, which is much more revealing than the Federalist Papers which people read. The Federalist Papers were basically a propaganda effort to try to get the public to go along with the system. But the debates in the Constitutional Convention are much more revealing. And in fact the constitutional system was created on that basis. I don't have time to go through it, but it basically adhered to the principle which was enunciated simply by John Jay, the president of the Continental Congress, then first Chief Justice of the Supreme Court, and as he put it, "those who own the country ought to govern it". That's the primary doctrine of RECD to the present.

There've been many popular struggles since - and they've won many victories. The masters, however, do not relent. The more freedom is won, the more intense are the efforts to redirect the society to a proper course. And the 20th Century progressive democratic theory that I've just sampled is not very different from the RECD that has been achieved, apart from the question of: Which responsible men should rule? Should it be bankers or intellectual elites? Or for that matter should it be the Central Committee in a different version of similar doctrines?

Well, another important feature of RECD is that the public must be kept in the dark about what is happening to them. The "herd" must remain "bewildered". The reasons were explained lucidly by the professor of the science of government at Harvard - that's the official name - another respected liberal figure, Samuel Huntington. As he pointed out, "power remains strong when it remains in the dark. Exposed to sunlight, it begins to evaporate". Bradley Manning is facing a life in prison for failure to comprehend this scientific principle. Now Edward Snowden as well. And it works pretty well. If you take a look at polls, it reveals how well it works. So for example, recent polls pretty consistently reveal that Republicans are preferred to Democrats on most issues and crucially on the issues in which the public opposes the policies of the Republicans and favors the policies of the Democrats. One striking example of this is that majorities say that they favor the Republicans on tax policy, while the same majorities oppose those policies. This runs across the board. This is even true of the far right, the Tea Party types. This goes along with an astonishing level of contempt for government. Favorable opinions about Congress are literally in the single digits. The rest of the government as well. It's all declining sharply.

Results such as these, which are pretty consistent, illustrate demoralization of the public of a kind that's unusual, although there are examples - the late Weimar Republic comes to mind. The task of ensuring that the rabble keep to their function as bewildered spectators takes many forms. The simplest form is simply to restrict entry into the political system. Iran just had an election, as you know. And it was rightly criticized on the grounds that even to participate, you had to be vetted by the guardian council of clerics. In the United States, you don't have to be vetted by clerics, but rather you have to be vetted by concentrations of private capital. Unless you pass their filter, you don't enter the political system - with very rare exceptions.

by Noam Chomsky, AlterNet |  Read more:
Image via:

August 16 (a Message on Love)


The calendar entry for Tuesday, Aug. 16, 2011, reads simply, “Love One Another.” My wife, Terry, handed me the entry, a leaf torn from a pad, that morning. A drawing beneath the caption depicts a late middle-aged couple embracing as they walk down the beach, eyes sparkling, mouths agape, sharing a hearty laugh. The sun is setting behind them, throwing glitter across the water. Since Terry gave it to me nearly two years ago, that calendar page has remained on my desk in our bedroom, placed so that I see it every time I pass by. I’m not exactly sure why I saved that particular entry, of the many given to me over the years by Terry, a lover of pithy sayings. Perhaps it was the powerful simplicity of the message. Or perhaps it was the promise it represented: golden years shared with a loving companion. This idea was becoming more poignant as middle age set in and Conrad, our only child, entered high school. Seeing the drawing sometimes made the bittersweet foretaste of empty-nesthood more palatable.

Or perhaps I saved it because coincidentally, on that particular day, we were headed to Hawaii, Terry’s home state, still filled with family and childhood friends. Walking along the beach was one of our favorite things to do, both as a couple and as a family. Waking early in the fog of jet lag, Terry, Conrad and I would buy takeout breakfast at Zippy’s and crouch at the water’s edge to eat as the sun rose over Kailua Bay near Terry’s childhood home. Walking along the beach, feeling the cool, wet sand under our feet as the sun warmed our faces, we were happy and grateful and content.

These days, a small, silver religious medal lies atop that calendar entry on my desk. Terry was wearing it last fall on the day she took her own life, the victim of a devastating depression that gripped her out of nowhere and pulled her into a darkness from which she felt there was no escape.

Her illness was a menopausal version of a terrifying episode of postpartum depression she suffered after Conrad was born. Terry once said the only thing that saved her during that first episode was the maternal instinct — knowing that her baby needed her in order to survive. In a sense, Conrad’s infant vulnerability kept his mother alive through that ordeal. This time, Terry became submerged in a deep melancholia that doctors later said may have been brought on or aggravated by the hormonal changes of menopause. Although she was receiving treatment and was about to see a specialist in women’s mental health issues, Terry became convinced that Conrad and I would be better off without her — without a mother and wife stricken by an unbearable, invisible weight pressing down on her heart. This state of mind, unfathomable to healthy people, is a common symptom of major depression, as William Styron’s wrenching first-person account, “Darkness Visible,” makes clear. Cancer of the spirit, as insidious as any of the varieties that attack the flesh, stole a woman who embodied the radiance and beauty of her island home.

by Curtis J. Milhaupt, NY Times |  Read more:
Image: Katherine Streeter

Thursday, August 15, 2013


Zhou Yansheng
via:

Same Old Chip

Before the second play of his first NFL game, Philadelphia's new head coach, Chip Kelly, a man who made his reputation as the architect of college football's most prolific offense — the Oregon Ducks' fast-break, spread-it-out attack — did the unthinkable: He had his team huddle. He followed this with another knee-weakening moment: His quarterback, Michael Vick, lined up under center, an alignment from which the Eagles ran a basic run to the left. For 31 other NFL teams, this would be as ho-hum as it gets. But this is Chip Kelly, he of the fast practices, fast plays, and fast talking. By starting out this way, Kelly, who repeatedly has said he doesn't do anything without a sound reason behind it, was no doubt sending some kind of message to fans, pundits, and opposing coaches waiting anxiously to see what a Chip Kelly offense would look like at the professional level. It was a message that was unmistakable: See, I can adapt to the NFL.

At least that’s what I thought at first. But after studying Philadelphia's game against New England, I came away with almost the exact opposite conclusion: While there were clear differences from what Kelly’s system looked like at Oregon, his Eagles offense looked a lot more like the Ducks offense than I ever anticipated.

Preseason game plans are often described as being "vanilla," and rightly so. The ostensible purpose of the preseason — other than as an opportunity to put more football on TV, which I'm not complaining about — is to evaluate talent as rosters get cut to 53 and players compete for starting spots. Given that preseason football is essentially practice with game uniforms, there is no incentive for a team to reveal its intentions for when the real games begin.

Yet in these vanilla preseason plans often lies some basic truth about a team's identity. In the regular season, the plan will be carefully tailored to the specific strengths, weaknesses, and tendencies of that week's opponent; in the preseason, the plan is stripped to focus only on the most basic, foundational concepts installed in the first few days of camp — the elements the coaches have decided are so essential that a player who cannot master them cannot be on the team.

And so, at the risk of sounding like Vincenzo Coccotti, Michael Vick may have said the Eagles only used about "a third" of their total scheme, but what Philadelphia did was show a lot about how Kelly and his staff will approach bringing his offense to the NFL. More than anything else, Kelly showed that he's not leaving behind what worked for him at Oregon.

Tempo

For years, Kelly's teams have been synonymous with one word: speed. Speed on the field, in practice, and, most famously, in how often they run plays. Known for a blistering, unrelenting pace, Kelly's teams at Oregon were perceived to have simply exhausted opponents, causing missed tackles and blown assignments as defenses cried out for help. This isn’t devoid of truth, but the idea that Kelly's teams always went at warp speed has been overblown, and since he became head coach of the Eagles, the myth that Kelly would go into every game trying to run as many plays as possible has been treated as established fact.

"It will be a weapon for us and a tool in our toolbox," Kelly said of the fast-paced no-huddle after the game, according to the team’s website. Even at Oregon, Kelly's offense had not one but three speeds: red light (slow), yellow light (medium: team gets to the line but quarterback can slow it down and change plays), and green light (superfast: get to the line and run the play). Good defenses will adapt to any pace, but a good no-huddle — whether it’s Kelly's Eagles, Peyton Manning's Broncos, or Tom Brady's Patriots — will vary the speed, using it strategically, waiting to put its foot on the gas pedal precisely when the defense has the wrong personnel stuck on the field.

by Chris Brown, Grantland |  Read more:
Image: Drew Hallowell/Philadelphia Eagles/Getty Images

Tracy Wall
via:

My Smartphone is Smarter Than Me

I just recently bought my first smartphone. I know, I know... welcome to the 20th century... wait...uh, 21st century (man, where do those centuries go?). Anyway, as expected, my smartphone is smarter than me. At least it has more functions than I'll ever use, many of which are truly awesome, probably. The thing about it, though, is this: I don't have enough patience to learn them. Yes, there's a nice 27-page manual (pdf) I could print off that would tell me everything I need to know about how to make my smartphone do all kinds of smart, cool tricks. But do I need all those things? Not much. I can't even imagine situations where they might come in handy, let alone remember how they might be used. Then there are the bazillions of apps (produced by the bazillions of tech startups that everyone seems to work for these days), each with its own little universe of quirkiness. Like the camera app, one of the most basic and functional things you can do with a smartphone, right? It has a hundred different features (probably) that could make me the envy of National Geographic. Will I ever use them all? No. Will I use maybe even most of them? No. Will I at least be able to see them when I'm standing outside in broad daylight? No. Believe me, I've tried. The other day I got home and had a twelve-minute video of the inside of my pocket. I thought I had taken an earlier snapshot but had apparently touched the wrong button (or slid it the wrong way, or something). I can't even tell when an app is still running, or not.

Which brings me to the main problem I'm having in trying to be smarter than my smartphone: I have to operate on its turf. Somehow the glass interface with all its tapping, turning, sliding, pinching, circling, shaking, etc. etc. just does not seem very intuitive to me. Plus, I must have really fat fingers or something. I have this dissonant memory of trying to teach my mom about computers years ago. She couldn't grasp the analogy of files and folders, documents and desktops, let alone cut and paste, right click/left click and so on, so she'd write short instructions to herself on little post-it notes which were pasted all over her desk. She wondered why they couldn't just make computers more like cable tv. I'm more sympathetic now.

I read an article today about experience design with a quote that seems particularly apt to this rant/discussion:
Today, Buxton, who is principal researcher at Microsoft Research, says that the next challenge for experience design is to create a constellation of devices, including wearable gadgets, tablets, phones, and smart appliances, that can coordinate with one another and adapt to users’ changing needs. This focus on the totality of our devices stands in contrast to where we find ourselves today: constantly adding new gadgets and functions without much thought as to how they fit together. (For instance, anyone lugging around a laptop, iPad, and iPhone is also carrying the equivalent of three video cameras, three email devices, three media players, and probably three different photo albums.) Even as our devices have individually gotten simpler, the cumulative complexity of all of them is increasing. Buxton has said that the solution is to “stop focusing on the individual objects as islands.” He has come up with a simple standard for whether a gadget should even exist: Each new device should reduce the complexity of the system and increase the value of everything else in the ecosystem. (...)

In the wrong hands, this is a dystopian prospect—technology’s unwanted intrusion into our every waking moment. But without the proper design, without considering how new products and services fit into people’s day-to-day lives, any new technology can be terrifying. That’s where the challenge comes in. The task of making this new world can’t be left up to engineers and technologists alone—otherwise we will find ourselves overrun with amazing capabilities that people refuse to take advantage of. Designers, who’ve always been adept at watching and responding to our needs, must bring to bear a better understanding of how people actually live. It’s up to them to make this new world feel like something we’ve always wanted and a natural extension of what we already have.
I guess the alternative would be more of the same (e.g. seven remotes scattered around the living room), or as my friend suggested when I told her about my phone issues: just go find a kid.

by markk
Image: markk (sans duck lips)

Dark Cloud

[ed. Forget that pesky old Fourth Amendment; it might be the bottom line that really counts.]

Whether it's tech companies' fault or not—it's hard to fight against secret court orders, although some folks are—the PRISM scandal and related surveillance programs have dissolved any trust consumers could have in the privacy of US-based servers. That lack of faith comes with a cost: According to a new report from the Information Technology and Innovation Foundation (ITIF), the US could end up losing out on tens of billions of dollars in the cloud-based computing space.

The cloud computing space has been growing steadily in past years, and is projected to boom even further in the next few, as shown by the ITIF graph at right. The United States, serviced by giants like Google and Amazon, has until now spent more money on cloud computing than the rest of the world combined, but that gap has closed considerably, with Western European markets expected to grow heavily in the next few years.

While Europe in particular has been open about trying to spur local cloud efforts, American firms still had a great opportunity to dive into a budding market. But with the US's great cloud computing secret now out in the open—American servers can be tapped whenever, in secret, with secret court orders—those firms are going to have a much more difficult time competing with upstarts like Iceland, where strict privacy laws have fostered growth in cloud computing and hosting services.

That the US's intrusions into data would have chilling effects on the data economy is no surprise. "It is often American providers that will miss out, because they are often the leaders in cloud services," Neelie Kroes, European commissioner for digital affairs, told the Guardian in July. "If European cloud customers cannot trust the United States government, then maybe they won't trust US cloud providers either. If I am right, there are multibillion-euro consequences for American companies. If I were an American cloud provider, I would be quite frustrated with my government right now." (...)

Based on its market analysis, the ITIF pegs the potential loss to US cloud companies over the next three years at between $21.5 billion and $35 billion. (These are report estimates based on a projected market share loss that's magnified as the global market grows.) And beyond those years, US companies' lost market share will continue to be a disadvantage.

by Derek Mead, Motherboard |  Read more:
Image: NSA Security Operations Center, courtesy NSA

Dying of a Broken Heart

[ed. Wonderful serialized blog/book, with successive chapters posted on the right side of the page. This is Chapter One. Be sure to read the Prologue, too.]

It happened just over a week ago. I was lying on my med-bed on the third floor of the local Nashville nuthouse, waiting for the Ambien to amplify all of the other shit coursing across the blood-brain barrier: Zoloft…150 mg, a zonester if there ever was one…Ativan, a 10 mg mini-pill, clicking along the mellow mental interstate like an unloaded 18-wheeler dead-heading home…and my personal favorite, Risperdal, its 1 mg packing a punch like a lead-weighted glove aimed straight at the deepest wrinkles in the old medulla oblongata. Suddenly I saw them, in Technicolor on the insides of my eyelids, these words: I am dying of a broken heart.

Don’t get me wrong. It had nothing to do with the drugs, and certainly nothing to do with checking myself into the nut for a much-needed life-recalculation and some chemical cell-tuning. No, it had been coming for a long, long time, and the only thing that had saved me until then was that tried and true foxhole fixation, Denial. I’ve been real, real good at it, Denial. After all, I’m a Truscott.

My grandfather, General Lucian King Truscott Jr., died of a broken heart. So did my father, Colonel Lucian King Truscott III. And so did my brother Francis Meriwether Truscott, who took his life with a Tokarev pistol taken from an NVA Major in Vietnam. The war finally got him, his wife Debbie told me on the phone the morning his body was found on the back steps of his local funeral home. Broken hearts in the Truscott family follow broken bodies and lost lives right to the bloody fucking end.

I’ve had friends, too, who died of broken hearts. Hunter Thompson, who shot himself in his kitchen when he finally realized that what he loved as much as life itself — the fun — was over. Gore Vidal, who died in his bed never having been able to bring himself out of the Final Closet: he was for the entirety of his life a terminal romantic who lost his first love and could never allow himself to love again. And Bill Cardoso, who died with a tall Dewars and soda in hand and his loving companion Mary Miles Ryan at his side, still raging against a world in which he had no place left to write, no place that would publish his marvelous wit which he wielded with a rapier, nearly intolerable ego.

The weird thing about lying in the dark full of drugs in a nuthouse realizing that you’re dying of a broken heart is how good it feels. It’s soft and psychically comfy to finally realize that where you are and what you are feeling has roots in family tradition, and looked at in that way, there’s really nothing wrong with you. You’re a Truscott. Of course your heart is broken. Of course you’re going to die. The two go together like gin and tonic. (...)

Life is full of love affairs and relationships and sadness and regrets and mistakes and joy and fights and reconciliations and you remember most of them fondly and without rancor, some painfully, and nearly all with love in your heart.

by Lucian K. Truscott IV | Read more:
Image: Truscott family photo