Friday, September 9, 2011

Personalizing Your Hotel Search

by Michelle Higgins

Searching for a hotel online has long been limited to plugging in your travel dates and destination and then sifting through star ratings and prices. But there are other factors involved. Is the hotel in a convenient location? Is it child friendly? Will the room have a view of a brick wall or the sea?

Now, a number of Web sites are attempting to answer these questions with tools including photo-based searches and maps that show where a town’s hot spots are.
 
Google.com/Hotelfinder

Google’s experimental hotel search site, which started in July, focuses on where to stay and finding a good deal. After entering your destination, dates and price range, HotelFinder delivers its top recommendations (for cities within the United States) in a list or on a Google Map. A blue perimeter delineates the area, with less-popular zones shadowed in gray.

In addition to the current price of a hotel, the site offers the hotel’s historical average so you can tell if you are getting a deal or not. For example, a $144 nightly rate for the Latham Hotel in Washington in early September was 11 percent less than usual.

Clicking on a hotel brings up a collage of images, reviews by Google users and basic hotel information so you do not have to leave the page to do more research. You can also create a list of hotels you would like to compare further.
 
Best feature: You can redraw the perimeters on the map to narrow your search. So if you want to look at hotels only in, say, the Georgetown neighborhood of Washington, you can manipulate the blue lines to home in on it.
 
Worst feature: The so-called “tourist spotlight” designed to shine a light on popular zones isn’t very enlightening. In a search for New York City hotels, for instance, practically all of Manhattan (with the exception of parts of Harlem and the Lower East Side, where few hotels are located) was highlighted.

Read more:

Thursday, September 8, 2011

Computer Games Explore Social Issues

by Kara Platoni

Social studies teachers Karl Atkins and Scott Deckelmann take on a very serious subject by giving their students a very amusing challenge: Win a computer game. In fact, students have to win PeaceMaker, a simulation of the Middle East peace process, twice -- once while playing as the Israeli prime minister and once as the Palestinian president.

In both cases, students must respond to a rapidly evolving political situation by choosing which actions -- building settlements, launching rockets, making speeches -- are most likely to broker peace. The Scappoose, Oregon, teachers have played PeaceMaker with more than a dozen sections of their freshman global-studies and junior international-relations classes, and they say gaming is an effective way to explore intricate political issues. Indeed, PeaceMaker is at the forefront of a movement -- often called serious games or social-issues games -- in which educators use games to illustrate complex social issues, from immigration to climate change.

"Games are largely misunderstood in our society. They aren't necessarily trivial or sophomoric. Gaming is just a young medium," says Suzanne Seggerman, president and cofounder of Games for Change, a resource and support clearinghouse for game developers, nonprofit organizations, and educators. "They're a great way for people to explore serious issues."

Better yet, they make that exploration fun, even addictive, according to Scappoose sophomore Ashley Amick, who played PeaceMaker at school last year. "I never wanted to go to my next class, because I hadn't won yet, and I wanted to see what would happen when I did," she explains. "We usually learn from textbooks or worksheets, but because you automatically learn while you play it, even my classmates that don't like school had fun."

Modeling the Real World

Social issues are by their nature complex and dynamic. Understanding them involves analyzing cause and effect, multiple viewpoints, and rapidly shifting scenarios. Games easily mirror this fluidity.

"The thing we get with games that is different from what we get with books or other media is that we are able to actually build models of relationships between the different moving parts of a system and let people mess around with them, let people experience what happens when they change one variable or when they introduce a different kind of behavior," says Ian Bogost, an associate professor of computational and digital media at the Georgia Institute of Technology. (Bogost is also an adviser to the Serious Games Summit at the annual Game Developers Conference, and he wrote the book Persuasive Games: The Expressive Power of Video Games.)

"Understanding something such as war or poverty or immigration demands understanding a whole range of different kinds of inputs and outputs," he adds. In other words, if you take an action in PeaceMaker, you'll soon find out what the other side thought of your input.

"It is very clearly active, not passive," says Deckelmann of the way his students use the game. "They are part of the game. They are helping determine the end of the story. They don't get to determine the end of a documentary. It's about them deciding what's important, as opposed to us telling them what is important. And it's allowing them to fail in a safe place where no one can shame them." Games teach almost entirely through trial and error, with few real-world consequences; if you mess up, you can always restart.

Read more:
Slavko Krunic, Instructor
via: Rick Lawrence, fairhaven 02

Tamara de Lempicka (1898-1980)
Succulent and Flask, 1941

Are jobs obsolete?

[ed.  Interesting thought piece.  However, it seems whatever direction we take it will have to come from the intersection of capitalism and politics, and at this point that suggests anarchy rather than enlightenment.]

by Douglas Rushkoff

We're living in an economy where productivity is no longer the goal, employment is. That's because, on a very fundamental level, we have pretty much everything we need. America is productive enough that it could probably shelter, feed, educate, and even provide health care for its entire population with just a fraction of us actually working.

According to the U.N. Food and Agriculture Organization, there is enough food produced to provide everyone in the world with 2,720 kilocalories per person per day. And that's even after America disposes of thousands of tons of crop and dairy just to keep market prices high. Meanwhile, American banks overloaded with foreclosed properties are demolishing vacant dwellings to get the empty houses off their books.

Our problem is not that we don't have enough stuff -- it's that we don't have enough ways for people to work and prove that they deserve this stuff.

Jobs, as such, are a relatively new concept. People may have always worked, but until the advent of the corporation in the early Renaissance, most people just worked for themselves. They made shoes, plucked chickens, or created value in some way for other people, who then traded or paid for those goods and services. By the late Middle Ages, most of Europe was thriving under this arrangement.

The only ones losing wealth were the aristocracy, who depended on their titles to extract money from those who worked. And so they invented the chartered monopoly. By law, small businesses in most major industries were shut down and people had to work for officially sanctioned corporations instead. From then on, for most of us, working came to mean getting a "job."

The Industrial Age was largely about making those jobs as menial and unskilled as possible. Technologies such as the assembly line were less important for making production faster than for making it cheaper, and laborers more replaceable. Now that we're in the digital age, we're using technology the same way: to increase efficiency, lay off more people, and increase corporate profits.

While this is certainly bad for workers and unions, I have to wonder just how truly bad it is for people. Isn't this what all this technology was for in the first place? The question we have to begin to ask ourselves is not how do we employ all the people who are rendered obsolete by technology, but how can we organize a society around something other than employment? Might the spirit of enterprise we currently associate with "career" be shifted to something entirely more collaborative, purposeful, and even meaningful?

Instead, we are attempting to use the logic of a scarce marketplace to negotiate things that are actually in abundance. What we lack is not employment, but a way of fairly distributing the bounty we have generated through our technologies, and a way of creating meaning in a world that has already produced far too much stuff.

The communist answer to this question was just to distribute everything evenly. But that sapped motivation and never quite worked as advertised. The opposite, libertarian answer (and the way we seem to be going right now) would be to let those who can't capitalize on the bounty simply suffer. Cut social services along with their jobs, and hope they fade into the distance.

But there might still be another possibility -- something we couldn't really imagine for ourselves until the digital era. As Jaron Lanier, a pioneer of virtual reality, recently pointed out, we no longer need to make stuff in order to make money. We can instead exchange information-based products.

Read more: 

[ed. Update:  As a bookend, here's an interesting non-thought piece:  Rush Limbaugh on Doug Rushkoff.]

Sextortion

by Nate Anderson

In the spring of 2009, a college student named Amy received an instant message from someone claiming to know her. Certainly, the person knew something about her—he was able to supply details about what her bedroom looked like and he had, improbably, nude photos of Amy. He sent the photos to her and asked her to have "Web sex" with him.

Instead, Amy contacted her boyfriend Dave, who had been storing the naked photos on his own computer. (Note: victim names have been changed in this story). The two students exchanged instant messages about Amy's apparent stalker, trying to figure out what had happened. Soon after the exchange, each received a separate threat from the man. He knew what they had just chatted about, he warned, and they were not to take their story to anyone, including the police.

Amy, terrified by her stalker's eerie knowledge, contacted campus police. Officers were dispatched to her room, where they took down Amy's story and asked her questions about the incident. Soon after, Dave received more threats from the stalker because Amy had gone to the police—and the stalker knew exactly what she had said to them.

Small wonder that, when the FBI later interviewed Amy about the case, she was "visibly upset and shaking during parts of the interview and had to stop at points to control her emotions and stop herself from crying." So afraid was Amy for her own safety that she did not leave her dorm room for a full week after the threats.

As for Dave, he suffered increased fear, anxiety, confusion, and anger; he later told a court that even his parents "had a hard time trusting anyone or even feeling comfortable enough to use a computer" after the episode.

Due in large part to the stress of the attack, Dave and Amy broke up.

But who had the mysterious stalker been? And how did he have access both to the contents of Dave's computer and to private discussions with police that Amy conducted in the privacy of her own room?

Read more:
Toshiaki Kato, Rapunzel

The King and I

by Ray Connolly

Elvis Presley changed my life. I’m old enough to admit it now. Actually he changed a lot of lives. That’s the point about him, the reason why we hear his name and see his face so often, why his record company still releases two or three albums of his songs every year, why his best work can still be given away with a newspaper looking for a sales boost, and why he is recognised by his first name as easily as anyone in the world. He’s been dead for 34 years, yet everyone knows about Elvis.

I first heard him in March 1956. I was 15, a schoolboy in a small town in Lancashire. He was like nothing on earth: nothing in my world, anyway. The word “teenage” barely existed. Once you were fully grown, you were expected to dress and talk and think like a younger version of your parents. In that austere, cautious, know-your-place moment, the sound of Elvis singing “Well, since my baby left me, well I’ve found a new place to dwell” struck like a lightning bolt. His voice was stark, ghostly, echoing. Paul McCartney still talks of that record, “Heartbreak Hotel”, as being musical “perfection”. Culturally it was something else—a birth cry, perhaps, although we didn’t yet know what was being born. Whatever it was, I was determined to be included.

“Heartbreak Hotel” was not the first rock hit in Britain. Bill Haley’s “Rock Around the Clock” had come out the previous year and started riots when it rang out in the film “Blackboard Jungle”, or so the papers said. Maybe, but not in the cinema I went to. “Rock Around the Clock” was sung by a pleasant, chubby, 30-year-old man with a chessboard jacket and a kiss curl who had stumbled on board a new trend. Entertaining as his Comets were, Haley’s music was beamed through a prism of early-onset middle age.

And then came Elvis, just 21, with his puppy-dog face, obscenely long hair for the time, and all the confidence of the idiot savant who had sucked in half a dozen musical styles, mixed them together and unwittingly created an idiom of his own. He even had a strange name: Elvis. We’d never heard of anyone called Elvis before. His detractors, which is to say just about everyone out of their teens, declared immediately that he was a flash in the pan who couldn’t sing.

It was more a case of them not being able to hear, because if Elvis could do nothing else he could sing—anything and everything. An untrained tenor with a pleading, urgent quality, he had an innate gift for musical communication. Over a billion records sold now attest to that. Before him, popular singers had been mainly bland and polished—variations on a theme of Perry Como, dressed in light-orchestral string arrangements. Elvis, backed by blue-collar, do-it-yourself instruments—guitar, bass and drums—sang with operatic emotion distilled through the blues artists he had heard on black radio stations in the South in the 1940s. His first musical ambition had been to join a gospel quartet: as a teenager in racially segregated Memphis, he stood in a visitors’ porch at a black church just to hear the singing. 

To an English boy who was just discovering John Steinbeck and William Faulkner, Elvis’s story was almost melodramatically romantic. Born in Tupelo, Mississippi, in 1935, he was a surviving twin whose stillborn brother had been buried in a cardboard box. When he was three, his father went to jail for doctoring and cashing a cheque from his landlord to pay for a pig. At 18, having never performed in public, he went into Sun Records in Memphis, a small company that did a sideline in private recordings, and paid $3.98 to make an acetate disc of “My Happiness”. Sun’s owner, Sam Phillips, saw his potential. Less than three years later, wearing sideboards which made him look like a trooper in the American civil war, Elvis was the most famous young man in the world.

Read more:

My Kailua

by Lawrence Downes

Walking to the beach with my family on a hot Kailua afternoon, let’s say 1972. My toy foam surfboard clip-clopping against my knees, towel scratching my neck, rubber slippers squeaking on steamy blacktop. Around the corner of Kuuala Street, across Kalaheo Avenue, then down the skinny beach path, hugging a cinderblock wall under a thick, shady row of octopus trees and bougainvillea. Footfalls echoing on packed dirt.

Coming out onto Kailua Bay. A field of impossible blue, sky down to water. Squinting in the brilliance of the broad, white crescent beach.

My father swimming, in long, lazy lines parallel to shore. My mother sitting on the sand. Me, pondering the choices: sand castles or sand balls — wet double handfuls smooth-coated with dry sand into hard, sugar-dusted spheres; such a pity to have to whip one at your brother.

Kailua. The guidebooks say it’s basically a beach. But there’s a town wrapped around the beach, and, around that, a whole other side of the 600-square-mile island of Oahu — the windward side, a world away from Honolulu. Kailua is barely half an hour from downtown and Waikiki, but separated by a soaring ribbon of razorback mountains, the Koolau Range. The green lava wall is pierced near its summit by two sets of highway tunnels, like airlocks in time and space. The Honolulu side is dry and sunny, its postcard loveliness folded among high-rises, offices, airport and freeways. The Kailua side, where I grew up, is greener, quieter, lower and slower, with marshes and palms and that perfect bay.

The windward Oahu I know best is three communities: Kailua, Lanikai and their next-door country cousin, Waimanalo. They’re beachy but not snooty. Kailua has a downtown but no night life to speak of. It’s less a spot for touristic stimulation than a place you nestle into, as Hawaiian royalty once did, escaping dusty Honolulu since long before King Kamehameha’s day.

Two Beatles, John and George, mobbed in Waikiki, fled there once, in 1964. They were discovered, and so, eventually, was Kailua, although it and the rest of windward Oahu have managed to keep a reasonably low profile on Hawaii’s well-worn tourist map.

That may be changing, especially now that President Obama has claimed Kailua as his. He grew up in Honolulu, but Kailua is where he returns. This is his place called Hope, his San Clemente, his Texas hill country. Every winter the Obamas stay at the same rented house at one end of the crescent bay, whose waters he knows from boyhood, as he wrote in his memoir, “Dreams From My Father”:

“I still remember how, one early morning, hours before the sun rose, a Portuguese man to whom my grandfather had given a good deal on a sofa set took us out to spear fish off Kailua Bay. A gas lantern hung from the cabin on the small fishing boat as I watched men dive into inky-black waters, the beams of their flashlights glowing beneath the surface until they emerged with a large fish, iridescent and flopping at the end of one pole. Gramps told me its Hawaiian name, humu-humu-nuku-nuku-apuaa, which we repeated to each other the entire way home.”

In this story, either Gramps or young Barack was mistaken, since the humuhumu is a little reef fish, barely six inches long. But let’s give Gramps a break on his fish names, and allow Barack his childhood lens of magnified wonderment: Hawaiians do still fish here with spears and nets, often in darkness, and are done by dawn.

Read more:

Wednesday, September 7, 2011

Bernard Fleetwood-Walker (1893-1965) - “Amity”

Adventures in Marketing

[ed.  I'd be pissed.]

by Andrew Adam Newman

In August, food bloggers and mom bloggers in New York were invited to dine at an underground restaurant in a West Village brownstone run, apparently, by George Duran, the chef who hosts the “Ultimate Cake Off” on TLC.

Sotto Terra, the invitation said, was “an intimate Italian restaurant” where attendees would enjoy a “delicious four-course meal,” Mr. Duran’s “one-of-a-kind sangria,” and learn about food trends from a food industry analyst, Phil Lempert. The invitation continued that upon confirming — for one of five evenings beginning Aug. 23 — bloggers would receive an extra pair of tickets as a prize for readers and that the dinner would include “an unexpected surprise.”

The surprise: rather than being prepared by the chef, the lasagna they were served was Three Meat and Four Cheese Lasagna by Marie Callender’s, a frozen line from ConAgra Foods. Hidden cameras at the dinners, which were orchestrated by the Ketchum public relations unit of the Omnicom Group, captured reactions to the lasagna and to the dessert, Razzleberry Pie, also from Marie Callender’s.

“Our intention was to really have a special evening in a special location with Chef George Duran,” said Stephanie Moritz, senior director of public relations and social media at ConAgra.

“The twist at the end was not dissimilar with what brands like Pizza Hut and Domino’s have done in the recent past with success,” she said, referring to hidden-camera advertising campaigns. ConAgra expected to use the footage for promotional videos on YouTube and its Web site, and for bloggers to generate buzz when they wrote about being pleasantly surprised.

But it was the marketers, not the diners, who were in for the biggest surprise.

Read more:
Kevin Chupik

The Girl From Trail's End

by Kathy Dobie

Three teenagers were clustered around the cell phone, heads almost touching as they peered at the video. "Eww...that's nasty." A surge of excitement, of almost electric disgust, passed between them. It was Monday after the long Thanksgiving weekend. The Texas morning was warm and overcast, the air spongy. In the cafeteria of Cleveland High School, the students jockeyed with one another to get a better view of the tiny screen. They could see a naked girl lying on a mattress. A guy moving on top of her. A wall of legs surrounded the couple, like a slatted fence. The faces of the others in the room weren't visible, only their legs and feet, shifting impatiently. It looked like there were eight to ten guys watching the girl, watching and waiting their turn. Each time the guys switched places, another face was revealed—some of them were boys in their school. (One later told a female classmate that he'd stuck a beer bottle into the girl.) Others were older and unfamiliar. But as the video flew from phone to phone that day, almost everyone recognized the girl on the mattress—that long ink black hair, the brown eyes and baby cheeks. She was a sixth grader from the middle school next door. An 11-year-old.

Two and a half months later, the arrests began. On February 18, four Cleveland men were picked up and charged with "continuous sexual abuse" of a child. In court documents, she was referred to as "Regina D. Stewart," a pseudonym. Over the next three weeks, fifteen more men and boys, ranging in age from 16 to 27, were indicted for "aggravated sexual abuse" of a child, bringing the total number of defendants to nineteen. Nineteen men and boys who, if the charges were true, had gathered in a place where no one lived but them—no police, no girlfriends, no fathers, no mothers or grandmothers—and what was wrong became, if not exactly right, then all right. They would all plead not guilty.

Even before the arrests, the press descended on this East Texas town of 8,000 where half the population is white, a quarter is black, and a quarter, Hispanic. Located just forty-five miles north of Houston, Cleveland is both rural and citified. Families keep chickens and donkeys while fast-food restaurants pull in traffic off the main drag. The crime rate is high, the faces are friendly, and the air smells of crispy chicken, toasted ancho peppers, fresh-cut grass, manure, truck fumes, lilacs, pine sap, and mud.

By mid-December, TV-news trucks were gathered outside the high school. Reporters ducked into pews at church services and, notebooks in hand, grimly worked the playgrounds. Each new development brought another wave of media attention. Frustrated, a Cleveland teenager posted on Facebook, "man yall y r we still on the fuckin news they need to let that shit go." The story, already red-hot, became inflammatory when it was reported that all of the suspects were black and the victim Hispanic. Friends and relatives of the men and boys were quoted defending them and blaming the girl, who they said acted much older than 11, wearing makeup and sexy clothes. They speculated that she had probably lied about her age, so how were the males to know? The New York Times was roundly castigated for its "rape-friendly" coverage of the assault, which was heavy on sympathetic quotes about the defendants and uncritical of malicious comments about the victim. After receiving tens of thousands of readers' complaints, the Times took the extraordinary step of sending its reporter back to Cleveland for a do-over, and the media began to cover its own coverage. Clearly no one was planning to "let that shit go" anytime soon.

Read more:

Annoying? Yoga? Surely Not

by Sarah Miller

For some it's an ancient path to health and enlightenment. For others it's utterly infuriating. And I should know – I'm an instructor.

In addition to being somewhat crazy – a shrink once diagnosed me with borderline personality disorder, which I thought was a bit of a stretch until I realised that, like everyone else, he just wanted to have sex with me – I am a yoga teacher. Should you, recoiling in horror as you read this, find yourself asking, "But how does someone like this become a yoga teacher?", the short answer is that I gave a man with a beard and his hot wife $3,200. The long answer is … well, I'd like to say that it's because if I hadn't become obsessed with yoga I'd probably be dead, because that's what people always say about things like this. But that would be, frankly, a little overdramatic. Let's just say that if I didn't do yoga everything bad about me would just be worse, and what is bad is already bad enough.

Now, because you can't get something for nothing, there's a problem: yoga can be extremely annoying. There's no getting around it. Yoga has moments of such profound annoyingness that after I finished Eat, Pray, Love (I read the ashram section 100 times) all I could think was: "You wrote an entire book about yoga and meditation and you never mentioned, 'Oh, by the way, sometimes you will want to punch these people in the face'."

And this is where I perform my public service; in yoga we call that a seva (how annoying is that?). All the stuff Elizabeth Gilbert was too high on homemade pizza and Javier Bardem penis to mention, you need to know. Everyone's always telling you how great yoga is, and that's true, but then you go and maybe the studio smells like onions steamed in cat pee, and it might have been helpful to know about that beforehand.

You need to know exactly what will disturb you before you get there, so you can prepare; and you should also know that, even though everyone around you will seem perfectly unperturbed, someone feels your pain. Oh, and by the way, I want to underscore that what follows below is what bugs me about yoga; everything else is a glittering gift from Lord Shiva. Namaste!

Read more:

Cost of Doing Business

by Barry Estabrook

Given that Salmonella-contaminated ground turkey produced by Cargill, Inc. had already sickened more than 100 people and killed one, William Marler’s offer to the Minneapolis-based company early last month seemed worth considering: Regularly test your meat for antibiotic-resistant Salmonella, and I won’t sue you.

Suing corporations that sicken their customers is something Marler does often and well. He is a Seattle trial attorney whose firm, Marler Clark, specializes in representing victims of food poisoning. It’s proven to be a lucrative specialty. Marler has won more than $600 million for his clients over the past two decades. A good chunk of that money has come from Cargill, which, according to Marler, has had four outbreaks of resistant Salmonella in its facilities in the last 10 years.

Marler also knows a fair bit about self-promotion. His offer was obviously designed to draw public attention to his firm, which represents about two dozen victims of the most recent Cargill outbreak. But in addition, Marler hoped that his overture would shine light on one of the most gaping holes in the tattered safety net that is supposed to keep our food supply safe.

Astoundingly, under current United States Department of Agriculture (USDA) rules, it’s perfectly okay for companies to sell meat to the public that is contaminated with Salmonella and other disease-causing bacteria.

Although the USDA stipulates that meat and poultry containing “adulterants” cannot be sold, it recognizes only one bug—E. coli O157:H7—as an adulterant, even though Salmonella, Listeria, Campylobacter, and many strains of E. coli have also sickened or killed people. In a twist of logic that would baffle anyone other than a bureaucrat, these potentially lethal bacteria achieve official adulterant status after the fact—only in specific instances when they actually make people sick. “Then they magically become adulterants,” said Marler in an interview.

Since the USDA decreed that E. coli O157:H7 was an adulterant in 1994 and required companies to test for the bug and to cook any positive samples before distributing them to consumers, Marler has noticed a dramatic drop in the outbreaks of illness caused by E. coli-tainted ground meat. “Prior to that, 90 percent of our firm’s revenue came from E.-coli cases linked to hamburger,” said Marler. “That’s virtually disappeared—with one little act.”

Tuesday, September 6, 2011

Facebook Existentialism

Paradise

Fiction
by David Guterson

They went in late September, starting out on I-5, which she handled by staying in the right lane with ample braking distance, keeping her hands at 9 and 3 on the wheel, and disdaining speeders and tailgaters. No problem there—he found her driving style charming enough. She was a silver beauty in a dark blue Honda Element—one of those boxy, hip-to-be-square cars—with nearly inaudible public-radio chatter on fade, and all of that was fine too. She wore a jean jacket with mother-of-pearl buttons, an ironed pastel skirt, and suede-laced sandals. Her eyes were green, her smile was warm, and she didn’t talk just to fill space. She seemed self-sufficient but not cold about it. In her politics, she was not so liberal as to be obnoxious, but not so conservative as to suggest one-upmanship. She didn’t pretend to be an organic farmer, kitchen goddess, world traveler, yoga master, artist, or humanitarian; neither was she reactionary with regard to those personas. She was green but not gloomy and, while not indifferent to approaching 60, not obsessed by it either. She had a good sense of humor—quiet and subtle. She didn’t expect to live forever through exercise and a healthy diet. She understood that he was still in the aftermath—damaged goods—without making his condition central to the way she treated him. In short, he wasn’t disenchanted. But he still expected to be.

How had this happened—this trip to Paradise? Via Match.com, that was the simple answer. The idea that he would need Match.com—he wouldn’t have predicted it, hadn’t seen that he would go there. But Match.com was what people did now, and actually it made sense. It saved single people trouble and grief, decreased their disappointments and misunderstandings. Digitized, you put yourself out there, minus the pretense that it was other than what it was. You cut to the chase without preliminaries. And the people you met were just like you—they’d also resorted to Match.com—so you didn’t have to feel embarrassed, really, unless you wanted to do that together and mutually laugh at yourselves.

They’d skipped that step—the self-loathing self-punctures—opting instead for straightforwardness in a wine bar, where he told her immediately about his wife, and she told him about her former husband, long remarried. He described his children—a boy out of college and a girl still in, both thousands of miles from him—and she described her energetic twin sons, who’d found good marriage partners, stayed in Seattle, and started a successful business together selling “hand-forged” doughnuts. He knew about her work from her Match.com profile, but asked about it anyway, as a matter of course: sociology at Seattle University and research, right now, on social networks and epidemiology. His turn arrived: commercial litigation. Specializing in securities fraud. What exactly was securities fraud? And so they got through their first date.

Read more:

Monday, September 5, 2011

The Style Council & Tracey Thorn


Beautiful film photography by Willy Ronis, Henri Cartier-Bresson, Robert Doisneau and Elliott Erwitt.