Thursday, April 3, 2014
Literacy Is Knowledge
Math is relentlessly hierarchical—you can’t understand multiplication, for example, if you don’t understand addition. Reading is mercilessly cumulative. Virtually everything a child sees and hears, in and out of school, contributes to his vocabulary and language proficiency. A child growing up in a book-filled home with articulate, educated parents who fill his early years with reading, travel, museum visits, and other forms of enrichment arrives at school with enormous advantages in knowledge and vocabulary. When schools fail to address gaps in knowledge and language, the deficits widen—a phenomenon that cognitive scientist Keith Stanovich calls the “Matthew Effect,” after a passage in the Gospel of Matthew: “For unto every one that hath shall be given, and he shall have abundance: but from him that hath not shall be taken away even that which he hath.” The nature of knowledge and vocabulary acquisition all but assures that children raised in language-rich homes gain in reading comprehension, while the language-poor fall further behind (see “A Wealth of Words,” Winter 2013). “The mainspring of [reading] comprehension is prior knowledge—the stuff readers already know that enables them to create understanding as they read,” explains Daniel Willingham, a cognitive scientist at the University of Virginia.
To make matters worse, most reading curricula have focused on developing generalized, all-purpose reading-comprehension “skills” uncoupled from subject-specific knowledge—reducing a complex cognitive process to a collection of all-purpose “reading strategies” to be applied to any book or bit of text that a student might encounter. Attempts to teach reading comprehension as knowledge-neutral put an enormous premium on student engagement. For teachers, reading instruction can often feel more like cheerleading: sell kids on the magic of books, get them to read a lot, and—voilĂ !—they will emerge as verbally adroit adults with a lifelong love of reading. As generations of results show, this approach doesn’t work. (...)
Reading comprehension, like critical thinking and problem solving, is what psychologists call “domain-specific”: you need to know something about a topic to be able to think about it. Faced with a text passage about the customs of New Amsterdam, the student familiar with the topic may breeze through with relative ease. For the student who has no idea who the Dutch were, or is unfamiliar with early New York history or has never heard the word “custom,” the passage is a verbal minefield. To shift metaphors, a piece of text is like a tower of wooden blocks, with each block a vocabulary word or a piece of background knowledge. Pull out two or three blocks, and the tower can still stand. Pull out too many, and it collapses.
Imagine taking a child to his first baseball game. If you know baseball, you will easily explain what’s happening. You draw the child’s attention to the most important actions on the field, reflexively tailoring your explanation to the child’s level of understanding. If the child knows nothing about baseball, you might explain the basics: what the pitchers and batters are doing. Balls and strikes. Scoring a run when a player makes it all the way around the bases without being called out. You’d explain what an “out” is. If the child knows the game or plays Little League, you might instead draw his attention to game strategy. Would a bunt or a stolen-base attempt be the best move at a crucial moment? You might point out when the infielders move in, hoping for a double play.
Now imagine attending a cricket match and doing the same thing, assuming that you know nothing about the game. Your knowledge of baseball doesn’t transfer to cricket, though both games feature balls, bats, and runs. “Sports comprehension strategies,” if such existed, would be of no use. Your ability to make sense of what’s happening in front of you and to explain it to a child depends on your knowledge of the specific game—not your ability to connect what you notice to other games that you understand. The same is true of reading. Even if you aced the verbal portion of your SATs, you will find yourself in situations where you are not an excellent reader. You might struggle to make sense of a contract, say, or a new product warranty. Your tech-savvy teenage daughter might have an easier time understanding the instructions for upgrading a computer operating system. You didn’t suddenly become a poor reader in these situations; you’re merely reading out of your depth.
Reading comprehension, then, is not a skill that you teach but a condition that you create. Teachers foster that condition by exposing children to the broadest possible knowledge of the world outside their personal experience. As Daniel Willingham aptly titled one of his instructional YouTube videos a few years ago, “Teaching content is teaching reading.”
The specific body of knowledge that students need for broad reading competence is open to debate, but a useful guideline is to emphasize the common body of knowledge—from basic knowledge of history and science to works of art and literature—that most literate Americans know, as reflected in their speech and writing. This has been the precise aim of E. D. Hirsch’s Core Knowledge movement. Hirsch’s critics have often accused him of attempting to impose a rigid canon, but Core Knowledge is better understood as an attempt to curate and impart the basic knowledge of history, science, and the arts that undergirds literate speech and writing. Regardless of whether schools adopt the Core Knowledge approach or develop their own catalog of essential knowledge, knowledge acquisition belongs at the heart of literacy instruction.
by Robert Pondiscio, City Journal | Read more:
Image: Henri Matisse’s portrait of his daughter reading
Fire TV, and Amazon's Commitment to Consumption
Amazon has unveiled a new device for your television. It’s called Amazon Fire TV. In the industry, it’s known as a set-top box. It’s black, about the size of a ham sandwich, and extremely powerful. It has “over 3x the processing power of Apple TV, Chromecast, or Roku 3,” according to Amazon’s press release, “plus 4x the memory of Apple TV, Chromecast, or Roku 3 for exceptional speed and fluidity.” Your Fire TV “arrives pre-registered,” which means that after you plug it into your HDTV and connect it to your WiFi, you are immediately ready to consume hundreds of thousands of movies, TV episodes, songs, and video games in 1080p HD video and Dolby Digital Plus surround sound, without ever getting up from your chair.

Your Fire TV has an Advance Streaming and Prediction feature that will record data from your Watchlist and personalized recommendations, deduce your preference for soft-core teen comedy flicks, and automatically buffer “Virgin High” for playback “before you even hit play,” so that you can watch it the instant you admit to yourself that you want to, as you inevitably will. Like Amazon’s patented anticipatory-shipping technology—which, one day, might use your shopping history to place products on trucks near your location before you’ve even thought about buying them—Advance Streaming and Prediction, or A.S.A.P., knows more about your habits and desires than you do. (...)
Convenience, selection, price. As James McQuivey, an analyst with Forrester, told the Times, “Amazon has a vested interest in making sure it is present at every moment of possible consumption, which is all the time. It wants to get into the television screen and start to build a relationship.” Streaming devices are revolutionizing television just as, six years ago, the Kindle revolutionized books. Just as the Kindle is designed to be a portal that brings readers into a permanent relationship with the Amazon universe, Fire TV will do the same for television viewers, who, according to researchers, tend to be binge consumers, with even shorter attention spans and more compulsive shopping habits than book buyers, making them the ideal customers for “Earth’s most customer-centric company.”
by George Packer, New Yorker | Read more:
Image: Diane Bondareff/Invision for Amazon/AP
The Great Divide
“The program you are about to see is ‘All in the Family.’ It seeks to throw a humorous spotlight on our frailties, prejudices and concerns. By making them a source of laughter, we hope to show—in a mature fashion—just how absurd they are.”
This nervous disclaimer, which was likely as powerful as a “Do not remove under penalty of law” tag on a mattress, ran over the opening credits of Norman Lear’s new sitcom. It was 1971, deep into the Vietnam War and an era of political art and outrage, but television was dominated by escapist fare like “Bewitched” and “Bonanza.” “All in the Family” was designed to explode the medium’s taboos, using an incendiary device named Archie Bunker. A Republican loading-dock worker living in Queens, Bunker railed from his easy chair against “coons” and “hebes,” “spics” and “fags.” He yelled at his wife and he screamed at his son-in-law, and even when he was quiet he was fuming about “the good old days.” He was also, as played by the remarkable Carroll O’Connor, very funny, a spray of malapropisms and sly illogic. (...)
Yet, as Saul Austerlitz explains in his smart new book, “Sitcom: A History in 24 Episodes from ‘I Love Lucy’ to ‘Community,’ ” Lear’s most successful character managed to defy his creator, with a “Frankenstein”-like audacity. “A funny thing happened on the way to TV immortality: audiences liked Archie,” Austerlitz writes. “Not in an ironic way, not in a so-racist-he’s-funny way; Archie was TV royalty because fans saw him as one of their own.”
This sort of audience divide, not between those who love a show and those who hate it but between those who love it in very different ways, has become a familiar schism in the past fifteen years, during the rise of—oh, God, that phrase again—Golden Age television. This is particularly true of the much lauded stream of cable “dark dramas,” whose protagonists shimmer between the repulsive and the magnetic. As anyone who has ever read the comments on a recap can tell you, there has always been a less ambivalent way of regarding an antihero: as a hero. Some of the most passionate fans of “The Sopranos” fast-forwarded through Carmela and Dr. Melfi to freeze-frame Tony strangling a snitch with electrical wire. (David Chase satirized their bloodlust with a plot about “Cleaver,” a mob horror movie with all of the whackings, none of the Freud.) More recently, a subset of viewers cheered for Walter White on “Breaking Bad,” growling threats at anyone who nagged him to stop selling meth. In a blog post about that brilliant series, I labelled these viewers “bad fans,” and the responses I got made me feel as if I’d poured a bucket of oil onto a flame war from the parapets of my snobby critical castle. Truthfully, my haters had a point: who wants to hear that they’re watching something wrong?
But television’s original bad-fan crisis did not, as it happens, concern a criminal bad boy, or even take place on a drama. It involved Norman Lear’s right-wing icon, Archie Bunker, the loudmouthed buffoon who became one of TV’s most resonant and beloved characters. Archie was the first masculine powerhouse to simultaneously charm and alienate viewers, and, much like the men who came after him, he longed for an era when “guys like us, we had it made.” O’Connor’s noisy, tender, and sometimes frightening performance made the character unforgettable, but from the beginning he was a source of huge anxiety, triggering as many think pieces as Lena Dunham. Archie represented the danger and the potential of television itself, its ability to influence viewers rather than merely help them kill time. Ironically, for a character so desperate to return to the past, he ended up steering the medium toward the future.
by Emily Nussbaum, New Yorker | Read more:
Image: Joanna Neborsky
How Self-Appointed Experts Rule the Autograph Industry
At his cluttered kitchen table in Myrtle Beach, South Carolina, surrounded by hundreds of trading cards and discarded foil wrappers, Steve Sterpka finally found what he'd been looking for.
The CVS Pharmacy manager had sifted through 15 boxes of Upper Deck baseball cards, hoping to encounter one of the coupons for rare collectibles the company randomly inserted to entice customers. In this case, Sterpka was after the signature of a famous historical figure — George Washington, maybe, or Babe Ruth — that had been paired with a single lock of the person's hair. One collector fortunate enough to score an Abraham Lincoln sold it at auction for $24,000.
The odds were not in Sterpka's favor: Only 10 of the Hair Cut Signatures were available. He'd spent $1,500 to purchase a case of 768 cards. With just 48 remaining, it appeared to be a lost cause.
Then he saw it: a card redeemable for Charles Lindbergh's signature and a strand of the famous aviator's hair.
Oh, my God, he thought. I can't believe what I've got in front of me.
He contacted Upper Deck. The company sent him a 2.5-by-3.5-inch piece of cardboard featuring Lindbergh's scrawl and a follicular sample. The back of the tiny treasure congratulated its new owner:
"You have received a trading card with an [sic] historical strand of Charles Lindbergh's hair that includes an autograph of Charles Lindbergh. The memorabilia was certified to us as belonging to Charles Lindbergh. The cut autograph was independently authenticated by a third party authenticator."
That last bit of language is where Sterpka's problems started.
Today, few autographs are bought or sold without the blessing of either Professional Sports Authenticator (PSA) or its competitor, James Spence Authentication (JSA). The two companies have come to dominate the market, verifying hundreds of thousands of signatures each year.
Business is so good that they use garbage cans to hold the cash they collect from reviews at hobby conventions. EBay, the world's largest facilitator of memorabilia auctions, endorses both companies to its customers. Nothing seems beyond the scope of their expertise, from Frank Sinatra's scrawl to baseballs defaced by Mickey Mantle.
by Jake Rossen, Dallas Observer | Read more:
Image: Matthew Billington
Wednesday, April 2, 2014
Low Vitamin D Levels Linked to Disease in Two Big Studies
People with low vitamin D levels are more likely to die from cancer and heart disease and to suffer from other illnesses, scientists reported in two large studies published on Tuesday.
The new research suggests strongly that blood levels of vitamin D are a good barometer of overall health. But it does not resolve the question of whether low levels are a cause of disease or simply an indicator of behaviors that contribute to poor health, like a sedentary lifestyle, smoking and a diet heavy in processed and unhealthful foods.
Nicknamed the sunshine nutrient, vitamin D is produced in the body when the skin is exposed to sunlight. It can be obtained from a small assortment of foods, including fish, eggs, fortified dairy products and organ meats, and vegetables like mushrooms and kale. And blood levels of it can be lowered by smoking, obesity and inflammation.
Vitamin D helps the body absorb calcium and is an important part of the immune system. Receptors for the vitamin and related enzymes are found throughout cells and tissues of the body, suggesting it may be vital to many physiological functions, said Dr. Oscar H. Franco, a professor of preventive medicine at Erasmus Medical Center in the Netherlands and an author of one of the new studies, which appeared in the journal BMJ.
“It has effects at the genetic level, and it affects cardiovascular health and bone health,” he said. “There are different hypotheses for the factors that vitamin D regulates, from genes to inflammation. That’s the reason vitamin D seems so promising.”
The two studies were meta-analyses that included data on more than a million people. They included observational findings on the relationship between disease and blood levels of vitamin D. The researchers also reviewed evidence from randomized controlled trials — the gold standard in scientific research — that assessed whether taking vitamin D daily was beneficial. (...)
“We are talking about a large part of the population being affected by this,” he said. “Vitamin D could be a good route to prevent mortality from cardiovascular disease and other causes of mortality.”
by Anahad O'Connor, NY Times | Read more:
Image: Lawrence Lool/European Pressphoto Agency
Tuesday, April 1, 2014
The Wolf Hunters of Wall Street
Before the collapse of the U.S. financial system in 2008, Brad Katsuyama could tell himself that he bore no responsibility for that system. He worked for the Royal Bank of Canada, for a start. RBC might have been the fifth-biggest bank in North America, by some measures, but it was on nobody’s mental map of Wall Street. It was stable and relatively virtuous and soon to be known for having resisted the temptation to make bad subprime loans to Americans or peddle them to ignorant investors. But its management didn’t understand just what an afterthought the bank was — on the rare occasions American financiers thought about it at all. Katsuyama’s bosses sent him to New York from Toronto in 2002, when he was 23, as part of a “big push” for the bank to become a player on Wall Street. The sad truth was that hardly anyone noticed it. “The people in Canada are always saying, ‘We’re paying too much for people in the United States,’ ” Katsuyama says. “What they don’t realize is that the reason you have to pay them too much is that no one wants to work for RBC. RBC is a nobody.”

For his first few years on Wall Street, Katsuyama traded U.S. energy stocks and then tech stocks. Eventually he was promoted to run one of RBC’s equity-trading groups, consisting of 20 or so traders. The RBC trading floor had a no-jerk rule (though the staff had a more colorful term for it): If someone came in the door looking for a job and sounding like a typical Wall Street jerk, he wouldn’t be hired, no matter how much money he said he could make the firm. There was even an expression used to describe the culture: “RBC nice.” Although Katsuyama found the expression embarrassingly Canadian, he, too, was RBC nice. The best way to manage people, he thought, was to persuade them that you were good for their careers. He further believed that the only way to get people to believe that you were good for their careers was actually to be good for their careers.
His troubles began at the end of 2006, after RBC paid $100 million for a U.S. electronic-trading firm called Carlin Financial. In what appeared to Katsuyama to be undue haste, his bosses back in Canada bought Carlin without knowing much about the company or even electronic trading. Now they would receive a crash course. (...)
As it happened, at almost exactly the moment Carlin Financial entered Brad Katsuyama’s life, the U.S. stock market began to behave oddly. Before RBC acquired this supposed state-of-the-art electronic-trading firm, Katsuyama’s computers worked as he expected them to. Suddenly they didn’t. It used to be that when his trading screens showed 10,000 shares of Intel offered at $22 a share, it meant that he could buy 10,000 shares of Intel for $22 a share. He had only to push a button. By the spring of 2007, however, when he pushed the button to complete a trade, the offers would vanish. In his seven years as a trader, he had always been able to look at the screens on his desk and see the stock market. Now the market as it appeared on his screens was an illusion.
by Michael Lewis, NY Times | Read more:
Image: Stefan Ruiz
Thursday, March 27, 2014
What We Talk About When We (Don't) Talk About Dying
It will be okay, I finally said. And when she finally fell asleep I watched her and remembered all the times she told me, patient and comforting, that it would be okay. When I didn’t want her to leave my sight in a shopping mall. Or the times I got nervous before a grade-school field trip. Or when I was sick and needed to take medicine, back in the days when it actually tasted like medicine. Or when I woke up in the middle of the night, not old enough to know what a nightmare was but young enough to call out for the one person who always came. It will be okay, she always said, and I always believed her.
My mother always told me what I needed to hear and I gradually came to understand—and appreciate—that none of these things were a matter of life and death. Eventually I acknowledged—and accepted—that it would be okay, because when your mother tells you this, she knows it’s the truth. She wouldn’t say it unless she believed it, so I believed her.
You each get older and learn to recognize the things you can control and the things you can’t. You gain perspective and experience and grasp that life goes on no matter how you wonder and worry. You might get sick and you might need reassurance but that’s all part of the process, another step in your journey. You adapt and endure because it always gets better. You remind yourself: it’s not a matter of life and death.
And so on.
So what can you say when, one day, it becomes a matter of life and death? What do you do when the person crying in the bed is looking to you for reassurance? How do you proceed when the person who always calmed you down is shuddering with fear and afraid to be alone? What else is left when actions have failed and, for the first time, even words are incapable of offering consolation? You tell your mother it will be okay. You do this because there’s nothing else left to do. You say it will be okay because you know it won’t and you still hope she’s able to believe you. (...)
It gets very quiet while time and place and the guarded feelings that enable us to function all fall away and you concentrate every thought into one simple, implausible objective: peace. You think it and you will it and for a moment that might be forever you become it in ways you’re never able to talk about later, even if you are inclined (and you aren’t, especially). You shiver but are calm; you are entirely in the present tense yet you are also somewhere else, somewhere deeper inside that, somehow, connects you to everything else you’ve ever known.
It will be okay, you whisper, actually believing this because it’s not even your own voice you hear. You don’t know if this is you, or your mind, or the actualization of that other place (you are hazily aware) you have managed to access, understanding it’s not anything you can anticipate or comprehend even though you’ve been preparing for it (you realize, abruptly) as far back as you can remember.
It’s okay, you say, and maybe your vision is blurred or your eyes are closed, or probably you’re seeing more clearly than ever before, but now you recognize this voice and, as you look down at eyes that can no longer see you, understand, finally, that you’re talking to yourself.
by Sean Murphy, The Weeklings | Read more:
Image: uncredited
Wednesday, March 26, 2014