Wednesday, July 11, 2018

Where Millennials Come From

And why we insist on blaming them for it.

Imagine, as I often do, that our world were to end tomorrow, and that alien researchers many years in the future were tasked with reconstructing the demise of civilization from the news. If they persevered past the coverage of our President, they would soon identify the curious figure of the millennial as a suspect. A composite image would emerge, of a twitchy and phone-addicted pest who eats away at beloved American institutions the way boll weevils feed on crops. Millennials, according to recent headlines, are killing hotels, department stores, chain restaurants, the car industry, the diamond industry, the napkin industry, homeownership, marriage, doorbells, motorcycles, fabric softener, hotel-loyalty programs, casinos, Goldman Sachs, serendipity, and the McDonald’s McWrap.

The idea that millennials are capriciously wrecking the landscape of American consumption grants quite a bit of power to a group that is still on the younger side. Born in the nineteen-eighties and nineties, millennials are now in their twenties and thirties. But the popular image of this generation—given its name, in 1987, by William Strauss and Neil Howe—has long been connected with the notion of disruptive self-interest. Over the past decade, that connection has been codified by Jean Twenge, a psychology professor at San Diego State University, who writes about those younger than herself with an air of pragmatic evenhandedness and an undercurrent of moral alarm. (An article adapted from her most recent book, “iGen,” about the cohort after millennials, was published in the September issue of The Atlantic with the headline “Have Smartphones Destroyed a Generation?” It went viral.) In 2006, Twenge published “Generation Me: Why Today’s Young Americans Are More Confident, Assertive, Entitled—and More Miserable Than Ever Before.” The book’s cover emblazoned the title across a bare midriff, a flamboyant illustration of millennial self-importance, sandwiched between a navel piercing and a pair of low-rise jeans.

According to Twenge, millennials are “tolerant, confident, open-minded, and ambitious, but also disengaged, narcissistic, distrustful, and anxious.” She presents a barrage of statistics in support of this assessment, along with anecdotal testimonials and pop-cultural examples that neatly confirm the trends she identifies. (A revised edition, published in 2014, mentions the HBO show “Girls” six times.) Twenge acknowledges that the generation has come of age inside an “economic squeeze created by underemployment and rising costs,” but she mostly explains millennial traits in terms of culture and choice. Parents overemphasized self-esteem and happiness, while kids took their cues from an era of diversity initiatives, decentralized authority, online avatars, and reality TV. As a result, millennials have become irresponsible and fundamentally maladjusted. They “believe that every job will be fulfilling and then can’t even find a boring one.” They must lower their expectations and dim their glittering self-images in order to become functional adults.

This argument has a conservative appeal, given its focus on the individual rather than on the structures and the conditions that govern one’s life. Twenge wonders, “Is the upswing in minority kids’ self-esteem an unmitigated good?” and then observes, “Raising children’s self-esteem is not going to solve the problems of poverty and crime.” It’s possible to reach such moralizing conclusions even if one begins with the opposite economic premise. In “The Vanishing American Adult,” published in May, Senator Ben Sasse, Republican of Nebraska, insists that we live in a time of generalized “affluenza,” in which “much of our stress now flows not from deprivation but, oddly, from surplus.” Millennials have “far too few problems,” he argues. Sasse chastises parents for allowing their kids to succumb to the character-eroding temptations of contemporary abundance and offers suggestions for turning the school-age generation into the sort of hardworking, financially independent grownups that the millennials have yet to become.

The image of millennials has darkened since Strauss and Howe walked the beat: in their 2000 book, “Millennials Rising,” they claimed that the members of this surging generation were uniquely earnest, industrious, and positive. But the decline in that reputation is hardly surprising. Since the nineteen-sixties, most generational analysis has revolved around the groundbreaking idea that young people are selfish. Twenge’s term for millennials merely flips an older one, the “me generation,” inspired by a 1976 New York cover story by Tom Wolfe about the baby boomers. (The voluble Wolfe, born in 1930, is a member of the silent generation.) Wolfe argued that three decades of postwar economic growth had produced a mania for “remaking, remodeling, elevating, and polishing one’s very self . . . and observing, studying, and doting on it.” The fear of growing selfishness has, in the forty years since, only increased.

That fear is grounded in concrete changes: the story of American self-interest is a continuous one that nonetheless contains major institutional and economic shifts. Adapting to those shifts does tend to produce certain effects. I was born smack in the middle of the standard millennial range, and Twenge’s description of my generation’s personality strikes me as broadly accurate. Lately, millennial dreams tend less toward global fame and more toward affordable health insurance, but she is correct that my cohort has grown up under the influence of novel and powerful incentives to focus on the self. If for the baby boomers self-actualization was a conscious project, and if for Gen X—born in the sixties and seventies—it was a mandate to be undermined, then for millennials it’s more like an atmospheric condition: inescapable, ordinary, and, perhaps, increasingly toxic. A generation has inherited a world without being able to live in it. How did that happen? And why do so many people insist on blaming them for it?

“Kids These Days: Human Capital and the Making of Millennials,” by Malcolm Harris (Little, Brown), is the first major accounting of the millennial generation written by someone who belongs to it. Harris is twenty-eight—the book’s cover announces his birth year next to a sardonic illustration of elementary-school stickers—and he has already rounded the bases of young, literary, leftist media: he is a writer and editor for the online magazine the New Inquiry; he has written for Jacobin and n+1. He got his first taste of notoriety during Occupy Wall Street: shortly after activists settled in at Zuccotti Park, he wrote a blog post for Jacobin in which he claimed to have “heard unconfirmed reports that Radiohead is planning a concert at the occupation this week.” He set up an e-mail account using the name of the band’s manager and wrote to Occupy organizers, conveying the band’s interest in performing. Later, in a piece for Gawker titled “I’m the Jerk Who Pranked Occupy Wall Street,” he explained that his goal was to get more people to the protest, and expressed disdain for the way the organizers responded. (Fooled by his e-mail, they held a press conference and confirmed the band’s plan to appear.)

Harris’s anatomizing of his peers begins with the star stickers that, along with grade-school participation trophies, so fascinate Sasse, Twenge, and other writers of generational trend pieces. “You suck, you still get a trophy” is how Twenge puts it, describing contemporary K through five as an endless awards ceremony. Harris, on the other hand, regards elementary school as a capitalist boot camp, in which children perform unpaid labor, learn the importance of year-over-year growth through standardized testing, and get accustomed to constant, quantified, increasingly efficient work. The two descriptions are not as far apart as one might think: assuring kids that they’re super special—and telling them, as Sasse does, that they have a duty to improve themselves through constant enrichment—is a good way to get them to cleave to a culture of around-the-clock labor. And conditioning them to seek rewards in the form of positive feedback—stars and trophies, hearts and likes—is a great way to get them used to performing that labor for free.

My memories of childhood—in a suburban neighborhood in west Houston that felt newly hatched, as open as farmland—are different, breezy and hot and sunlit. I attended, mostly on scholarship, a Southern Baptist school attached to one of the largest megachurches in America, and elementary school seemed like the natural price of admission for friends, birthday parties, and long summers full of shrieking, unsupervised play. (The very young aren’t much for picking up on indoctrination techniques; the religious agitprop felt natural enough, too.) But some kind of training did kick in around the time I entered high school, when I began spending fourteen-hour days on campus with the understanding that I needed to earn a scholarship to a good college. College, of course, is where the millennial lounges around on lush green quads, spends someone else’s money, insists on “safe spaces,” protests her school’s heteronormative core curriculum, and wages war on her professors if she receives a grade below an A. I did the first two of those things, thanks to the Jefferson Scholars Foundation at the University of Virginia. I also took six classes a semester, worked part time, and crammed my schedule with clubs and committees—in between naps on the quad and beers with friends on my porch couch and long meditative sessions figuring out what kind of a person I was going to be.

Most undergraduates don’t have such a luxurious and debt-free experience. The majority of American college students never live on campus; around a third go to community college. The type of millennial that much of the media flocks to—white, rich, thoughtlessly entitled—is largely unrepresentative of what is, in fact, a diverse and often downwardly mobile group. (Millennials are the first generation to have just a fifty-fifty chance of being financially better off than their parents.) Many millennials grew up poor, went to crummy schools, and have been shuttled toward for-profit colleges and minimum-wage jobs, if not the prison system. (For-profit colleges, which disproportionately serve low-income students, account for roughly a tenth of undergraduates, and more than a third of student-loan defaults.) Average student debt has doubled just within this generation, surging from around eighteen thousand dollars at graduation for the class of 2003 to thirty-seven thousand for the class of 2016. (Under the tax plan recently passed by House Republicans, the situation worsens for student borrowers and their families: that bill eliminates the deduction on student-loan interest and voids the income-tax exemption for tuition benefits.)

A young college graduate, having faithfully followed the American path of hard work and achievement, might now find herself in a position akin to a homeowner with negative equity: in possession of an asset that is worth much less than what she owes. In these conditions, the concept of self-interest starts to splinter. For young people, I suspect, the idea of specialness looks like a reward but mostly functions as punishment, bestowing on us the idea that there is no good way of existing other than constantly generating returns. (...)

When Twenge first published “Generation Me,” social media had not yet become ubiquitous. Facebook was limited to colleges and high schools, Twitter hadn’t formally launched, and Instagram didn’t exist. But the millennial narrative was already taking its mature shape, and social media fit into it seamlessly: the narcissism of status updates, the shallow skimming of shiny surfaces, the inability to sit still. One might therefore conclude that the story of generational self-centeredness is so flexible as to have no real definition—it can cover anything, with a little stretching. But there is another possibility: that social media feeds on the same conditions that have made millennials what they are.

“Over the last decade, anxiety has overtaken depression as the most common reason college students seek counseling services,” the Times Magazine noted in October. Anxiety, Harris argues, isn’t just an unfortunate by-product of an era when wages are low and job security is scarce. It’s useful: a constant state of adrenalized agitation can make it hard to stop working and encourage you to think of other aspects of your life—health, leisure, online interaction—as work. Social media provides both an immediate release for that anxiety and a replenishment of it, so that users keep coming back. Many jobs of the sort that allow millennials to make sudden leaps into financial safety—in tech, sports, music, film, “influencing,” and, occasionally, journalism—are identity-based and mercurial, with the biggest payoffs and opportunities going to those who have developed an online following. What’s more, cultivating a “personal brand” has become a matter of prudence as well as ambition: there is a powerful incentive to be publicly likable at a time when strangers routinely rate and review one another over minor transactions—cat-sitting, assembling IKEA furniture, sharing a car ride or a spare bedroom—and people are forced to crowdsource money for their medical bills.

Young people have curled around their economic situation “like vines on a trellis,” as Harris puts it. And, when humans learn to think of themselves as assets competing in an unpredictable and punishing market, then millennials—in all their anxious, twitchy, phone-addicted glory—are exactly what you should expect. The disdain that so many people feel for Harris’s and my generation reflects an unease about the forces of deregulation, globalization, and technological acceleration that are transforming everyone’s lives. (It does not seem coincidental that young people would be criticized for being entitled at a time when people are being stripped of their entitlements.) Millennials, in other words, have adjusted too well to the world they grew up in; their perfect synchronization with economic and cultural disruption has been mistaken for the source of the disruption itself.

by Jia Tolentino, New Yorker |  Read more:
Image: Adrian Tomine

A Landmark Legal Shift Opens Pandora's Box for DIY Guns

Five years ago, 25-year-old radical libertarian Cody Wilson stood on a remote central Texas gun range and pulled the trigger on the world’s first fully 3-D-printed gun. When, to his relief, his plastic invention fired a .380-caliber bullet into a berm of dirt without jamming or exploding in his hands, he drove back to Austin and uploaded the blueprints for the pistol to his website, Defcad.com.

He'd launched the site months earlier along with an anarchist video manifesto, declaring that gun control would never be the same in an era when anyone can download and print their own firearm with a few clicks. In the days after that first test-firing, his gun was downloaded more than 100,000 times. Wilson made the decision to go all in on the project, dropping out of law school at the University of Texas, as if to confirm his belief that technology supersedes law.

The law caught up. Less than a week later, Wilson received a letter from the US State Department demanding that he take down his printable-gun blueprints or face prosecution for violating federal export controls. Under an obscure set of US regulations known as the International Traffic in Arms Regulations (ITAR), Wilson was accused of exporting weapons without a license, just as if he'd shipped his plastic gun to Mexico rather than put a digital version of it on the internet. He took Defcad.com offline, but his lawyer warned him that he still potentially faced millions of dollars in fines and years in prison simply for having made the file available to overseas downloaders for a few days. "I thought my life was over," Wilson says.

Instead, Wilson has spent the last five years on an unlikely project for an anarchist: Not simply defying or skirting the law but taking it to court and changing it. In doing so, he has now not only defeated a legal threat to his own highly controversial gunsmithing project. He may have also unlocked a new era of digital DIY gunmaking that further undermines gun control across the United States and the world—another step toward Wilson's imagined future where anyone can make a deadly weapon at home with no government oversight.

Two months ago, the Department of Justice quietly offered Wilson a settlement to end a lawsuit he and a group of co-plaintiffs have pursued since 2015 against the United States government. Wilson and his team of lawyers focused their legal argument on a free speech claim: They pointed out that by forbidding Wilson from posting his 3-D-printable data, the State Department was violating not only his right to bear arms but also his right to freely share information. By blurring the line between a gun and a digital file, Wilson had also successfully blurred the lines between the Second Amendment and the First.

"If code is speech, the constitutional contradictions are evident," Wilson explained to WIRED when he first launched the lawsuit in 2015. "So what if this code is a gun?”

The Department of Justice's surprising settlement, confirmed in court documents earlier this month, essentially surrenders to that argument. It promises to change the export control rules surrounding any firearm below .50 caliber—with a few exceptions like fully automatic weapons and rare gun designs that use caseless ammunition—and move their regulation to the Commerce Department, which won't try to police technical data about the guns posted on the public internet. In the meantime, it gives Wilson a unique license to publish data about those weapons anywhere he chooses.

"I consider it a truly grand thing," Wilson says. "It will be an irrevocable part of political life that guns are downloadable, and we helped to do that."

Now Wilson is making up for lost time. Later this month, he and the nonprofit he founded, Defense Distributed, are relaunching their website Defcad.com as a repository of firearm blueprints they've been privately creating and collecting, from the original one-shot 3-D-printable pistol he fired in 2013 to AR-15 frames and more exotic DIY semi-automatic weapons. The relaunched site will be open to user contributions, too; Wilson hopes it will soon serve as a searchable, user-generated database of practically any firearm imaginable.

All of that will be available to anyone anywhere in the world with an uncensored internet connection, to download, alter, remix, and fabricate into lethal weapons with tools like 3-D printers and computer-controlled milling machines. “We’re doing the encyclopedic work of collecting this data and putting it into the commons,” Wilson says. “What’s about to happen is a Cambrian explosion of the digital content related to firearms.” He intends that database, and the inexorable evolution of homemade weapons it helps make possible, to serve as a kind of bulwark against all future gun control, demonstrating its futility by making access to weapons as ubiquitous as the internet.

Of course, that mission seemed more relevant when Wilson first began dreaming it up, before a political party with no will to rein in America’s gun death epidemic held control of Congress, the White House, and likely soon the Supreme Court. But Wilson still sees Defcad as an answer to the resurgent gun control movement that has emerged in the wake of the Parkland, Florida, high school shooting that left 17 people dead in February.

The potential for his new site, if it functions as Wilson hopes, would also go well beyond even the average Trump supporter’s taste in gun rights. The culture of homemade, unregulated guns it fosters could make firearms available to even those people who practically every American agrees shouldn’t possess them: felons, minors, and the mentally ill. The result could be more cases like that of John Zawahri, an emotionally disturbed 23-year-old who went on a shooting spree in Santa Monica, California, with a homemade AR-15 in 2013, killing five people, or Kevin Neal, a Northern California man who killed five people with AR-15-style rifles—some of which were homemade—last November.

"This should alarm everyone," says Po Murray, chairwoman of Newtown Action Alliance, a Connecticut-focused gun control group created in the wake of the mass shooting at Sandy Hook Elementary School in 2013. "We’re passing laws in Connecticut and other states to make sure these weapons of war aren’t getting into the hands of dangerous people. They’re working in the opposite direction."

When reporters and critics have repeatedly pointed out those potential consequences of Wilson's work over the last five years, he has argued that he’s not seeking to arm criminals or the insane or to cause the deaths of innocents. But nor is he moved enough by those possibilities to give up what he hopes could be, in a new era of digital fabrication, the winning move in the battle over access to guns.

With his new legal victory and the Pandora's box of DIY weapons it opens, Wilson says he's finally fulfilling that mission. “All this Parkland stuff, the students, all these dreams of ‘common sense gun reforms’? No. The internet will serve guns, the gun is downloadable,” Wilson says now. “No amount of petitions or die-ins or anything else can change that.”

Defense Distributed operates out of an unadorned building in a north Austin industrial park, behind two black-mirrored doors marked only with the circled letters "DD" scrawled by someone's finger in the dust. In the machine shop inside, amid piles of aluminum shavings, a linebacker-sized, friendly engineer named Jeff Winkleman is walking me through the painstaking process of turning a gun into a collection of numbers.

Winkleman has placed the lower receiver of an AR-15, the component that serves as the core frame of the rifle, on a granite table that's been calibrated to be perfectly flat to one ten-thousandth of an inch. Then he places a Mitutoyo height gauge—a thin metal probe that slides up and down on a tall metal stand and measures vertical distances—next to it, poking one edge of the frame with its probe to get a baseline reading of its position. "This is where we get down to the nitty gritty," Winkleman says. "Or, as we call it, the gnat's ass."

Winkleman then slowly rotates the gauge's rotary handle to move its probe down to the edge of a tiny hole on the side of the gun's frame. After a couple of careful taps, the tool's display reads 0.4775 inches. He has just measured a single line—one of the countless dimensions that define the shape of any of the dozens of components of an AR-15—with four decimal places of accuracy. Winkleman's job at Defense Distributed now is to repeat that process again and again, integrating that number, along with every measurement of every nook, cranny, surface, hole, lip, and ridge of a rifle, into a CAD model he's assembling on a computer behind him, and then to repeat that obsessively comprehensive model-building for as many guns as possible.

That a digital fabrication company has opted for this absurdly manual process might seem counterintuitive. But Winkleman insists that the analog measurements, while infinitely slower than modern tools like laser scanners, produce a far more accurate model—a kind of gold master for any future replications or alterations of that weapon. "We're trying to set a precedent here," Winkleman says. "When we say something is true, you absolutely know it's true."

One room over, Wilson shows me the most impressive new toy in the group's digitization toolkit, one that arrived just three days earlier: A room-sized analog artifact known as an optical comparator. The device, which he bought used for $32,000, resembles a kind of massive cartoon X-ray scanner.

Wilson places the body of an AR-9 rifle on a pedestal on the right side of the machine. Two mercury lamps project neon green beams of light onto the frame from either side. A lens behind it bends that light within the machine and then projects it onto a 30-inch screen at up to 100X magnification. From that screen's mercury glow, the operator can map out points to calculate the gun's geometry with microscopic fidelity. Wilson flips through higher magnification lenses, then focuses on a series of tiny ridges of the frame until the remnants of their machining look like the brush strokes of Chinese calligraphy. "Zoom in, zoom in, enhance," Wilson jokes.

Turning physical guns into digital files, instead of vice versa, is a new trick for Defense Distributed. While Wilson's organization first gained notoriety for its invention of the first 3-D printable gun, what it called the Liberator, it has since largely moved past 3-D printing. Most of the company's operations are now focused on its core business: making and selling a consumer-grade computer-controlled milling machine known as the Ghost Gunner, designed to allow its owner to carve gun parts out of far more durable aluminum. In the largest room of Defense Distributed's headquarters, half a dozen millennial staffers with beards and close-cropped hair—all resembling Cody Wilson, in other words—are busy building those mills in an assembly line, each machine capable of skirting all federal gun control to churn out untraceable metal Glocks and semiautomatic rifles en masse.

For now, those mills produce frames for only a few different firearms, including the AR-15 and 1911 handguns. But Defense Distributed’s engineers imagine a future where their milling machine and other digital fabrication tools—such as consumer-grade aluminum-sintering 3-D printers that can print objects in metal—can make practically any digital gun component materialize in someone's garage.

by Andy Greenberg, Wired |  Read more:
Images: Olman Hernandez and Michelle Groskopf

Tuesday, July 10, 2018


Suzanne Saroff

Walmart Nation: Mapping the Largest Employers in the U.S.



In an era where Amazon steals most of the headlines, it’s easy to forget about brick-and-mortar retailers like Walmart.

But, even though the market values the Bezos e-commerce juggernaut at about twice the sum of Walmart, the blue big-box store is very formidable in other ways. For example, revenue and earnings are two areas where Walmart still reigns supreme, and the stock just hit all-time highs yesterday on an earnings beat.

That’s not all, though. As today’s map shows, Walmart is dominant in one other notable way: the company is the biggest private employer in America in a whopping 22 states.

Seriously, Juice Is Not Healthy

Obesity affects 40 percent of adults and 19 percent of children in the United States and accounts for more than $168 billion in health care spending each year. Sugary beverages are thought to be one of the major drivers of the obesity epidemic. These drinks (think soda and sports drinks) are the largest single source of added sugars for Americans and contribute, on average, 145 added calories a day to our diets. For these reasons, reducing sugary beverage consumption has been a significant focus of public health intervention. Most efforts have focused on sodas.

But not juice. Juice, for some reason, gets a pass. It’s not clear why.

Americans drink a lot of juice. The average adult drinks 6.6 gallons per year. More than half of preschool-age children (ages 2 to 5) drink juice regularly, a proportion that, unlike for sodas, has not budged in recent decades. These children consume on average 10 ounces per day, more than twice the amount recommended by the American Academy of Pediatrics.

Parents tend to associate juice with healthfulness, are unaware of its relationship to weight gain and are reluctant to restrict it in their child’s diet. After all, 100 percent fruit juice — sold in handy individual servings — has been marketed as a natural source of vitamins and calcium. Department of Agriculture guidelines state that up to half of fruit servings can be provided in the form of 100 percent juice and recommend drinking fortified orange juice for the vitamin D. Some brands of juice are even marketed to infants.

Government programs designed to provide healthy food for children, such as the Special Supplemental Nutrition Program for Women, Infants, and Children, offer juice for kids. Researchers have found that children in the program are more likely to exceed the recommended daily fruit juice limit than those who are similarly poor but not enrolled.

Despite all the marketing and government support, fruit juices contain limited nutrients and tons of sugar. In fact, one 12-ounce glass of orange juice contains 10 teaspoons of sugar, which is roughly what’s in a can of Coke. (...)

It’s tempting to minimize the negative contributions of juice to our diets because it’s “natural” or because it contains “vitamins.” Studies that support this view exist, but many are biased and have been questioned.

And we doubt you’d take a multivitamin if it contained 10 teaspoons of sugar.

There is no evidence that juice improves health. It should be treated like other sugary beverages, which are fine to have periodically if you want them, but not because you need them. Parents should instead serve water and focus on trying to increase children’s intake of whole fruit. Juice should no longer be served regularly in day care centers and schools. Public health efforts should challenge government guidelines that equate fruit juice with whole fruit, because these guidelines most likely fuel the false perception that drinking fruit juice is good for health.

by Erika R. Cheng, Lauren G. Fiechtner and Aaron E. Carroll, NY Times | Read more:
Image: Nicolas Ortega

Monday, July 9, 2018

Amazon Is Already Undercutting Prices on Over-the-Counter Pills

As pharmacy chains await Amazon.com Inc.’s entry into the prescription-drug market, the online retail giant is already undercutting them for non-prescription medicine for aches, colds and allergies.

Median prices for over-the-counter, private-brand medicine sold by Walgreens Boots Alliance Inc. and CVS Health Corp. were about 20 percent higher than Basic Care, the over-the-counter drug line sold exclusively by Amazon, according to a report Friday by Jefferies Group analysts.

Last week, Amazon announced that it was buying PillPack, a pharmacy company that will give it an entry point into the U.S.’s $328.6 billion market for prescription drugs. Shares of CVS and Walgreens plunged on the news, as investors bet Amazon could lure pharmacy customers with lower prices, and give them one less reason to go to the corner drugstore.

Amazon began selling the Basic Care line in August with roughly 35 products and has since expanded its range to 65 drugs, according to the Jefferies analysts. The products include mild painkillers, cold and flu medication, sleeping aids and other medication commonly found in the pharmacy aisle.

Cheaper Drugs

At a midtown Manhattan Duane Reade, part of the Walgreens chain, a store-brand pack of 500 acetaminophen pills costs $18.99. Amazon is selling the same count and strength product for $7.40. Two allergy medications, cetirizine (also known as Zyrtec) and loratadine (sold under the Claritin brand), cost about three-quarters less on Amazon than the drugstore chains’ house brands did in the store.

According to the Jefferies report, 84 percent of Walgreens’ and 72 percent of CVS’s house-brand drugs were more expensive than the Basic Care line.

In-house brands are a way for retailers to sell over-the-counter products that can compete with manufacturers’ brand offerings, such as Tylenol or Advil. Amazon’s Basic Care brand is made by Perrigo Co., which also makes in-house brands for other retailers.

Many, but not all, of Amazon’s over-the-counter drugs are available through the retailer’s Prime service, which offers free shipping and fast delivery.

by Aziza Kasumov, Bloomberg | Read more:
Image: Getty

Teddybears

The Cognition Crisis

Our lives on this planet have improved in so many amazing ways over the last century. On average, we are now healthier, more affluent and literate, less violent and longer living. Despite these unprecedented positive changes, clear signs exist that we are in the midst of an emerging crisis — one that has not yet been recognized in its full breadth, even though it lurks just beneath the surface of our casual conversations and swims in the undercurrents of our news feeds. This is not the well-known crisis that we’ve induced upon the earth’s climate, but one that is just as threatening to our future. This is a crisis of our minds. A cognition crisis.

A cognition crisis is not defined by a lack of information, knowledge or skills. We have done a fine job in accumulating those and passing them along across millennia. Rather, this is a crisis at the core of what makes us human: the dynamic interplay between our brain and our environment — the ever-present cycle between how we perceive our surroundings, integrate this information, and act upon it.

This ancient perception-action cycle ensured our earliest survival by allowing our primordial predecessors to seek nutrients and avoid toxins. It is from these humble beginnings that the human brain evolved to pursue more diverse resources and elude more inventive threats. It is from here that human cognition emerged to support our success in an increasingly complex and competitive environment: attention, memory, perception, creativity, imagination, reasoning, decision making, emotion and aggression regulation, empathy, compassion, and wisdom. And it is here that our crisis exists.

Today, hundreds of millions of people around the world seek medical assistance for serious impairments in their cognition: major depressive disorder, anxiety, schizophrenia, autism, post-traumatic stress disorder, dyslexia, obsessive-compulsive disorder, bipolar disorder, attention deficit hyperactivity disorder (ADHD), addiction, dementia, and more. In the United States alone, depression affects 16.2 million adults, anxiety 18.7 million, and dementia 5.7 million — a number that is expected to nearly triple in the coming decades.

The immense personal, societal and economic impact of cognitive dysfunction warrants heightened consideration because the crisis is growing, not receding. Despite substantial investment in research and treatments by governments, foundations, and companies around the world, the prevalence and impact of these conditions are escalating. Between 2005 and 2015, the number of people worldwide with depression and anxiety increased by 18.4% and 14.9% respectively, while individuals with dementia exhibited a 93% increase over those same years.

To some degree, these trends reflect the overall growth and aging of the world’s population. This will only continue to increase in the future: the global population of seniors is predicted to swell to 1.5 billion by 2050. Although there are clear benefits to living longer, an unfortunate negative consequence is the burden it places on many aspects of cognition.

There are signs something else is going on, too. Over the last several decades, worrying tears have appeared in the cognitive fabric of our youth, notably in terms of emotional regulation and attentional deployment. American teens have experienced a 33% increase in depressive symptoms, with 31% more having died by suicide in 2015 than in 2010. ADHD diagnoses have also increased dramatically. While a growing awareness of these conditions — and with it, more frequent diagnoses — are likely factors, it does not seem this is the whole story; the magnitude of this escalation points to a deeper problem. (...)

Neuroscientists and leadership in the medical world now appreciate that much more unites seemingly disparate aspects of cognition than divide them. For example, attention deficits are now recognized to be a prominent feature of major depressive disorder, and are included in the most recent diagnostic criteria — the bible used by mental health experts — as a “diminished ability to concentrate.” The reality is that each of us has one mind, and embracing this will foster our ability to nurture it.

There is also, as I’ve said, a common, underlying aggravator that has exerted an impact across all domains of cognition: the dramatic plunge we’ve taken into the information age on the back of the digital revolution. Every way we interact with our environment, as well as with each other and ourselves, has been radically transformed by technology.

The old environment, where our cognition evolved, is long gone. The new environment, where multidimensional information flows like water (from a firehose!), challenges our brain and behavior at a fundamental level.

This has been shown in the laboratory, where scientists have documented the influence of information overload on attention, perception, memory, decision making, and emotional regulation. And it has also been shown in the real world, where we see strong associations between the use of technology and rising rates of depression, anxiety, suicide, and attention deficits, especially in children.

Although the exact mechanism is still under exploration, a complex story is emerging. We are seeing accelerating reward cycles associated with intolerance to delayed gratification and sustained attention; excessive information exposure connected with stress, depression, and anxiety (e.g., fear of missing out and being non-productive); and, of course, multitasking has been linked to safety issues (such as texting while driving) and a lack of focus (which impacts our relationships, our studies, and our work).

What’s more, our constant engagement with technology interferes with the pursuit of other behaviors critical for maintaining a healthy mind, such as nature exposure, physical movement, face-to-face contact, and restorative sleep. Its negative influence on empathy, compassion, cooperation, and social bonding is just beginning to be understood.

by Adam Gazzaley MD, PhD, Medium |  Read more:
Image: Maria Medem
[ed. It ain't just technology. Economic insecurity and inequality, corporate rapaciousness (in all its various forms), parasitic "healthcare" profiteering, environmental degradation, dysfunctional politics, militarized policing, constant bombardment by consumer marketing industries (see also: Speech Defects), pervasive surveillance, endless wars and more. If you don't have some form of cognitive impairment you're probably nuts.]

Could Seoul Be the Next Great Cyberpunk City?

Since the original Blade Runner takes place in an imagined late-2010s Los Angeles, I’d have gotten a kick out of seeing its sequel, which after prolonged speculation finally came out late last year, in the actual late-2010s Los Angeles. But having moved to Korea a few years ago, I settled for a screening here in Seoul. In some ways, this ultimately felt like the more appropriate city in which to see the movie: when Blade Runner 2049’s first trailer came out, I wrote here about its apparent acknowledgement of the considerable Korean influence felt in Los Angeles since its predecessor’s release. While no small number of Koreans already lived there back in 1982, the makers of Blade Runner — like everyone else at the time — couldn’t see past the economic rise of Japan, whose cash-flooded conglomerates then seemed poised to buy up not just Hollywood’s studios but the downtown skyline as well.

When I did make it back to Los Angeles earlier this year, I saw sights that proved more memorable than even the spectacles of Blade Runner 2049. Coming in from the airport, for instance, I looked up to see the Korean Air logo looming 73 stories above downtown at the top of the Wilshire Grand Center, a building still under construction when last I saw it. Then, lowering my sights from that glowing orb so reminiscent of the South Korean flag, I spotted a tent village that had sprouted in the darkness of a freeway underpass. The first Blade Runner envisioned Los Angeles as having plunged into a kind of third-world condition, with its ruling class perched high above (if not on a different planet from) the teeming common element doing business in countless different languages down in the streets. Something tells me that the contrast in the real 2019 might look even starker than that.

But then contrast lies at the heart of the science-fiction tradition of cyberpunk, the most influential examples of which include Blade Runner as well as William Gibson’s Neuromancer, published in 1984 and now considered the archetypical cyberpunk novel. The common description of Gibson’s work of that period, “high tech meets low life,” also broadly characterizes cyberpunk itself, which, unlike so much sci-fi of earlier generations before, understands that technological progress doesn’t come with moral progress. Nor does it come with the kind of widespread social or economic progress upon which many stories of the future once premised themselves. Nor does that high tech penetrate all areas equally: “The future is already here,” said Gibson in what has turned out to be one of his most-quoted lines. “It’s just not evenly distributed.”

A visitor to Seoul, even if they’ve come from the supposedly developed West, might feel as if they’ve entered one of those unevenly distributed chunks of the future. The obvious elements are all in place: the shiny skyscrapers and the colossal video screens on their sides; the punctual, usually unsoiled subway trains and the riders streaming high-definition video or playing startlingly advanced-looking games on their phones. But one doesn’t have to stay long to be impressed less by the technology itself than by how thoroughly it has integrated with the life of nearly all Seoulites. Here, in a place where even grandmothers stand in line for the latest smartphone, the fact that some middle-aged Americans have never bothered to get one at all looks like an example of the reverse luxury possible only in a terminally decadent culture.

Cyberpunk’s list of required conditions includes not just technology, but all-pervasive technology. Often, characters must wield their personal technology to evade the impersonal technology commanded by their corporate overlords. Does anyone make an attempt to evade the surveillance going on in seemingly every corner of Seoul, a saturation that surprises even visitors familiar with the all-seeing CCTV cameras of London? Enthusiasts of cyberpunk, attuned to its essentially dystopian nature, will also quickly take note of how Korea seems to have taken the genre’s convention of a few mega-corporations running the show as its economic model. True Korean corporate loyalists can buy just about everything — food, entertainment, healthcare, car, home, and much else besides — from the same conglomerate, or chaebol.

I actually saw Blade Runner 2049 at one of a chain of chaebol-owned movie theaters myself. At that same multiplex, conveniently located right across the street from my apartment building, I’d previously seen Oshii Mamoru’s acclaimed animated cyberpunk film Ghost in the Shell. The original Blade Runner had taken Los Angeles and Japanified it, bringing in as much the boisterous, freewheeling urban Japan of centuries past as the outwardly straight-laced technopolis of 1980s Tokyo. Ghost in the Shell cast as its near-future Japanese setting of “New Port City” an apparently little-altered Hong Kong; the tour-de-force montage in the middle of the movie constitutes a master class in not just how to make a place come realistically alive in animation, but to unify setting, theme, form, and substance at a stroke. (...)

As they and other similarly inclined foreign photographers know, cyberpunk does not live by skyscrapers and outdoor video screens alone. If it did, any of the new metropolises erected whole across China over the past few weeks would offer a superior setting. While Seoul does have one foot in the future, from the Western perspective, the other foot kept firmly in the past makes it a potentially great cyberpunk city. Or to use a different metaphor, one I’ve come across from time to time in Korean books about Korean society, this country runs simultaneously on two “clocks,” one of them pushed so aggressively to the present that it runs perpetually fast, and another that has barely moved for decades or even centuries. Korea, like cyberpunk itself, everywhere mixes the futuristic with the things and ways of the past.

by Colin Marshall, LARB | Read more:
Image: uncredited

Sunday, July 8, 2018

Black Dub


[ed. Feat. Trixie Whitley. See also: Wild Country (with her dad Chris).]

Low Bar


Garry Trudeau, Doonesbury
via:

Pentagon Audit: “There Will Be Unpleasant Surprises”

For the first time in its history, the Department of Defense is now undergoing a financial audit.

The audit, announced last December, is itself a major undertaking that is expected to cost $367 million and to involve some 1200 auditors. The results are to be reported in November 2018.

“Until this year, DoD was the only large federal agency not under full financial statement audit,” Pentagon chief financial officer David L. Norquist told the Senate Budget Committee in March. Considering the size of the Pentagon, the project is “likely to be the largest audit ever undertaken,” he said.

The purpose of such an audit is to validate the agency’s financial statements, to detect error or fraud, to facilitate oversight, and to identify problem areas. Expectations regarding the outcome are moderate.

“DOD is not generally expected to receive an unqualified opinion [i.e. an opinion that affirms the accuracy of DoD financial statements] on its first-ever, agency-wide audit in FY2018,” the Congressional Research Service said in a new report last week. See Defense Primer: Understanding the Process for Auditing the Department of Defense, CRS In Focus, June 26, 2018.

In fact, “It took the Department of Homeland Security, a relatively new and much smaller enterprise, about ten years to get to its first clean opinion,” Mr. Norquist noted at the March Senate hearing.

In the case of the DoD audit, “I anticipate the audit process will uncover many places where our controls or processes are broken. There will be unpleasant surprises. Some of these problems may also prove frustratingly difficult to fix.”

by Steven Aftergood, Federation of American Scientists | Read more:
Image: via
[ed. See also: How $21 Trillion in U.S. Tax Money Disappeared]

A Crime and a Pastime

The paranoid style of American skateboarding

People came to California for gold or agriculture or loansharking or, later, to get famous, and by late 1962 it was the most populous state in the country. And it was there, around that time, on the blacktop schoolyards of coastal suburbs, that a sport and pastime uniquely suited to the American ethos blossomed: skateboarding. The frontier sought by so many was gone, it’s true, but the frontierist’s mindset remained. What business of yours is it if my friends and I want to grind on the painted curb behind the grocery store? Leave us alone. The curb became, for the skater, a fancifully deregulated zone imbued with limitless possibilities—and therefore a kind of freedom, so long as he could be left alone in his pursuits.

Which is to say that skaters have always taken a perverse pride in being outsiders and misfits, bonding over stories of jocks who bullied them and sedans that drove by yelling “KICKFLIP!” It’s a sui generis sport typically without spectators, time limits, written rules, or even competitors—an activity so smitten with its own exceptionalism that, even today, at the height of its popularity, many skateboarders deny the “sport” label entirely. As professional skater Braydon Szafranski recently told Rolling Stone—in reference to its introduction into the 2020 Olympic Games—“Skateboarding is a crime, not a sport.” (...)

However absurd it may sound, skateboarding’s first years were clearly bound up with America’s burgeoning paranoiac libertarianism. It may have been that the blithe surfiness of early skateboarding masked its suspicion of outsiders—of anyone, that is, who doesn’t skate—as well as its predilection for clique formation against a sometimes (though not always) invisible regulatory bogeyman. Take, for example, John Severson’s editorial from the first issue of The Quarterly Skateboarder, which jumbles cheeriness with frontier psychology and an oddly preemptive suspicion of “opponents”:
Today’s skateboarders are founders in this sport—they’re pioneers—they are the first. There is no history in Skateboarding—its being made now—by you. The sport is being molded and we believe that doing the right thing now will lead to a bright future for the sport. Already there are storm clouds on the horizon with opponents of the sport talking about ban and restriction.
Whatever his good intentions (which he had in spades), Severson’s “storm cloud” prophecy against regulation haunts a sport that today proves unable to provide healthcare for its practitioners, even as it courts global reach, massive corporate sponsorship, and elite brand status. And despite Severson’s excitable projection for the sport’s “bright future,” it should be admitted that skating’s do-it-alone pioneerism has given way to a solipsistic, even paranoid culture that is dispiriting in its susceptibility to conspiratorial thinking—just consider revered skate brand Alien Workshop’s deck collaboration with Infowars.

Perhaps skating’s libertarian streak means that it can be curiously and sadly hostile to social life. Even a cursory glance at the skate video canon reveals encounters between skaters and security guards, cops, and civilians who want them off their property. These clips can bear a close similarity to YouTube videos of sovereign citizens telling police they don’t recognize their authority. As filmer and crew look on, a skater will alternately beg and scream at the person to let them just try the trick one more time. Often the angry authority figure will grab the skater’s board, at which point an all-out scramble occurs. Arguing with cops is one thing, but it’s a sad affair watching teenage skateboarders brawl with aging apartment superintendents.

The Lesson of the Master

Worryingly, these young skaters have a new champion in Jordan Peterson, the Canadian academic whose 2018 book 12 Rules for Life became a surprise best seller on the back of his popular YouTube channel. Plenty has been written about his noxious blend of misogyny, neo-fascism, and self-help, but I was struck to learn that Peterson’s eleventh commandment is “Do Not Bother Children When They Are Skateboarding.” Peterson is a pseudo-intellectual reactionary, but he basically gets skateboarding right. (He even wields the term “boardslide” correctly, which is impressive in a climate where most mainstream sources tend to consider Tony Hawk’s 900 as the alpha and omega of skateboarding achievement.) The chapter begins with Peterson talking about how, while he was working at the University of Toronto, he would sometimes watch young skateboarders hurl themselves down a set of stairs:
Some might call that stupid. Maybe it was. But it was brave, too. I thought those kids were amazing. I thought they deserved a pat on the back and some honest admiration. Of course it was dangerous. Danger was the point. They wanted to triumph over danger. They would have been safer in protective equipment, but that would have ruined it. They weren’t trying to be safe. They were trying to become competent—and it’s competence that makes people as safe as they can truly be.
He goes on to bemoan the installation of skatestoppers, treacherous blocks mounted on rails and ledges for reasons made evident in the name:
The skatestoppers are unattractive. The surround of the nearby sculpture would have to have been badly damaged by diligent boardsliders before it would look as mean as it does now, studded with metal like a pit bull’s collar.
And later:
Beneath the production of rules stopping the skateboarders from doing highly skilled, courageous and dangerous things I see the operation of an insidious and profoundly anti-human spirit.
You cannot create a perfectly safe world, Peterson argues, and efforts to do so are actively harmful. Kids need to familiarize themselves with risk. “We feel invigorated and excited when we work to optimize our future performance, while playing in the present,” he writes. “Otherwise we lumber around, sloth-like, unconscious, unformed and careless. Overprotected, we will fail when something dangerous, unexpected and full of opportunity suddenly makes its appearance, as it inevitably will.”

Peterson is pandering to a general audience; he’s especially popular among millennials, a demographic young enough to have experienced the suffocating effects of helicopter parenting and largely too poor now to have children themselves—thus his readers have not yet had to resist the (presumably overwhelming) pull to become helicopter parents. But even though he doesn’t care about skateboarding beyond using it to publicly denounce coddled children, Peterson has nonetheless stumbled over skateboarding’s bulging libertarian roots. (...)

A Faustian Economy

The caliber of tricks that can be done in quick succession and on demand pales in comparison to what can be achieved over hours and dozens of attempts in an empty parking lot, and so contests are looked at askance in the world of skateboarding—they’re seen as minor divertissements of corporatism that foster boring skating. And while events like Street League and the X Games are popular with younger fans, there has never been an appetite for a widely recognized central governing body in the manner of the NFL or MLB. When skateboarding makes its Olympic debut, it will be represented by the International Roller Sports Federation—the group responsible for rollerblading, skateboarding’s traditional punching bag—with the more credible International Skateboarding Federation playing merely an advisory role.

With paying contests relegated to minor status, and without a functioning umbrella organization, skateboarding is best understood as a full-time freelance economy funded through endorsement deals. Pro skaters are contract employees, paid to be jumping-and-grinding advertisements for half a dozen or so sponsors—makers of boards, shoes, wheels, trucks, clothing, and energy drinks. (In recent years, too, maintaining a social media presence has become another of companies’ demands.) After spending countless man-hours indentured to such companies, the gamest pros find themselves able to branch out and pursue the skater’s dream: they start their own skate brands and sign younger skaters, at which point the cycle spins forward.

Yet as skateboarding has grown in popularity over the last two decades, large corporations have more and more sought their cut. In the same way that libertarians decry the government while thirsting for its handouts, the skate industry has begun to depend on the bankrolls of the very (enormous) “carpetbaggers” it once disparaged: Vans, Adidas, and, especially, Nike. Though Vans has long been associated with skating, Nike and Adidas were mocked when they first tried to enter the market in the nineties. But when streetwear boomed in the 2000s, alongside a decline in board sales, the big two wormed their way into the market, offering top-tier pros contracts they couldn’t refuse. Now that these cherished pros ride for Nike, even riders on other teams hesitate to criticize the brand.

Meanwhile, this capricious, industry-wide shift presages darker days of the too-big-to-fail shade. Skateboarding is now ascendant, but what if these companies find that post-Olympics profits aren’t what they expected? A decision by Nike or Adidas to leave the market could be devastating; not only could the top skaters find themselves without their corporate bargains, but the cash-and-capital drought could blight out board and clothing companies, many of which are owned by the very skaters sponsored by Nike and Adidas.

The house-of-cards structure of the industry, glued together as it is by the mercurial fealty of corporate sponsorship, is masked by skateboarding’s libertarian delusion that it functions as a meritocracy. There are only a handful of famous staircases, after all, down which skateboarders can leap into prominence, provided they land a sufficiently difficult trick. But the meritocracy falls apart as soon as you realize that there is no agreed upon rubric for merit, and no worn path for the would-be sponsored skater. What’s more, there are multiple levels of sponsored skateboarders, which crisscross in both hierarchical and nonhierarchical ways: flow riders, who get products for free; amateurs, who ride for teams in exchange for product and (sometimes) stipends or travel costs; and pros, who can make serious cash through shoe deals, adjacent endorsements, and contests. To further blur the distinction, both amateurs and pros appear in ads and brand videos. And, anyway, there is no set formula for becoming a professional skater; board companies turn amateurs into professionals by way of a black box determination that factors a mix of popularity, marketability, age, and time spent as an amateur. As it turns out, amateurs are often as talented as (and usually way more productive than) professionals. If they don’t get injured, if they can slog it out for a few years, amateurs might be lucky enough to earn a pro slot. It’s a system almost comparable to academia, with its adjunct and tenured professors, if, well, more disorganized and libertarian.

by Hanson O’Haver, The Baffler |  Read more:
Image: Saiman Chow

Saturday, July 7, 2018

The Best Antivirus Is Not Traditional Antivirus

We set out to do a standard Wirecutter guide to the best antivirus app, so we spent months researching products, reading reports from independent testing labs and institutions, and consulting experts on safe computing. And after all that, we learned that most people should neither pay for a traditional antivirus suite, such as McAfee, Norton, or Kaspersky, nor use free programs like Avira, Avast, or AVG. The “best antivirus” for most people to buy, it turns out, is not a traditional antivirus package.

Information security experts told us that the built-in Windows Defender is good-enough antivirus for most Windows PC owners, and that both Mac and Windows users should consider using Malwarebytes Premium, an anti-malware program that augments both operating systems’ built-in protections. These options provide reliable protection without slowing your computer significantly, installing unwanted add-ons, or harassing you about upgrades.

Malwarebytes is not an all-in-one option for protecting your system against exploits, malware, and other bad stuff. But information security experts repeatedly recommended it as a useful anti-malware layer, one of multiple layers of security you need for your devices, coupled with good habits. Relying on any one app to protect your system, data, and privacy is a bad bet, especially when almost every security app—including Malwarebytes and Windows Defender—has proven vulnerable on occasion. You should have good virus and malware protection, yes, but you also need secure passwords, two-factor logins, data encryption, and smart privacy tools added to your browser. Check out our guide to setting up those layers here.

Why you should trust us

As writers and editors for Wirecutter, we have combined decades of experience with different computers and mobile devices, and their inherent vulnerabilities. We spent dozens of hours for this guide reading results from independent labs like AV-Test and AV-Comparatives, features at many publications such as Ars Technica and PCMag, and white papers and releases by institutions and groups like Usenix, Google’s Project Zero, and IEEE. We also read up on the viruses, ransomware, spyware, and other malware of recent years to learn what threats try to get onto most people’s computers today.

Then we interviewed experts, including computer-security journalists, experienced security researchers, and the information security team at The New York Times (parent company of Wirecutter), whose responsibilities include (but are not limited to) protecting reporters and bureaus both overseas and here in the US from hacking and surveillance.

These experts helped us reach a more nuanced consensus than the typical table-tennis headlines: antivirus is increasingly useless, actually it’s still pretty handy, antivirus is unnecessary, wait no it isn’t, and so on. Although we often test all the products we’re considering, we can’t test the performance of antivirus suites any better than the experts at independent test labs already do, so we relied on their expertise.

Furthermore, every information security expert we talked to agreed that most people shouldn’t pay for a traditional antivirus suite: The virus and malware protection built into Windows and macOS, combined with good habits, is enough for most people. Malwarebytes is a nonintrusive additional layer, one that may catch things written to work around Windows Defender or the Mac’s inherent defenses. So we tested Malwarebytes on Windows and macOS to learn how easy the app was to use, whether it noticeably slowed performance or interfered with other apps, and whether it had any annoying notifications.

Why we don’t recommend a traditional antivirus suite

It’s insufficient for a security app to just protect against a single set of known “viruses.” There are potentially infinite malware variations that have been crypted—encoded to look like regular, trusted programs—and that deliver their system-breaking goods once opened. Although antivirus firms constantly update their detection systems to outwit crypting services, they’ll never be able to keep up with malware makers intent on getting through.

A quick terminology primer: The word malware just means “bad software” and encompasses anything that runs on your computer with unintended and usually harmful consequences. In contrast, antivirus is an out-of-date term that software makers still use because viruses, Trojan horses, and worms were huge, attention-getting threats in the 1990s and early 2000s. Technically, all viruses are a kind of malware, but not all malware is a virus.

Although each expert we interviewed had their own preferred solutions to the endless stream of computer threats, none recommended buying a traditional antivirus app. So why shouldn’t you install a full antivirus suite from a known brand, just to be on the safe side? Our experts cited many good reasons.

For those reasons, we don’t recommend that most people spend the time or the money to add traditional antivirus software to their personal computer. We didn’t consider newer antivirus products that have not yet been tested by known independent research labs or that aren’t available to individuals.

Two caveats to our recommendations on malware protection:

If you have a laptop provided by your work, school, or another organization, and it has antivirus or other security tools installed, do not uninstall them. Organizations have systemwide security needs and threat models that differ from those of personal computers, and they have to account for varying levels of technical aptitude and safe habits among their staff. Do not make your IT department’s hard job even more difficult.
People with sensitive data to protect (medical, financial, or otherwise), or with browsing habits that take them into riskier parts of the Internet, have unique threats to consider. Our security and habit recommendations are still a good starting point, but such situations may call for more intense measures than we cover here.

by Kevin Purdy, Wirecutter |  Read more:
Image: Kyle Fitzgerald

Distilled Golf: The 3-Club Challenge

If you’re a regular golfer you probably have one club in your bag that you love more than the others. One that’s as reliable as Congress is dysfunctional. For Kevin Costner in Tin Cup it was his trusty seven iron. For someone like Henrik Stenson, probably his three wood. For me, it’s my eight iron. There are certain clubs that, whether through experience, ability, or default, just seem to stand out.

Then there are those that just give us the willies. For example, unlike Henrik I’d put my three wood in that category. I’m convinced no amount of practice will ever make me better with that club. Invariably, I chunk or thin it and rarely hit it straight, but I keep carrying it around because I’m convinced I need it - like when a situation calls for a 165-yard blooper on a 210-yard approach. A friend of mine has problems with his driver. He'll carry it around for weeks or months at a time but never use it, because “it’s just not working.” Little wonder.

If you’ve been golfing for a while you’ve probably indulged in the ‘what if’ question. I’m not talking about the misery stories you hear in a clubhouse after every round - those tear-in-the-beer laments like ‘what if I’d only laid up instead of trying to cut that corner’, or, ‘what if I hadn’t bladed that bunker shot into the lake’? Bad decisions and bad breaks. Conversations like those will go on for as long as golf exists and really aren’t that interesting (except for the person drowning their sorrows).

No, what I’m talking about is a more existential question. One that goes to the heart of every golfer’s game: if you could play with only three clubs, which ones would you choose? And why?

It’s a fun thought experiment because it makes you think about your abilities from a more distilled perspective: how well do I hit my clubs, and what’s the best combination for getting around a course in the lowest possible score?

Maybe you’ve had the chance to compete in a three-club tournament. They’re out there. Once in a while someone puts one together and they sound like a lot of fun. I’ve never played in one myself, but have wondered at times what clubs I'd choose if given the opportunity. Recently, I got to find out, with some surprising results.

Caveat: I’m not here to suggest that there’s one right mix of clubs for everyone, but I will say that it’s possible to shoot par golf (or better) with only three golf clubs.

First, some background. I’m a senior golfer who’s been playing the game for nearly 25 years. High-single to low-double-digit handicap (I’m guessing, since I don’t keep one). Usually shoot in the low to mid-80s with an occasional excursion into the high 70s.

Lately I’ve been playing on a nice nine-hole course that rarely sees more than a dozen golfers at any time, even on the weekends. It’s not an executive course or goat-track by any means. In fact it’s as challenging a course as any muni, if not more so, and definitely in better condition. The greens-keeping staff keep it in excellent shape and share resources with a nearby Nicklaus-designed course. It’s your average really nice nine-hole course, and it would command premium prices if expanded to 18 holes.

Anyway, because there’s hardly anyone around I usually play three balls, mainly for exercise and practice. I’ve always carried my bag, so it’s easy to drive up, unload my stuff, stick three balls in my pocket and take off.

A while back we had some strong winds. Stiff, persistent winds that lasted for days. I don’t mind playing in wind, but these were strong enough that my bag kept falling over when I set it down, and twisting around my body, throwing me off balance when walking up and down fairways. I’m sure I must have looked a bit like a drunk staggering around (not an uncommon sight on some of the courses I’ve played), so I decided to dump the bag and just play with three clubs.

But which ones? Keep in mind that everyone is different, and the clubs I selected are the ones that I thought would work best for me.
***
To begin with, I realized that two of the three spots were already taken. First, I’d need a putter. According to Golf Digest and Game Golf, you need a putter roughly 41 percent of the time on average. I don’t know about you, but I’m not going to try putting with a driver, three wood, or hybrid, no matter how utilitarian they might be. It just feels too weird. Perhaps it’s just personal preference, and if that’s not a big deal to you, go for it.

The next club I selected was something that could get me close from 120 yards out, help around the fringe, and get me out of a bunker. No-brainer: sand wedge. I thought about a lob wedge but it didn’t have the distance, and a gap or pitching wedge was just too tough out of the sand and didn’t have enough loft for short flops to tight pins.

Finally, my last club: a six iron. Why the six? A number of reasons. First, and probably most important: I suck with my six iron. Not as bad as with my three wood, but for some reason the six has always given me problems. Maybe it's because I’ve never been fit for clubs, and it has always stood out as being more difficult than most of the others in my bag. I don’t know why, really. In any case, I thought, “why not get a little more practice and see if I can get this guy under control?” It also has the distance. When I hit it flush, I can get it to go maybe 170 yards. Maybe. So that completed the set, and my new streamlined self was ready for the wind.

Here’s where it gets interesting. Given that most Par 4s are generally in the 350–450-yard range or less (see here and here) and Par 5s generally run about 450–690 yards (see here), it’s not that hard, if you’re hitting a 160–170-yard six iron, to get on the green in two on shorter Par 4s, and on in three on shorter Par 5s. Even on longer holes, if you come up short you’re still close enough that it’s a sand wedge into the green, usually pitching or chipping from 50 yards or less. Then it’s just a putt for par. Plus, the second or third shot is usually from the middle of the fairway, so there’s an excellent chance you’ll put your wedge in a good position. I’ve been pleasantly surprised to find that I can make at least one, sometimes two or three, pars (even the occasional birdie) with just three clubs and three balls. It all depends on the length of the hole and the accuracy of my chipping and putting (and of course the wind). It’s a great way to get better at iron play and, especially, the short game from 100 yards in.

But there’s more, and here’s where it really gets fun. For various reasons, sometimes I’ll find myself somewhere in the 120–160-yard range coming into a green. Too long for a sand wedge but too short for a six iron, so I’ve had to learn to dial it back a bit. Hitting a six iron 140 yards is not that much different from hitting a half-swing pitch, but with more control and less effort. The fun thing is learning how much swing is needed for various distances within that 40-yard gap. For a while, I’d frequently come up 10 yards short of the green or 10 yards long, but it’s getting better, and again, it’s been another opportunity to sharpen up my short game.

I’ve tried substituting a five iron and even a hybrid for more distance off the tee, but the second shot seems harder to control with less-lofted clubs (and they’re tougher to dial back on short Par 3s). Maybe those clubs would work better for other golfers depending on their skill set, but dialing it back is the trickiest part for me. To each his own. The six iron just seemed to strike the right balance. The main thing is finding the clubs that give you the greatest accuracy, distance, and control.

Now I’ve got a whole new perspective on the game. Besides being in the fairway more often, I’m hitting more greens in regulation and, when short, still chipping or pitching up to putt for par. There’s also a new sense of creativity. Too often in the past I’d just take whatever club was at the outer limits of my abilities and swing away, full blast (with variable directional and distance control). Now I don’t mind taking a lesser club and swinging easier. To top it off, my iron play and short game have improved considerably. My sand wedge used to be my go-to 80–90-yard club, and now it tops out at 115. The six iron went from a shaky 165 to a reliable 170. My putting still stinks. Maybe the pros can dial in pinpoint accuracy with every club, but given the variability I have throughout my bag (and the varying shaft lengths of the clubs themselves), it’s been much more helpful to focus on just these three and improve on what each can do.

It also speeds up the game considerably.

So, last week I took my full bag out, thinking I needed to tune up my driver, three wood, and other clubs because I didn’t want those skills to get too rusty. Guess what? I shot worse than I did with my three-club setup - mainly because I was all over the place, short and long of the greens, and in the woods again. I’m not ready to give up on all my clubs yet, but it’s gratifying to know there are still a few new ways to rediscover the game and enjoy new challenges. Give it a try sometime. Maybe you'll find less is more.

by markk, Duck Soup |  Read more:
Image: markk

via:
[ed. I think I know where this is.]

Friday, July 6, 2018

Same As It Ever Was

Despite public and political pressure, pharmaceutical giant Pfizer keeps raising the prices of its drugs—standing apart from some of its rivals who have vowed to rein in periodic price hiking.

Around 100 of Pfizer’s drugs got higher list prices this week, the Financial Times first reported. The affected drugs include big sellers, such as Lyrica pain capsules, Chantix smoking-cessation medication, Norvasc blood-pressure pills, and the lung-cancer treatment Xalkori.

The price hikes mark a second round of increases for Pfizer this year. While many of the price changes in the individual rounds hover at or under 10 percent—many at 9.4 percent—the hikes collectively boost many drugs’ prices by double-digit percentages for the year overall. For instance, Chantix’s price jumped nearly 17 percent this year; Pfizer gave it a 9.4 percent increase in January and another seven percent boost July 1, bringing the list price of a 56-tablet bottle to $429, the Wall Street Journal noted. Likewise, Pfizer’s erectile dysfunction drug Viagra saw a 9.4 percent increase July 1 after a similar hike in January. Those hikes bring the list price of a month’s supply to $2,211.

Such twice-a-year price increases of around 10 percent used to be commonplace in the US pharmaceutical industry. But notable, eye-popping hikes have made such bumps a flashpoint for consumers and lawmakers. For instance, public fury ignited at Martin Shkreli’s abrupt 5,000 percent price increase of an old, cheap anti-parasitic drug—one often given to babies and people with HIV/AIDS. And Mylan’s gradual 400 percent price increase for the life-saving EpiPen further enraged the public and Congressional committees.

In the aftermath, many—but not all—of Pfizer’s rivals pledged to raise prices just once a year and generally keep the hikes to under 10 percent. Moreover, President Donald Trump suggested on May 30 that the industry was poised to make “massive” voluntary price cuts in the coming weeks.

No such cuts have been announced, and Pfizer’s continued increases belie that notion. “The latest increases signal that it is ‘business as usual’ rather than the voluntary concessions that Trump indicated were coming,” Michael Rea, chief executive of Rx Savings Solutions, told the Financial Times.

by Beth Mole, Ars Technica |  Read more:
Image: Getty/Bloomberg