Tuesday, December 31, 2013



Tara Keefe. Tequila!
via:

Meth: Adderall for Construction Workers

The trailer parks of Jefferson County, Missouri, are a far cry from the international cartels of Breaking Bad, but this is the real picture of meth in America: Eveready batteries and Red Devil Lye on kitchen counters, used syringes mixed in with children's homework, drawers full of forks bent out of shape by chronic users’ obsessive tinkering. Over the course of nearly a decade studying home meth production in the rural U.S., SUNY Purchase anthropologist Jason Pine has looked on as Jefferson County’s practiced ‘chemists’ cook their product, watched addicts inject their own veins, and visited houses destroyed by meth lab explosions. “Jefferson County is largely rural,” Pine told me. “Houses can be quite secluded. It has rocky ridges that make it unsuitable for farming, but great for meth cooking.”

Alice Robb: Who makes meth?

Jason Pine: Many people in Jefferson County begin cooking to supplement their income and to cover the costs of their own addiction. There were some people profiting, but those profits dwindled as their habits increased. These meth manufacturers are not like cartel leaders: They’re making it for personal use. New regulations against pseudoephedrine-based medicine have made large-scale production harder. There’s a new recipe that’s easier and simpler, though it’s more dangerous and explosive.

The cost of setting up a lab is very low—you need a Gatorade bottle, some tubing, some batteries. And it’s portable: You can make it on the run. If you need to, you can pick up your ‘lab’ and throw it out.

Cooking meth is a kind of apprenticeship. Recipes circulate among cooks like secrets or rumors. Apprenticeships take place in the woods or in the home, sometimes inter-generationally. There are cases when three generations of a single family have cooked and used together. They engage in a DIY practice that I equate with alchemy. They’re transmuting base substances—everyday commodities you can find at Walmart—into something precious: a panacea, a cure-all. Meth cures all ills of the world by transforming the world, by tweaking the user’s neurological relation to the world. Meth cooking is alchemy in its contemporary, late capitalist form.

AR: How do people in Jefferson County get into meth?

JP: Many of the people I met began meth on the job—concrete work, roofing, trucking, factory work. It’s a way to make the job easier, to work longer hours and make more money. Meth increases dopamine levels in the brain, which can cause people to engage in repetitive (and often meaningless) actions—a behavioral effect that syncs up well with ‘work you gotta turn your mind off for,’ as one cook told me.

Others began at home, often because their parents, older siblings, or grandparents were making it. I talked to people in prison who began when they were in elementary school. Some users will administer it to their children—they’ll blow it into their mouths if they’re smoking it. They want to share it with their children; they want to experience it together, feel closer. If there’s no entertainment, no sports, nothing to do after school—you need money to pay for gas, to go to the movies—the main activities are drinking, smoking weed. The boundaries are blurry.

With meth, there aren’t big parties like there are with some other drugs. If there are large groups of people who take meth together regularly, it's a network of people who help each other acquire the ingredients to cook it.

by Alice Robb, New Republic |  Read more:
Image: Behrouz Mehri/AFP/Getty

Utagawa Kuniyoshi
via:

Fukushima Plume Already at Alaska Coast

  • Main inventory of Fukushima 137Cs had been transported towards the central North Pacific by 2012. […] The inventory of Fukushima radioactivity will almost entirely shift from the western to the eastern North Pacific during the next 5 years.
  • Surface water distribution of Fukushima 137Cs in 2012 (Aoyama et al., 2013; G. Hong, pers. comm.)
via: ENENews

Bacteria for Breakfast

Here’s something to think about while you eat and drink to excess on New Year's Eve: your large intestine is host to roughly a hundred trillion bacteria, weighing a few kilos, and they can have a surprising effect on your health and maybe even your behaviour.

In December, researchers at the California Institute of Technology showed that mice demonstrating abnormal social interactions, obsessive behaviour and intestinal problems – all traits associated with autism in human beings – can be cured if they ingest the right type of bacteria.

That’s quite a startling result, and just the latest in a booming area of research. Typically, you will have hundreds of different species of microbes living in your gut; this is known as your gut microbiome. A study reported this spring showed that not hosting a sufficiently diverse bacterial population can lead to insulin resistance, which is often a precursor for Type 2 diabetes and can make you prone to putting on weight. This is fixable, according to another recent study: going on a low-calorie diet boosts your gut microbe diversity.

Or you could have a microbe transplant. Lean mice have been made obese simply by giving them the gut flora of obese mice. And, in a remarkable study published in September, gut flora taken from human twins where one is obese and one is lean affected the corpulence of the mice that received them. Those that received the microbiome of the fat twin became fat, and the ones that got the lean twin’s bacteria became lean. Body shape is, to a certain degree, transmissible.

Microbiology is becoming cheaper, and the processing of biological information is happening ever faster. Today, there are even crowdsourced, open-access studies, such as the American Gut project. In September, its scientists released the first analysis of the gut microbiome of North America, based on 1,000 stool samples. This is just the start: 4,000 people are now signed up to the project (you can sign up from anywhere in the world but researchers are having problems with US Customs over importing faecal matter). American Gut is co-ordinating its findings with information from large-scale genomic and body-mapping projects; we are beginning to build the kinds of database that could revolutionise medicine.

by Michael Brooks, New Statesman |  Read more:
Image: Getty

We Need to Talk About TED

In our culture, talking about the future is sometimes a polite way of saying things about the present that would otherwise be rude or risky.

But have you ever wondered why so little of the future promised in TED talks actually happens? So much potential and enthusiasm, and so little actual change. Are the ideas wrong? Or is the idea about what ideas can do all by themselves wrong?

I write about entanglements of technology and culture, how technologies enable the making of certain worlds, and at the same time how culture structures how those technologies will evolve, this way or that. It's where philosophy and design intersect.

So the conceptualization of possibilities is something that I take very seriously. That's why I, and many people, think it's way past time to take a step back and ask some serious questions about the intellectual viability of things like TED. (...)

So what is TED exactly?

Perhaps it's the proposition that if we talk about world-changing ideas enough, then the world will change. But this is not true, and that's the second problem.

TED of course stands for Technology, Entertainment, Design, and I'll talk a bit about all three. I think TED actually stands for: middlebrow megachurch infotainment.

The key rhetorical device for TED talks is a combination of epiphany and personal testimony (an "epiphimony" if you like) through which the speaker shares a personal journey of insight and realisation, its triumphs and tribulations.

What is it that the TED audience hopes to get from this? A vicarious insight, a fleeting moment of wonder, an inkling that maybe it's all going to work out after all? A spiritual buzz?

I'm sorry but this fails to meet the challenges that we are supposedly here to confront. These are complicated and difficult and are not given to tidy just-so solutions. They don't care about anyone's experience of optimism. Given the stakes, making our best and brightest waste their time – and the audience's time – dancing like infomercial hosts is too high a price. It is cynical.

Also, it just doesn't work.

by Benjamin Bratton, The Guardian |  Read more:
Image: James Duncan Davidson/TED

Monday, December 30, 2013


Dieter Roth and Björn Roth 

via:

Camila Moreno / Incendie

Preste Atenção


via:
[ed. Or one generation plants the weeds, and another gets the weedwhacker.]

Ken Price (American, 1935-2012), Untitled, 1991
via:

He Ain't Going Nowhere


The title track to Guy Clark’s most recent album, My Favorite Picture of You, may be the finest song he’s ever written. This is no small feat. For one thing, there’s his catalog to consider. Guy wrote “L.A. Freeway,” one of American music’s greatest driving songs and the final word for small-town troubadours on the false allure of big cities. His lyrical detail in “Desperados Waiting for a Train” and “Texas, 1947” presents a view of life in postwar West Texas that is as true as Dorothea Lange’s best Dust Bowl portraiture. When he wrote about the one possession of his father’s that he wanted when his dad died in “The Randall Knife,” he made a universal statement about paternal love and respect. Bob Dylan lists Guy among his handful of favorite songwriters, and most of Nashville does too.

And then there’s the equally significant matter of his timing. Those songs were written in the seventies and eighties, when the hard-living coterie of Guy, Townes Van Zandt, and Jerry Jeff Walker was inventing the notion that a Texas singer-songwriter practiced his own distinct form of artistry, creating the niche in which disciples like Lyle Lovett, Steve Earle, and Robert Earl Keen would make their careers. Yet Guy penned “My Favorite Picture of You” a mere three years ago, just after turning 69, an age by which most of his contemporaries had chosen to coast, provided they were still living at all.

The song originated the way most of them do, with a line. A friend, Gordie Sampson, came to write at Guy’s West Nashville home and brought a hook list with him, a page of potential lines and titles. The two reviewed the list in Guy’s basement workshop, where he splits his time between writing and building guitars, sustaining himself on black coffee, peanut-butter crackers, hand-rolled cigarettes, and an occasional toke of boo. One wall is covered with shelves that hold some 1,500 cassette tapes of demos, live shows, and friends’ albums, and another wall holds luthier tools. The rest of the room is cluttered with the ephemera of his life, some of it stored in little clementine orange crates, the remainder hanging on the walls and scattered on tables. Guy is endlessly loyal, and each item carries a specific sentimental tie. There’s a tight portrait of Van Zandt taken by their friend Jim McGuire. A cane that artist Terry Allen found for him in Santa Fe. Every last piece of a fiddle that Guy smashed on a mantel in a drunken fit forty years ago and still means to repair. And on a stand-up table along the back wall, the actual Randall knife, along with others sent in by fans and a letter of thanks signed by the knife maker himself.

Guy sat across from Sampson at a workbench in the center of the room. A tall man with regal posture, he’s got an angular white mustache and soul patch, wavy gray hair that curls up at his collar, and a woodblock of a forehead that looms over deep-set blue eyes. His general expression is that of someone who’s thinking about something more important than you are. Or at least more interesting. (...)

In November Guy turned 72, but it must be noted that songwriter years, like dog years, aren’t the same as people years. Nashville writers of Guy’s era lived by a different set of rules than the rest of us. They didn’t punch a time clock. If they went for early-afternoon drinks on a weekday, they weren’t skipping out on work but fishing for lines, scribbling the best bits of conversation on cocktail napkins. When they passed a guitar around after the bars closed, ingesting whatever chemicals would carry them to dawn, they were soliciting reactions to new material, refining new songs. It wasn’t partying, it was writing. But it also wasn’t healthy, and while it may have kept them open to exotic ideas and experiences, it worked hell on their bodies. Those who couldn’t manage their appetites either quit drinking and drugging, like Steve Earle, or died, like the long list running from Hank Williams through Van Zandt. Somehow Guy always maintained just enough control to survive without stopping. And now he’s got a young man’s curious mind atop a body that’s fixing to turn 111.  (...)

Guy occupies a unique place among Nashville writers. His reputation is much like John Prine’s—a songwriter who has maintained a long creative career without an outlandish number of big hits. They both get referred to as songwriter’s songwriters, though as Guy told Garden and Gun last year, “It’s flattering, I guess, but you can’t make a f—ing living being a songwriter’s songwriter.” To be clear, Guy has written two country number ones, and his songs have been charting since the seventies, covered by everyone from Johnny Cash and Bobby Bare to Brad Paisley and Kenny Chesney. But it’s the way Guy has conducted his career, his refusal to write songs to anyone’s taste but his own, that has made him one of the most revered figures in Nashville.

by John Spong, Texas Monthly |  Read more:
Image: Wyatt McSpadden

Robert Rauschenberg, "Arcanum XI", from the "Arcanum" suite, 1981
via:

Sunday, December 29, 2013


Ferdinando Scianna, Sicily, 1959
via:

The Rise And Fall Of Grunge Typography


Hop on the nostalgia train for a second. Think back to the 90s. To Nirvana, Linklater’s Slacker, and the flannel-clad rebels on the run from the 80s. To skateboards and graffiti and toe rings and VHS tapes. Things were messy then. And type design was messy, too. Words were splayed and chaotic, letters blurred. Textures were thick and heavy. Concert posters looked like someone had splattered paint on paper and then scratched out band names. You may have noticed it, you may not have, but at its peak, this typography style, called grunge, was ubiquitous. Alternative music CDs, video games, and zines—all the aggregate products of a wayward generation—appropriated its unfinished and frenzied aesthetic, and it became the largest, most cohesive movement in recent font design history.

It was everywhere—and then it wasn't.

The Ray Gun Effect

David Carson, the acclaimed graphic designer who created Ray Gun magazine, is the so-called Godfather of Grunge. His method was simple, his gospel twofold: you don’t have to know the rules before breaking them, and never mistake legibility for communication. Carson’s technique of ripping, shredding, and remaking letters touched a nerve. His covers for Ray Gun were bold and often disorienting. (...)

Like many of the 90s' other best things, grunge typography was rooted in angst and discontent. "Grunge typography came in as a backlash, very much like how punk music came in," Segura told me during a recent phone conversation. "It was almost like a societal complaint, if you will: everything was getting too clean. Design by people like David Carson also made it a very accessible direction to go on. We, as human beings, tend to follow more than lead, and everyone just started to do that David Carson look. … And there was, for a certain period of time, a certain refreshing look to it that had not been seen before."


The aesthetic was fueled by raw emotion, but Carson’s tactics were made imitable by technology. The rise of grunge typography coincided with the burgeoning popularity of the Macintosh, which, introduced in 1984, permanently altered the landscape of graphic design and typography. The art of designing by hand—a painful craft of precision and consistency—was no longer the only option. Designers were liberated; the screen and their imagination were the only constraints. In many ways, the modifier "grunge" denotes for typography what it does for music: unfettered, unrestrained, a cry against convention. The experimental typographer is almost always the young typographer, and young typographers in the 90s, armed with new software and ideas, rejected the rule-based fonts of their forebears.

From the viewer’s perspective, the appeal of grunge was based on a basic idea: it had not been seen before. It wasn’t just the experimental design of the letters, but the way they were placed on the page. Its bedlam, its body language, resonated with the culture at large. This resonance produced a vital change in typographic method: in a field that was for decades dictated by the principle of neutrality—of meaning being implicit in the text rather than the typeface—fonts were succumbing to association with the genres or ideas with which they were paired.

by Sharan Shetty, The Awl | Read more:
Image: album/magazine covers, uncredited

The English Beat



Here's How Data Thieves Have Captured Our Lives on the Internet

[ed. Somewhat of a companion piece to the article further down by Evgeny Morozov.]

Some years ago, when writing a book on understanding the internet, I said that our networked future was bracketed by the dystopian nightmares of two old-Etonian novelists, George Orwell and Aldous Huxley. Orwell thought we would be destroyed by the things we fear, while Huxley thought that we would be controlled by the things that delight us. What Snowden has taught us is that the two extremes have converged: the NSA and its franchises are doing the Orwellian bit, while Google, Facebook and co are attending to the Huxleyean side of things.

In The Master Switch: The Rise and Fall of Information Empires, his magisterial history of the main communications technologies of the 20th century – telephone, radio, movies and television – the legal scholar Timothy Wu discerned a pattern.

Each technology started out as magnificently open, chaotic, collaborative, creative, exuberant and experimental, but in the end all were "captured" by charismatic entrepreneurs who went on to build huge industrial empires on the back of this capture. This is what has become known as the Wu cycle – "a typical progression of information technologies: from somebody's hobby to somebody's industry; from jury-rigged contraption to slick production marvel; from a freely accessible channel to one strictly controlled by a single corporation or cartel – from open to closed system".

The big question, Wu asked, was whether the internet would be any different? Ten years ago, I would have answered: "Yes." Having digested Snowden's revelations, I am less sure, because one of the things he has demonstrated is the extent to which the NSA has suborned the internet companies which have captured the online activities of billions of internet users. It has done this via demands authorised by the secret foreign intelligence surveillance (Fisa) court, but kept secret from the companies' users; and by tapping into the communications that flow between the companies' server farms across the world.

The reason this made sense is because so much of our communications and data are now entrusted to these internet giants. Tapping into them must have seemed a no-brainer to the NSA. After all, Google and Facebook are essentially in the same business as the agency. Its mission – comprehensive surveillance – also happens to be their business model.

The only difference is that whereas the spooks have to jump through some modest legal hoops to inspect our content, the companies get to read it neat. And the great irony is that this has been made possible because of our gullibility. The internet companies offered us shiny new "free" services in return for our acceptance of click-wrap "agreements" which allow them to do anything they damn well please with our data and content. And we fell for it. We built the padded cells in which we now gambol and which the NSA bugs at its leisure.

In our rush for "free" services, we failed to notice how we were being conned. The deal, as presented to us in the End User Licence Agreement, was this: you exchange some of your privacy (in the form of personal information) for the wonderful free services that we (Google, Facebook, Yahoo, Skype, etc) provide in return. The implication is that privacy is a transactional good – something that you own and that can be traded. But, in these contexts, privacy is an environmental good, not a transactional one. Why? Because when I use, say, Gmail, then I'm not only surrendering my privacy to Google, but the privacy of everyone who writes to me at my Gmail address. They may not have consented to this deal, but their email is being read by Google nonetheless. And before any lawyer (or Sir Malcolm Rifkind) pops up to object that having machines read one's communications is not the same thing as having a human being do it, let me gently inquire if they are up to speed on machine-learning algorithms? The fact that Mark Zuckerberg is not sitting there sucking his pencil and reading your status updates doesn't mean that his algorithms aren't making pretty astute inferences from those same updates – which is why Facebook probably knows that two people are going to have an affair before they do; or why one can make interesting inferences about the nature of a couple's marriage from inspection of their network graphs.

And this is where the interests of the NSA and the big internet companies converge. For what they have both managed to do is to abolish the practice of anonymous reading which, in the good old analogue days, we regarded as an essential condition for an open, democratic society. In a networked world, the spooks and the companies know everything you read, and the companies even know how long you spent on a particular page. And if you don't think that's creepy then you haven't been paying attention.

by John Naughton, The Guardian |  Read more:
Image: Alamy

Saturday, December 28, 2013


Prostitute in Tamaulipas
via:

The Great Fratsby

[ed. See also: An open letter to the makers of Wolf of Wall Street, and the Wolf himself.]

A man of humble upbringing decides that he will become a millionaire. For several years, wealth is his only goal, because he desperately wants everything else that comes from being rich. He reinvents himself along the way, transcending his roots, presenting a phony, tony name as his public face to the world. He does not come by his millions entirely legally, and he will one day have to answer for his crimes. But in the high times, he buys a mansion on the Gold Coast of Long Island, where he fills glamorous parties with beautiful women and the men that lust after them. In the film version of his life, he is played by a very tan Leonardo DiCaprio in boat shoes. Pop quiz: “The Wolf of Wall Street” or “The Great Gatsby”?

Even if both films did not open in the same year, starring the same actor, set in the same context of gaudy maximalism, they would still be having an intense conversation with one another (over a Martini and a bloody steak). Both are entries in the great epic of American capitalism, stories of high-flying greed and the power of self-delusion, morality plays about deeply unhappy Trimalchios who drown their insecurities in money and false hopes. But the coincidence (or, rather, brilliant alchemy) of DiCaprio’s appearance in both films just heightens the similarities between the stories, bringing everything into sharp relief.

The tale of two Leos forces the tale of two “Gatsby”s (or three, to bring Fitzgerald’s original novel into it), pitting them against each other, the romantic story versus the depraved one, the tragedy of loving one woman too much versus the tragedy of loving money so much that the soul corrodes. Scorsese’s is a far better film than Baz Luhrmann’s swirly, neon adaptation, but Luhrmann benefited from much better source material, the essence of which couldn’t help but waft off of his hypersaturated, glossy spectacle and still hit the viewer with a cold smack of recognition. No matter where the green light goes, it is always there, and something sad and gleaming shines through. Scorsese’s entire picture sparkles from end to end, dancing so hard and at such a sustained high pitch that it threatens to topple at any moment, and yet there is no lingering light to it, no nagging lesson in Jordan Belfort’s demise. For a person falling from grace to land with a thud, he must have once been graceful. Belfort and Gatsby may share a common criminality on their way to the top, but only one of them makes it look fully disgusting.

In other words, Luhrmann’s film may be the “Gatsby” that this generation deserves (Technicolor, attention-disordered, deafeningly loud, brimming with loose cultural pastiche), but Scorsese’s “Wolf” is the “Gatsby” that the current Wall Street demands—its dark cousin and perverse reflection. There is no deeper romance to “Wolf,” only craven desire. The film has a black heart where a green light should be. (...)

After a recent screening of “Wolf” in New York, the movie’s screenwriter, Terence Winter (who knows from gangsters), said in a Q. & A. that it was a conscious decision (by Scorsese and also DiCaprio, who optioned Belfort’s story for himself and developed it as a passion project for years—this is his Citizen Cocaine) not to show Belfort’s victims in the film: “We never wanted you to hear the voices on the other end of the line.” As a result, “Wolf” has few casualties—a quick mention of a stockbroker who blows his brains out, a few near-death swipes as a result of hubris and drug-induced haze, a heartless sucker punch to the stomach of Belfort’s distraught second wife—and it is the lack of consequences that has left many critics with a queasy feeling, and the fear that Scorsese will do more harm than good by glorifying a bacchanal put on at the expense of innocent people. When the film screened on Wall Street to a crowd of finance types, there were many cheers and high fives. If any movie is in danger this year of having “bad fans,” it’s this one (watch closely as “Scarface” posters in frat houses are quietly replaced with “Wolf” ones).

by Rachel Syme, New Yorker |  Read more:
Image: uncredited

The Visible and the Invisible

Let me tell you a thing or two about jail.

From the very moment in which the words “under arrest” are uttered, everyone you encounter contributes to rendering you powerless.

When I was preparing to begin student-teaching, I talked to a number of experienced teachers who advised me to remember that the most negative response you can give a student (or anyone else for that matter) is to ignore him or her. Not to rebuke, nor to deny, but to ignore. If you have a student who speaks up too frequently and too often without a point, and who is generally disruptive, the best response is to say nothing at all. Let their talk be answered with empty silence. Render them invisible—non-existent—by treating them as such.

The police who arrest you, and then the sheriff’s deputies who are your jailers, similarly negate everything that constitutes your sense of self: your will, your intellect, your emotions. They do so by ignoring you completely. No answers to your questions, whether about the charges, about the process you are going through, about your ability to communicate with anyone outside—about anything. You are something that they process, the ultimate objectification.

Everything is uncomfortable and debilitating. The handcuffs hurt, and having your arms behind your back makes it hard to get into the back seat of the squad car without falling into it. The back seat itself is hard molded plastic, without upholstery of any kind. You can’t sit with your back supported by the seat-back because your manacled arms are in the way. You slide across the hard plastic with every turn, every acceleration or deceleration the driver makes.

At the precinct station house, they hold you first in a tiled room with hard wooden benches bolted to the shiny concrete floor, in the midst of which, as in every room you will now occupy, you see a drain toward which the floor slopes from all sides. Through a small window with wire mesh suspended within it, you can see someone going through papers in the adjoining room. She occasionally looks up at you, and occasionally others appear in the room with her. Eventually a couple of cops, their equipment swinging and rattling heavily from their belts, enter the room you are in and, still refusing to answer any questions, take you back out to the car, load you in, and drive you to the county jail. (...)

I mentioned mindless routine.

You are awakened at 4:30 a.m., ordered to dress and make your bed in the exact manner of this particular institution, made to wait standing for the carts to arrive, and then called one by one to get a tray of food from the guys who came with the cart from the kitchen.

As for the food, I must tell you about the peanut butter. You get a wad of it, wrapped in wax paper, about the size of a lemon or an egg, with some soft, easily torn white bread and no utensil to help spread the wad, which is itself only semi-soft. In every holding cell in the system, you see wads of peanut butter stuck to the ceiling overhead. The ceilings are always fifteen or more feet high. It takes a powerful arm to launch a stiff, hard wad of peanut butter at that ceiling hard enough to get it to stick. Once it is there, however, it seems to stay for perpetuity. I never saw, nor did I ever hear tell of, one of these peanut-butter hardballs coming down.

At about 5:15 a.m. you are ordered back to your cell and locked down for a few more hours of sleep.

The rest of the day consists of a rotation of time spent locked down in your cell and time spent in the common area, where you can find conversations, card games, books, magazines, and loud televisions tuned either to movies (action pictures, crime, jails, violence—lots and lots of violence) or to sports.

This common area is semi-circular and two stories high, the cells ranging along the rim of the circle, their interior walls all glass. A steel staircase and catwalk provide access to the upper story of cells. Everyone is visible at all times. The only nod to privacy is the pony-wall, about two feet high, in front of the toilet in your cell. It assures that when you sit on the toilet you are visible only from the waist up.

The guard’s desk, a miniature command center, stands at the hub of this semi-circle. She or he is watching you all the time.

by Howard Tharsing, Threepenny Review | Read more:
Image: Brad Phillips via:

A Radical Shift in Capitalism



The benefits of personal data to consumers are obvious; the costs are not, writes Evgeny Morozov

Following his revelations this year about Washington’s spying excesses, Edward Snowden now faces a growing wave of surveillance fatigue among the public – and the reason is that the National Security Agency contractor turned whistleblower has revealed too many uncomfortable truths about how today’s world works.

Technical infrastructure and geopolitical power; rampant consumerism and ubiquitous surveillance; the lofty rhetoric of “internet freedom” and the sober reality of ever-increasing internet control – all these are interconnected in ways most of us would rather not acknowledge or think about. Instead, we have focused on just one element in this long chain – state spying – but have mostly ignored all others.

But the spying debate has quickly turned narrow and unbearably technical; issues such as the soundness of US foreign policy, the ambivalent future of digital capitalism, the relocation of power from Washington and Brussels to Silicon Valley have not received due attention. But it is not just the NSA that is broken: the way we do – and pay for – our communicating today is broken as well. And it is broken for political and economic reasons, not just legal and technological ones: too many governments, strapped for cash and low on infrastructural imagination, have surrendered their communications networks to technology companies a tad too soon.

Mr Snowden created an opening for a much-needed global debate that could have highlighted many of these issues. Alas, it has never arrived. The revelations of the US’s surveillance addiction were met with a rather lacklustre, one-dimensional response. Much of this overheated rhetoric – tinged with anti-Americanism and channelled into unproductive forms of reform – has been useless. Many foreign leaders still cling to the fantasy that, if only the US would promise them a no-spy agreement, or at least stop monitoring their gadgets, the perversions revealed by Mr Snowden would disappear.

Here the politicians are making the same mistake as Mr Snowden himself, who, in his rare but thoughtful public remarks, attributes those misdeeds to the over-reach of the intelligence agencies. Ironically, even he might not be fully aware of what he has uncovered. These are not isolated instances of power abuse that can be corrected by updating laws, introducing tighter checks on spying, building more privacy tools, or making state demands to tech companies more transparent.

Of course, all those things must be done: they are the low-hanging policy fruit that we know how to reach and harvest. At the very least, such measures can create the impression that something is being done. But what good are these steps to counter the much more disturbing trend whereby our personal information – rather than money – becomes the chief way in which we pay for services – and soon, perhaps, everyday objects – that we use?

No laws and tools will protect citizens who, inspired by the empowerment fairy tales of Silicon Valley, are rushing to become data entrepreneurs, always on the lookout for new, quicker, more profitable ways to monetise their own data – be it information about their shopping or copies of their genome. These citizens want tools for disclosing their data, not guarding it. Now that every piece of data, no matter how trivial, is also an asset in disguise, they just need to find the right buyer. Or the buyer might find them, offering to create a convenient service paid for by their data – which seems to be Google’s model with Gmail, its email service.

What eludes Mr Snowden – along with most of his detractors and supporters – is that we might be living through a transformation in how capitalism works, with personal data emerging as an alternative payment regime. The benefits to consumers are already obvious; the potential costs to citizens are not. As markets in personal information proliferate, so do the externalities – with democracy the main victim.

by Evgeny Morozov, Notes EM |  Read more: 
Image: Micah Ganske via:

Friday, December 27, 2013


Charles M. Schulz
via:
[ed. Not much worth posting lately (unless you like endless Best of 2013 lists), so enjoy some family time and reflection this holiday season.]

Wednesday, December 25, 2013

Tuesday, December 24, 2013

The Year We Broke the Internet

As winter storms were buffeting parts of the country last week, our collective attention was drawn halfway around the world to Egypt. Images of the pyramids and the Sphinx covered in snow had emerged, and were being shared tens of thousands of times on Facebook and Twitter. It wasn’t hard to see why. For some, sharing the photos was a statement on global warming. For others, sharing was about the triumph of discovery, making them proud housecats dropping a half-chewed mouse of news on the Internet’s doorstep. For most, however, the photos were just another thoughtlessly processed and soon-forgotten item that represented our now-instinctual response to the unrelenting stream of information we’re subjected to every waking hour: Share first, ask questions later. Better yet: Let someone else ask the questions. Better still: What was the question again?

Needless to say, the photos were bullshit.

It’s hard not to note the tidy symbolism here. The Internet, like the Sphinx, is a ravenous beast that eats alive anyone who can’t answer its hoary riddle. We in the media have been struggling for twenty years to solve that riddle, and this year, the answer arrived: Big Viral, a Lovecraftian nightmare that has tightened its thousand-tentacled grip on our browsing habits with its traffic-at-all-costs mentality—veracity, newsworthiness, and relevance be damned. We solved the riddle, and then we got eaten anyway.

The Egypt photos weren’t the only viral hoax to hijack the social media conversation in the past month. Of the others, the most infamous was reality-TV producer Elan Gale’s in-flight pissing match with a fellow passenger, which he documented on Twitter, and which was shepherded along by BuzzFeed to the delight of hundreds of thousands of onlookers. That it was actually a prank rankled some, but even that turned out to be a boon for the sites that shared it: They got the clicks coming and going, both on the ramp-up and in the reveal. The story may well have been, in the words of Slate’s Dave Weigel, “the sort of shoddy reporting that would get a reporter at a small newspaper fired,” but it was also a perfect microcosm of the way the Internet works now.

“We’re not in the business of publishing hoaxes,” BuzzFeed’s news editor wrote in response to Weigel’s piece, “and we feel an enormous responsibility here to provide our readers with accurate, up-to-date information”—which sounds a bit like Altria’s health inspector saying they’re sorry they gave you cancer.

The fact is, that sort of double-dipping is what most of us who produce Internet content do, myself included. Give me the viral pictures, and I’ll give you the truth. And then, after an appropriate waiting period, I’ll give you the other truth, and capitalize on that traffic too. It’s almost a perfect callback to William Randolph Hearst’s infamous declaration on the eve of the Spanish-American War, “You furnish the pictures and I’ll furnish the war.” Even more fitting, historians don’t think he ever said anything like that. Then as now, it’s the myth that plays, not the reality. Today it just plays on an exponentially larger stage.

The media has long had its struggles with the truth—that’s nothing new. What is new is that we’re barely even apologizing for increasingly considering the truth optional. In fact, the mistakes, and the falsehoods, and the hoaxes are a big part of a business plan driven by the belief that big traffic absolves all sins, that success is a primary virtue. Haste and confusion aren’t bugs in the coding anymore, they’re features. Consider what Ryan Grim, Washington bureau chief for the Huffington Post, told The New York Times in its recent piece on a raft of hoaxes, including Gale’s kerfuffle, a child’s letter to Santa that included a handwritten Amazon URL, and a woman who wrote about her fictitious poverty so effectively that she pulled in some $60,000 in online donations. “The faster metabolism puts people who fact-check at a disadvantage,” Grim said. “If you throw something up without fact-checking it, and you’re the first one to put it up, and you get millions and millions of views, and later it’s proved false, you still got those views. That’s a problem. The incentives are all wrong.”

In other words, press “Publish” or perish.

by Luke O'Neil, Esquire |  Read more:
Image: uncredited

Christmas Song

She was his girl, he was her boyfriend
Soon to be his wife, make him her husband
A surprise on the way, any day, any day
One healthy little giggling, dribbling baby boy
The Wise Men came, three made their way
To shower him with love
While he lay in the hay
Shower him with love, love, love
Love love, love
Love, love was all around

Not very much of his childhood was known
Kept his mother Mary worried
Always out on his own
He met another Mary who for a reasonable fee
Less than reputable was known to be
His heart was full of love, love, love
Love, love, love
Love, love was all around

Lyrics

Written by: Dave Matthews via:

Man Ray. La femme et son poisson 1938.
via:
[ed. See also: here]

HSBC Settlement Proves the Drug War is a Joke

If you've ever been arrested on a drug charge, if you've ever spent even a day in jail for having a stem of marijuana in your pocket or "drug paraphernalia" in your gym bag, Assistant Attorney General and longtime Bill Clinton pal Lanny Breuer has a message for you: Bite me.

Breuer this week signed off on a settlement deal with the British banking giant HSBC that is the ultimate insult to every ordinary person who's ever had his life altered by a narcotics charge. Despite the fact that HSBC admitted to laundering billions of dollars for Colombian and Mexican drug cartels (among others) and violating a host of important banking laws (from the Bank Secrecy Act to the Trading With the Enemy Act), Breuer and his Justice Department elected not to pursue criminal prosecutions of the bank, opting instead for a "record" financial settlement of $1.9 billion, which as one analyst noted is about five weeks of income for the bank.

The bank's laundering transactions were so brazen that the NSA probably could have spotted them from space. Breuer admitted that drug dealers would sometimes come to HSBC's Mexican branches and "deposit hundreds of thousands of dollars in cash, in a single day, into a single account, using boxes designed to fit the precise dimensions of the teller windows."

This bears repeating: in order to more efficiently move as much illegal money as possible into the "legitimate" banking institution HSBC, drug dealers specifically designed boxes to fit through the bank's teller windows. Tony Montana's henchmen marching dufflebags of cash into the fictional "American City Bank" in Miami was actually more subtle than what the cartels were doing when they washed their cash through one of Britain's most storied financial institutions.

Though this was not stated explicitly, the government's rationale in not pursuing criminal prosecutions against the bank was apparently rooted in concerns that putting executives from a "systemically important institution" in jail for drug laundering would threaten the stability of the financial system. The New York Times put it this way:
Federal and state authorities have chosen not to indict HSBC, the London-based bank, on charges of vast and prolonged money laundering, for fear that criminal prosecution would topple the bank and, in the process, endanger the financial system. (...)
So you might ask, what's the appropriate financial penalty for a bank in HSBC's position? Exactly how much money should one extract from a firm that has been shamelessly profiting from business with criminals for years and years? Remember, we're talking about a company that has admitted to a smorgasbord of serious banking crimes. If you're the prosecutor, you've got this bank by the balls. So how much money should you take?

How about all of it? How about every last dollar the bank has made since it started its illegal activity? How about you dive into every bank account of every single executive involved in this mess and take every last bonus dollar they've ever earned? Then take their houses, their cars, the paintings they bought at Sotheby's auctions, the clothes in their closets, the loose change in the jars on their kitchen counters, every last freaking thing. Take it all and don't think twice. And then throw them in jail.

Sound harsh? It does, doesn't it? The only problem is, that's exactly what the government does just about every day to ordinary people involved in ordinary drug cases.

by Matt Taibbi, Rolling Stone |  Read more:
Image: MediaBistro

Imagining the Post-Antibiotics Future


Predictions that we might sacrifice the antibiotic miracle have been around almost as long as the drugs themselves. Penicillin was first discovered in 1928 and battlefield casualties got the first non-experimental doses in 1943, quickly saving soldiers who had been close to death. But just two years later, the drug’s discoverer Sir Alexander Fleming warned that its benefit might not last. Accepting the 1945 Nobel Prize in Medicine, he said:
“It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them… There is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant.”
As a biologist, Fleming knew that evolution was inevitable: sooner or later, bacteria would develop defenses against the compounds the nascent pharmaceutical industry was aiming at them. But what worried him was the possibility that misuse would speed the process up. Every inappropriate prescription and insufficient dose given in medicine would kill weak bacteria but let the strong survive. (As would the micro-dose “growth promoters” given in agriculture, which were invented a few years after Fleming spoke.) Bacteria can produce another generation in as little as twenty minutes; with tens of thousands of generations a year working out survival strategies, the organisms would soon overwhelm the potent new drugs.
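
[ed. The arithmetic behind "tens of thousands": one generation every twenty minutes is three an hour, or 3 × 24 × 365 ≈ 26,000 generations a year.]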

Fleming’s prediction was correct. Penicillin-resistant staph emerged in 1940, while the drug was still being given to only a few patients. Tetracycline was introduced in 1950, and tetracycline-resistant Shigella emerged in 1959; erythromycin came on the market in 1953, and erythromycin-resistant strep appeared in 1968. As antibiotics became more affordable and their use increased, bacteria developed defenses more quickly. Methicillin arrived in 1960 and methicillin resistance in 1962; levofloxacin in 1996 and the first resistant cases the same year; linezolid in 2000 and resistance to it in 2001; daptomycin in 2003 and the first signs of resistance in 2004.

With antibiotics losing usefulness so quickly — and thus not making back the estimated $1 billion per drug it costs to create them — the pharmaceutical industry lost enthusiasm for making more. In 2004, there were only five new antibiotics in development, compared to more than 500 chronic-disease drugs for which resistance is not an issue — and which, unlike antibiotics, are taken for years, not days. Since then, resistant bugs have grown more numerous and by sharing DNA with each other, have become even tougher to treat with the few drugs that remain. In 2009, and again this year, researchers in Europe and the United States sounded the alarm over an ominous form of resistance known as CRE, for which only one antibiotic still works.

Health authorities have struggled to convince the public that this is a crisis. In September, Dr. Thomas Frieden, the director of the U.S. Centers for Disease Control and Prevention, issued a blunt warning: “If we’re not careful, we will soon be in a post-antibiotic era. For some patients and some microbes, we are already there.” The chief medical officer of the United Kingdom, Dame Sally Davies — who calls antibiotic resistance as serious a threat as terrorism — recently published a book in which she imagines what might come next. She sketches a world where infection is so dangerous that anyone with even minor symptoms would be locked in confinement until they recover or die. It is a dark vision, meant to disturb. But it may actually underplay what the loss of antibiotics would mean.

by Maryn McKenna, Medium |  Read more:
Image: Eneas De Troya

Sounding the Alarm


At 2:46 p.m. on March 11, 2011, the Pacific Plate, just off Japan's northeast coast, suddenly thrust downward, unleashing a monstrous, 9.0-magnitude earthquake that rocked the country for the next six minutes. The massive Tohoku quake and resulting tsunami are believed to have killed at least 16,000 people and injured 6,000 more. Another 2,600 people are still missing and presumed dead. The quake was the most powerful to ever strike Japan, and was the fourth-largest ever recorded. It also was the first earthquake to be heard in outer space, and was the most expensive natural disaster in human history, generating $235 billion in total damage. But there was a silver lining, if you could call it that: Tohoku was also the first time that Japanese citizens were given the precious, if limited, gift of time.

That gift came in the form of Japan's earthquake early warning system, which detected the giant temblor just before it hit and immediately sent computer-generated alerts across the country to cellphones, TVs, schools, factories, and transit systems. Japan put the finishing touches on its $500 million early warning system in 2007, leaving four years — barely the blink of an eye in geological timescales — before the investment paid off.

And in 2011, by all accounts it did. Although it's impossible to quantify the number of lives that the system saved, there were reports in the quake's aftermath of schools having had time to get all their students under desks, of eleven 320-kilometer-per-hour bullet trains slowing to a stop, of more than 16,000 elevators automatically shutting down when the alarm system went off. In the sixty seconds before the giant temblor struck, roughly 52 million people received text-message warnings that the quake was fast approaching and that they needed to get out of harm's way.

In 2007, the same year that Japan finished building its early warning system, earthquake scientists roughly 5,000 miles away in California marked a related, albeit far humbler, benchmark. Richard Allen, director of the Seismological Laboratory at UC Berkeley, was in his office on October 30 when a 5.6-magnitude earthquake hit the Alum Rock section of San Jose. The quake caused only moderate shaking and very little damage, but Allen had reason to be excited: The event marked the first time his Berkeley group was able to test its own early warning system, set up just two weeks before. "It was our first proof-of-concept event," Allen recalled in a recent interview. Thirty minutes after the light shaking ended, Allen received an email showing that the system had successfully detected the right waves, done the right math, and made the right prediction about when and how strongly the quake would hit.
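
[ed. For the curious, the "right math" rests on a simple physical fact: an earthquake's fast but relatively weak P-waves outrun the slower S-waves that do most of the damage, so detecting the P-wave buys a few seconds before the strong shaking arrives. The Python sketch below is not Allen's actual algorithm — just a back-of-envelope illustration, with the wave speeds and the five-second processing delay as assumed round numbers. Close to the epicenter the result goes negative, the "blind zone" where no warning is possible; a real networked system, with sensors near the source relaying alerts at telecom speed, does better than this single-site sketch.]

# Rough warning-time estimate (illustrative only).
# Assumed crustal speeds: P-waves ~6.5 km/s, S-waves ~3.7 km/s.
def warning_time_seconds(distance_km, vp=6.5, vs=3.7, processing_delay=5.0):
    """Seconds between issuing an alert (after the P-wave is detected and
    processed at a site) and the arrival of the damaging S-waves there."""
    p_arrival = distance_km / vp   # when the P-wave reaches the site
    s_arrival = distance_km / vs   # when the strong shaking reaches the site
    return s_arrival - (p_arrival + processing_delay)

for d in (20, 50, 100, 200):       # distance from the epicenter, in km
    print(f"{d:>4} km: about {warning_time_seconds(d):.0f} s of warning")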

Yet this was only a researcher's victory. The tiny system his team had built produced no cascade of texts, no TV or radio transmissions, and no widespread notification that an earthquake was on its way. In the event of a disaster, the technology wasn't even in place for Allen himself to receive a real-time notification from his own system.

But this was not a case of Japan being light years ahead of the United States in terms of earthquake-science research. Instead, the wide technological gap between the two countries has more to do with each nation's sense of urgency about the dangers of earthquakes, and the need to prepare for them. In fact, back in 2003, Allen had co-written what essentially became the seminal scientific paper on quake predictions. His work showed that it's technically possible to predict the size and location of quakes right before they strike, and argued for the methods that became the basis for early warning systems, much like the one later built in Japan.

And yet a decade after Allen co-authored that paper, California, the second-most seismically active state in the nation (behind only Alaska), still has next to nothing in terms of a public seismic warning system. The technology exists and has for years, but the state legislature has failed to find or allocate the necessary funds to make it happen.

by Azeen Ghorayshi, East Bay Express |  Read more:
Image: Stephen Loewinsohn

The Sense of an Ending

[ed. I tend to avoid books that seem overly hyped and/or have conflicting reviews, so I came late to The Sense of an Ending, but it's a wonderful (if somewhat short) novel that you almost want to read twice once you've finished it. It resonated with me, anyway. I have the habit of dog-earing the left-hand corner of pages in sections that contain particularly poignant or insightful passages (so I can find them again). After dog-earing nearly every other page of this book, I finally gave up. See also: Life in Smoke and Mirrors]

The new book is a mystery of memory and missed opportunity. Tony Webster, a cautious, divorced man in his 60s who “had wanted life not to bother me too much, and had succeeded,” receives an unexpected bequest from a woman he’d met only once, 40 years earlier. The mother of his college girlfriend, Veronica, has bequeathed him £500 — a legacy that unsettles Tony, pushing him to get in touch with Veronica (their relationship had ended badly) and seek answers to certain unresolved questions.

Had he loved Veronica? (At the time, it was an emotion he had lacked the spine to own up to.) What had happened to the energetic boy he used to be, “book-hungry, sex-hungry, meritocratic, anarchistic,” who thought of himself as “being kept in some kind of holding pen, waiting to be released” into an engaged adult life of “passion and danger, ecstasy and despair”? And what ever became of the friend he and Veronica both knew back then, a brainy, idealistic boy named Adrian Finn? Gradually, Tony assembles his willfully forgotten past impressions and actions, joining together the links that connect him to these people, as if trying to form a “chain of individual responsibilities” that might explain how it happened that his life’s modest wages had resulted in “the accumulation, the multiplication, of loss.” (...)

Adrian’s indifference to playing it cool somehow made him the leader of the boys’ clique when they were teenagers; he became the one they looked up to. Yet Tony never emulated Adrian, and was guilty of the pose Adrian deplored: pretending not to care. He pays for this failure again and again, from his 20s to his 60s. “Does character develop over time?” Tony asks himself, wondering at the “larger holding pen” that has come to contain his adult life. Maybe character freezes sometime between the ages of 20 and 30, he speculates. “And after that, we’re just stuck with what we’ve got. We’re on our own. If so, that would explain a lot of lives, wouldn’t it? And also — if this isn’t too grand a word — our tragedy.” (...)

But who does Tony enfold into his “we”? His agonized analysis is entirely self-­referential, as solitary and armored as the man himself. Decades earlier, Tony had accused Veronica of an “inability to imagine anyone else’s feelings or emotional life,” but it was he, not she, who was incapable of looking outside his own head. Barnes’s unreliable narrator is a mystery to himself, which makes the novel one unbroken, sizzling, satisfying fuse. Its puzzle of past causes is decoded by a man who is himself a puzzle. Tony resembles the people he fears, “whose main concern is to avoid further damage to themselves, at whatever cost,” and who wound others with a hypersensitivity that is insensitive to anything but their own needs. “I have an instinct for survival, for self-­preservation,” he reflects. “Perhaps this is what Veronica called cowardice and I called being peaceable.”

by Liesl Schillinger, NY Times |  Read more:
Image: via:

Monday, December 23, 2013


Donna Watson, Edge of Light
via:

Frieda Meaney
via:

Surviving Anxiety


I’ve finally settled on a pre-talk regimen that enables me to avoid the weeks of anticipatory misery that the approach of a public-speaking engagement would otherwise produce.

Let’s say you’re sitting in an audience and I’m at the lectern. Here’s what I’ve likely done to prepare. Four hours or so ago, I took my first half milligram of Xanax. (I’ve learned that if I wait too long to take it, my fight-or-flight response kicks so far into overdrive that medication is not enough to yank it back.) Then, about an hour ago, I took my second half milligram of Xanax and perhaps 20 milligrams of Inderal. (I need the whole milligram of Xanax plus the Inderal, which is a blood-pressure medication, or beta-blocker, that dampens the response of the sympathetic nervous system, to keep my physiological responses to the anxious stimulus of standing in front of you—the sweating, trembling, nausea, burping, stomach cramps, and constriction in my throat and chest—from overwhelming me.) I likely washed those pills down with a shot of scotch or, more likely, vodka, the odor of which is less detectable on my breath. Even two Xanax and an Inderal are not enough to calm my racing thoughts and to keep my chest and throat from constricting to the point where I cannot speak; I need the alcohol to slow things down and to subdue the residual physiological eruptions that the drugs are inadequate to contain. In fact, I probably drank my second shot—yes, even though I might be speaking to you at, say, 9 in the morning—between 15 and 30 minutes ago, assuming the pre-talk proceedings allowed me a moment to sneak away for a quaff.

If the usual pattern has held, as I stand up here talking to you now, I’ve got some Xanax in one pocket (in case I felt the need to pop another one before being introduced) and a minibar-size bottle or two of vodka in the other. I have been known to take a discreet last-second swig while walking onstage—because even as I’m still experiencing the anxiety that makes me want to drink more, my inhibition has been lowered, and my judgment impaired, by the liquor and benzodiazepines I’ve already consumed. If I’ve managed to hit the sweet spot—that perfect combination of timing and dosage whereby the cognitive and psychomotor sedating effect of the drugs and alcohol balances out the physiological hyperarousal of the anxiety—then I’m probably doing okay up here: nervous but not miserable; a little fuzzy but still able to speak clearly; the anxiogenic effects of the situation (me, speaking in front of people) counteracted by the anxiolytic effects of what I’ve consumed. But if I’ve overshot on the medication—too much Xanax or liquor—I may seem to be loopy or slurring or otherwise impaired. And if I didn’t self-medicate enough? Well, then, either I’m sweating profusely, with my voice quavering weakly and my attention folding in upon itself, or, more likely, I ran offstage before I got this far. I mean that literally: I’ve frozen, mortifyingly, onstage at public lectures and presentations before, and on several occasions I have been compelled to bolt from the stage.

Yes, I know. My method of dealing with my public-speaking anxiety is not healthy. It’s dangerous. But it works. Only when I am sedated to near-stupefaction by a combination of benzodiazepines and alcohol do I feel (relatively) confident in my ability to speak in public effectively and without torment. As long as I know that I’ll have access to my Xanax and liquor, I’ll suffer only moderate anxiety for days before a speech, rather than sleepless dread for months. (...)

My assortment of neuroses may be idiosyncratic, but my general condition is hardly unique. Anxiety and its associated disorders represent the most common form of officially classified mental illness in the United States today, more common even than depression and other mood disorders. According to the National Institute of Mental Health, some 40 million American adults, about one in six, are suffering from some kind of anxiety disorder at any given time; based on the most recent data from the Department of Health and Human Services, their treatment accounts for more than a quarter of all spending on mental-health care. Recent epidemiological data suggest that one in four of us can expect to be stricken by debilitating anxiety at some point in our lifetime. And it is debilitating: studies have compared the psychic and physical impairment tied to living with an anxiety disorder with the impairment tied to living with diabetes—both conditions are usually manageable, sometimes fatal, and always a pain to deal with. In 2012, Americans filled nearly 50 million prescriptions for just one antianxiety drug: alprazolam, the generic name for Xanax. (...)

Stigma still attaches to mental illness. Anxiety is seen as weakness. In presenting my anxiety to the world by writing publicly about it, I’ve been told, I will be, in effect, “coming out.” The implication is that this will be liberating. We’ll see about that. But my hope is that readers who share this affliction, to whatever extent, will find some value in this account—not a cure for their anxiety, but perhaps some sense of the redemptive value of an often wretched condition, as well as evidence that they can cope and even thrive in spite of it. Most of all, I hope they—and by “they” I mean “many of you”—will find some solace in learning that they are not alone.

by Scott Stossel, The Atlantic |  Read more:
Image: Jamie Chung

The Weight of the Past

This column is about Edward Snowden. And the National Football League. And, I suspect, most of the rest of us.

Although it’s about Snowden, it’s not about the National Security Agency, surveillance or privacy. And although it’s about the N.F.L., it’s not about concussions. Instead, it’s about the unbalanced trajectory of human life.

Snowden’s actions, regardless of whether one supports them, have had a prodigious impact on the debate about privacy in the United States and will likely continue to do so. They have had roughly the impact that Snowden wanted them to have. That is, they have altered how many of us think about our relation to the government and to our own technology, and because of this, they infuse this period of his life with a luminescence that will always be with him. He will not forget it, nor will others.

There is an assumption I would like to make here, one that I can’t verify but I think is uncontroversial. It is very unlikely that Edward Snowden will ever do anything nearly as significant again. Nothing he does for the remainder of his life will have the resonance that his recent actions have had. The powers that be will ensure it. And undoubtedly he knows this. His life will go on, and it may not be as tortured as some people think. But in an important sense his life will have peaked at age 29 or 30.

This is not to say that Snowden’s days will not have their pleasures or their meaningfulness. Rather, those pleasures and that meaningfulness will likely always have lurking in the background the momentous period in the spring of 2013.

Players in the N.F.L. have an average career of six years. For many of them, those years — typically along with their college years — are the most exciting of their lives. They represent the cities they play for, enjoy the adulation of fans and receive higher salaries than they are ever likely to again. Many develop deep bonds with their teammates. They get to experience in reality what many male children dream about. And then it is over. If concussions don’t take their toll, they can expect to live another 45 or 50 years while not playing football.

For many people — not just activists like Snowden or professional athletes — life crests early. But it doesn’t end there. It goes on, burdened by a summit that can never be reached again, which one can gaze upon only by turning back. This is not to say that good and worthwhile things will not happen to them, and for a fortunate few there will be other, higher summits. For many, however, those earlier moments will be a quiet haunting, a reminder of what has been and cannot be again.

We might think of these kinds of lives, lives whose trajectories have early peaks and then fall off, as exceptional. In a way they are. But in another way they are not. There is something precisely in the extremity of these lives that brings out a phenomenon that appears more subtly for the rest of us. It appears in different times and different places and under different guises, but it is there for us nevertheless. (...)

Many will balk, reasonably so, at the characterization I have just given. After all, who is to say that a life has crested? How do we know, except perhaps in unique cases like Snowden or many N.F.L. players, that there aren’t higher peaks in our future? Who is able to confidently say they have already lived the best year of their life?

That consideration, I think, only adds to the difficulty. We don’t know. We cannot know whether the future will bring new experience that will light the fire again or will instead be a slowly dying ember. And the puzzle then becomes, how to respond to this ignorance? Do we seek to introduce more peaks, watching in distress if they do not arise? And how would we introduce those peaks? After all, the arc of our lives is determined not simply by us but also by our circumstances. Alternatively, do we go on about our days hoping for the best? Or do we instead, as many people do, lead what Thoreau called lives of quiet desperation?

This is not to say that nostalgia is our inescapable fate. The lesson I am trying to draw from reflecting on the examples of Snowden and the N.F.L. is not that the thrill ends early. Rather, in their extremity these examples bring out something else. For most of us, as our lives unfold we simply do not, we cannot, know whether we have peaked in an area of our lives — or in our lives themselves — in ways that are most important to us. The past weighs upon us, not because it must cancel the future, but because it is of uncertain heft.

by Todd May, NY Times |  Read more:

Sunday, December 22, 2013


Itō Jakuchū (Japanese, 1716 - 1800)
via:

Paul Cézanne, Seven Bathers ( c 1900).
via:

Voice Hero: The Inventor of Karaoke Speaks

It’s one a.m. The bar is closing but the night isn’t over yet. While milling about on the sidewalk, a friend suggests, ‘Karaoke?’ And suddenly the night gets a lot brighter—and a little more embarrassing.

It’s safe to say that at no point in human history have there been as many people singing the songs of themselves, uncaring that their song was first sung by Gloria Gaynor, Frank Sinatra, or Bruce Springsteen. Karaoke has become inescapable, taking over bars from Manila to Manchester. Passions run high. In the Philippines, anger over off-key renditions of ‘My Way’ has left at least six dead. That statistic hides, however, the countless renditions of the Sinatra anthem that leave people smiling—or at least just wincing. The sing-along music machine terrifies the truly introverted, but it is a hero to countless closet extroverts, letting them reveal their private musical joy. Literally, karaoke is the combination of two Japanese words, ‘empty’ and ‘orchestra’—but we might also lovingly translate it as ‘awkward delight.’

Yet for all karaoke’s fame, the name of its Dr. Frankenstein is less known, perhaps because he never took a patent out on the device and only copyrighted its name in the U.S. in 2009. His name is Daisuke Inoue, a Japanese businessman and inventor born in Osaka in 1940. In 2004 he was honored with an Ig Nobel Prize, given for unusual inventions or research.

In 2005, he shared the story of his life leading up to the Ig Nobel in an interview with Robert Scott Field for Topic Magazine. No longer in print, Topic was one of The Appendix’s inspirations (along with StoryCorps) for its celebration of the everyday and undersung heroes of our world. As a history of another sort of invention, Mr. Inoue’s interview was particularly memorable and deserves to be more widely available. With the permission of both Topic and Mr. Inoue, we are pleased to re-present his delightfully inspiring account of his life and work.

We hope you sing along.

***

Last year I received a fax from Harvard University. I don’t really speak English, but lucky for me, my wife does. She figured out the letter was about the Ig Nobel Prizes, awards that Harvard presents for inventions that make people laugh—and then make them think. I was nominated for an Ig Nobel Peace Prize as the inventor of karaoke, which teaches people to bear the awful singing of ordinary citizens, and enjoy it anyway. That is “genuine peace,” they told me.

Before I tell you about my hilarious adventures at the prize ceremony, though, you need to know how I came to invent the first karaoke machine. I was born in May 1940, in a small town called Juso, in Osaka, Japan. My father owned a small pool hall. When I was three and a half years old, I fell from the second floor and hit my head. I was unconscious for two weeks. The doctors told my parents that if I lived, I would probably have brain damage. A Buddhist priest visited me, blessed me and replaced my birth name, Yusuke, with a new name: Daisuke, which means, in the written characters of kanji, “Big Help.” I needed it. Later I learned that the same Buddhist priest had commented that the name would also lead me to help others.

by Daisuke Inoue and Robert Scott Field, The Appendix | Read more:
Image: courtesy Daisuke Inoue