Saturday, February 28, 2015

You Are Listening To...

...a soothing mix of police radio chatter and ambient music. Choose from Los Angeles, New York, San Francisco, Chicago, or my personal recommendation, Montréal. French police chat really blends into the music nicely. You may need to adjust the balance of each stream a bit to find the right mix. 
[ed. Repost]

photo: markk
[ed. Repost]

Erik Satie

[ed. Repost]

Los Amigos Invisibles

Have some fun.
[ed. Repost]

Our Date with Miranda

[ed. One of my favorite movies: Me and You and Everyone We Know.]

I first met Miranda July years ago at a faraway literary conference in Portland, Oregon. Along with Rick Moody and others, we were on a panel that was supposed to converse authoritatively about narrative structure. When it came time for July to speak, she stood up and started singing. She was large-eyed and lithe. I don’t remember what song it was—something she had written herself, I believe. I was startled. Who was this woman? (Her performances and short films had not appeared widely enough to have caught my notice.) I was then mortified, not for her, since she seemed completely at ease and the audience was enthralled, but mortified for narrative structure, which had clearly been given the bum’s rush. (Well, fiction writers will do anything to avoid this topic: it is the one about which they are the most clueless and worried and improvisational.)

Sitting next to Ms. July was the brilliant Denis Johnson, who, inspired by his neighbor, when it was his turn (figuring out one’s turn can be the most difficult part of a panel) also began to sing. Also something he had written himself. I may have laughed, thinking it was all supposed to be funny, realizing too late my mistake. There was a tragic aspect to one verse in the Johnson song. I believe he did not sit down because he had not stood to begin with.

Then it was clearly, or unclearly, my turn. If not the wallflower at the orgy then I was the mute at the a cappella operetta (a condition typical of many a July character though not of July herself): I refused to sing. I don’t remember what I said—I believe I read from some notes, silently vowing never to be on another panel. (The next panel I was on, in Boston, I thought went well by comparison. That is, no one burst into random song. But when I said as much to the person sitting next to me, the editor of a prominent literary journal, he said, “Really? This was the worst panel I’ve ever participated in.”) So my introduction to July was one at which I watched her redefine boundaries and hijack something destined to be inert and turn it into something uncomfortably alive, whether you wanted her to or not. This has been my experience of her work ever since.

July’s first feature-length film, the now-famous independent Me and You and Everyone We Know, also upends expectations. July writes, directs, and stars in all her films. In many ways, while remaining a love story, the film is about the boundary-busting that is ruleless sexuality—stalking and sexual transgression—though here the predators and perpetrators are gentle and female. A boy is coercively fellated by two slightly unpleasant teenage girls devising a competition. Low-level sexual harassment is everywhere and July sometimes plays it for laughs. Two kids in a chat room lead someone on a wild goose chase, writing scatological comments in the language of very young children, and despite all this it is hilarious. A shoe salesman named Richard who has set his hand on fire in front of his sons is hounded by a woman named Christine (played by July herself) who does not know him but who is erotically obsessed with him. She has psychically and perhaps correctly marked him as her mate (the telepathic heart is at the center of much of July’s work).

Another character, a middle-aged woman seeking a partner online, finds herself hooked up with a five-year-old boy in the park. Images of flame and precariousness recur—the burning sun, the burning hand, a bright goldfish riding in a plastic bag on a car roof. And yet all is put forward with tenderness and humor. The desire for human love goes unquestioned and its role in individual fate is assumed to be essential. July’s Christine, a struggling artist who works as a driver for ElderCab, possesses a thin-skinned empathy for everyone, and her love for the shoe salesman (who is played in convincingly addled fashion by John Hawkes) is performed with both vulnerability and purity of passion.

In her two feature-length films the chemistry with her male leads is quite strong: they as well as July are like openly soulful children, attaching without reason or guile, and July is quite focused on this quality of connective vulnerability, as well as on children themselves. Her work also engages with the criterion offered up by the character of a museum curator looking at Christine’s own works: “Could this have been made in any era or only now?” With July it is a little of both. She focuses on people living “courageously with grace,” while also quietly arguing with a culture that asks us to do that.

by Lorrie Moore, NY Review of Books |  Read more:
Image: Nick Wall/Eyevine/Redux

A Glorious Distraction

During the two weeks before the Super Bowl there were more than 10,000 news articles written about the slight deviation in air pressure of the footballs used by the New England Patriots in their American Football Conference Championship victory over the Indianapolis Colts. The Patriots quarterback Tom Brady, in an attempt to defuse conspiracy allegations, joked in a press conference, “Things are fine—this isn’t ISIS.”

He was right: it wasn’t ISIS. During those two weeks, the Islamic State of Iraq and Syria was the subject of only seventy-nine articles in The New York Times. “Deflate-gate” was the subject of eighty. These included interviews with football players, who explained why a deflated ball was easier to throw and catch; physicists, who suggested that the deflation might have occurred due to climate effects; logisticians, who opined on the time necessary to deflate a football; and a seamstress of Wilson footballs who vowed, “It’s not Wilson’s fault.” Even the leader of the free world felt obliged to make a statement. “Here’s what I know,” said President Obama on Super Bowl Sunday. “The Patriots were going to beat the Colts regardless of what the footballs looked like.”

In that period Andy Studebaker’s name appeared in only nine articles, all published in sports blogs. Studebaker is the twenty-nine-year-old backup linebacker for the Colts who, while defending a punt return, was blindsided with a gruesome hit to the chest by the Patriots’ backup running back Brandon Bolden. Studebaker’s head jerked back and he landed on his neck. On the sideline after the play Studebaker was seen coughing up blood.

Nor was much made of the fine levied on professional monster Clay Matthews of the Green Bay Packers for illegally smashing into the defenseless head of Seattle Seahawks quarterback Russell Wilson in the National Football Conference Championship game. Matthews’s fine was $22,050, or approximately what he earns every ninety seconds of game play. There was also little attention given to the fact that, in the second half of that game, Seattle cornerback Richard Sherman injured his left arm so badly that he couldn’t straighten it; he played the final quarter with it bent and pressed tightly to his chest like a chicken wing.

Was it broken? Badly sprained? Was he given shockingly powerful illegal or legal drugs in order to endure the pain? The league, and Seattle, were mum on these points. When asked ten days later about the injury, Sherman said, “It’s a little sore, but not too bad.” Then, with a wink: “That’s my story and I’m sticking to it.” Minutes after the Super Bowl ended it was revealed that Sherman had torn ligaments in his elbow and will have to undergo reconstructive surgery. (...)

NFL Commissioner Roger Goodell might have been grateful for the deflation controversy because it distracted from what otherwise would have been the season’s two dominant storylines: the league’s reluctance to discipline players who commit domestic violence and its failure to protect its players from brain damage. But Goodell didn’t need the help. Every thinking fan must, in order to enjoy any NFL game, consent to participate in a formidable suspension of disbelief. We must put aside our knowledge that nearly every current NFL player can expect to suffer from chronic traumatic encephalopathy, a degenerative disease that leads to memory loss, impaired judgment, depression, and dementia.

Football players are also four times more likely both to die from ALS (a fact that Goodell, despite participating in this past year’s ALS ice-bucket challenge, refuses to acknowledge) and to develop Alzheimer’s disease. An NFL player can expect to live twenty years less than the average American male. The average NFL career lasts 3.3 years. By that measure, each season costs an NFL player about six years of his life. Football fans, in other words, must ignore the fact that we are watching men kill themselves.

by Nathaniel Rich, New York Review of Books |  Read more:
Image: Jim Davis/Boston Globe/Getty Images

Hans Erni, Le Dessinateur or Kybernetes, Lithograph in 5 colours, 1956

Plastilina Mosh/El Guincho/Odisea

[ed. Repost]

The Dress That Melted The Internet

The mother of the bride wore white and gold. Or was it blue and black?

From a photograph of the dress the bride posted online, there was broad disagreement. A few days after the wedding last weekend on the Scottish island of Colonsay, a member of the wedding band was so frustrated by the lack of consensus that she posted a picture of the dress on Tumblr, and asked her followers for feedback.

“I was just looking for an answer because it was messing with my head,” said Caitlin McNeill, a 21-year-old singer and guitarist.

Within a half-hour, her post attracted some 500 likes and shares. The photo soon migrated to Buzzfeed and Facebook and Twitter, setting off a social media conflagration that few were able to resist.

As the debate caught fire across the Internet — even scientists could not agree on what was causing the discrepancy — media companies rushed to get articles online. Less than a half-hour after Ms. McNeill’s original Tumblr post, Buzzfeed posted a poll: “What Colors Are This Dress?” As of Friday afternoon, it had been viewed more than 28 million times. (White and gold was winning handily.)

At its peak, more than 670,000 people were simultaneously viewing Buzzfeed’s post. Between that and the rest of Buzzfeed’s blanket coverage of the dress Thursday night, the site easily smashed its previous records for traffic. So did Tumblr.

Everyone, it seems, had an opinion. And everyone was convinced that he, or she, was right. (...)

In an era when just about everyone seems to be doing anything they can to ignite interest online, the great dress debate went viral the old-fashioned way. It just happened. (...)

At its center was a simple yet bedeviling mystery with an almost old-fashioned, trompe l’oeil quality: How could different people see the same article of clothing so differently? The simplicity of the debate, the fact that it was about something as universal as the color of a dress, made it all the more irresistible.

“This definitely felt like a special thing,” said Buzzfeed’s editor in chief, Ben Smith. “It sort of erased the line between web culture and real culture.”

by Jonathan Mahler, NY Times |  Read more:
Images: New Yorker and Wired

Friday, February 27, 2015

William S. Burroughs, The Art of Fiction No. 36

Firecrackers and whistles sounded the advent of the New Year of 1965 in St. Louis. Stripteasers ran from the bars in Gaslight Square to dance in the street when midnight came. Burroughs, who had watched television alone that night, was asleep in his room at the Chase Park Plaza Hotel, St. Louis's most elegant.

At noon the next day he was ready for the interview. He wore a gray lightweight Brooks Brothers suit with a vest, a blue-striped shirt from Gibraltar cut in the English style, and a deep-blue tie with small white polka dots. His manner was not so much pedagogic as didactic or forensic. He might have been a senior partner in a private bank, charting the course of huge but anonymous fortunes. A friend of the interviewer, spotting Burroughs across the lobby, thought he was a British diplomat. At the age of fifty, he is trim; he performs a complex abdominal exercise daily and walks a good deal. His face carries no excess flesh. His expression is taut, and his features are intense and chiseled. He did not smile during the interview and laughed only once, but he gives the impression of being capable of much dry laughter under other circumstances. His voice is sonorous, its tone reasonable and patient; his accent is mid-Atlantic, the kind of regionless inflection Americans acquire after many years abroad. He speaks elliptically, in short, clear bursts.

On the dresser of his room sat a European transistor radio; several science fiction paperbacks; Romance, by Joseph Conrad and Ford Madox Ford; The Day Lincoln Was Shot, by Jim Bishop; and Ghosts in American Houses, by James Reynolds. A Zeiss Ikon camera in a scuffed leather case lay on one of the twin beds beside a copy of Field & Stream. On the other bed were a pair of long shears, clippings from newspaper society pages, photographs, and a scrapbook. A Facit portable typewriter sat on the desk, and gradually one became aware that the room, although neat, contained a great deal of paper.

Burroughs smoked incessantly, alternating between a box of English Ovals and a box of Benson & Hedges. As the interview progressed, the room filled with smoke. He opened the window. The temperature outside was seventy degrees, the warmest New Year's Day in St. Louis's history; a yellow jacket flew in and settled on the pane. The bright afternoon deepened. The faint cries of children rose up from the broad brick alleys in which Burroughs had played as a boy. (...)


When and why did you start to write?


I started to write in about 1950; I was thirty-five at the time; there didn't seem to be any strong motivation. I simply was endeavoring to put down in a more or less straightforward journalistic style something about my experiences with addiction and addicts.


Why did you feel compelled to record these experiences?


I didn't feel compelled. I had nothing else to do. Writing gave me something to do every day. I don't feel the results were at all spectacular. Junky is not much of a book, actually. I knew very little about writing at that time.


Where was this?


In Mexico City. I was living near Sears, Roebuck, right around the corner from the University of Mexico. I had been in the army four or five months and I was there on the GI Bill, studying native dialects. I went to Mexico partly because things were becoming so difficult with the drug situation in America. Getting drugs in Mexico was quite easy, so I didn't have to rush around, and there wasn't any pressure from the law.


Why did you start taking drugs?


Well, I was just bored. I didn't seem to have much interest in becoming a successful advertising executive or whatever, or living the kind of life Harvard designs for you. After I became addicted in New York in 1944, things began to happen. I got in some trouble with the law, got married, moved to New Orleans, and then went to Mexico.


There seems to be a great deal of middle-class voyeurism in this country concerning addiction, and in the literary world, downright reverence for the addict. You apparently don't share these points of view.


No, most of it is nonsense. I think drugs are interesting principally as chemical means of altering metabolism and thereby altering what we call reality, which I would define as a more or less constant scanning pattern. (...)


The visions of drugs and the visions of art don't mix?


Never. The hallucinogens produce visionary states, sort of, but morphine and its derivatives decrease awareness of inner processes, thoughts, and feelings. They are painkillers, pure and simple. They are absolutely contraindicated for creative work, and I include in the lot alcohol, morphine, barbiturates, tranquilizers—the whole spectrum of sedative drugs. As for visions and heroin, I had a hallucinatory period at the very beginning of addiction, for instance, a sense of moving at high speed through space. But as soon as addiction was established, I had no visions—vision—at all and very few dreams. (...)


You regard addiction as an illness but also a central human fact, a drama?


Both, absolutely. It's as simple as the way in which anyone happens to become an alcoholic. They start drinking, that's all. They like it, and they drink, and then they become alcoholic. I was exposed to heroin in New York—that is, I was going around with people who were using it; I took it; the effects were pleasant. I went on using it and became addicted. Remember that if it can be readily obtained, you will have any number of addicts. The idea that addiction is somehow a psychological illness is, I think, totally ridiculous. It's as psychological as malaria. It's a matter of exposure. People, generally speaking, will take any intoxicant or any drug that gives them a pleasant effect if it is available to them. In Iran, for instance, opium was sold in shops until quite recently, and they had three million addicts in a population of twenty million. There are also all forms of spiritual addiction. Anything that can be done chemically can be done in other ways, that is, if we have sufficient knowledge of the processes involved. Many policemen and narcotics agents are precisely addicted to power, to exercising a certain nasty kind of power over people who are helpless. The nasty sort of power: white junk, I call it—rightness; they're right, right, right—and if they lost that power, they would suffer excruciating withdrawal symptoms. The picture we get of the whole Russian bureaucracy, people who are exclusively preoccupied with power and advantage, this must be an addiction. Suppose they lose it? Well, it's been their whole life.

by Conrad Knickerbocker, Paris Review | Read more:
Image: via:

What Long-Distance Trains Teach Us About Public Space in America

"What people don’t like about the train is the time lapse. People don’t have time to tie their own shoes these days.” Trent, a fellow passenger on Amtrak’s Sunset Limited from New Orleans to Los Angeles, was philosophizing about the train. Trent is a middle-aged African-American man from California with whom I struck up a conversation in the observation car, which, for those of you who aren’t versed in the lingo of the rails, is the living room of a long-distance train, heavily windowed and designed for friendly interaction. Rolling through the desert, Trent and I talked for more than two hours about family, spirituality, and all the other things that come up when you have opted to ride across the country with strangers and without WiFi.

“People [usually] just want to get from point A to point B as quickly as possible. We don’t give ourselves the chance to be in the moment,” Trent says. “Whether it’s good or bad, you grow. It’s about experiences.” He was describing something special about the long-distance train: It is a place to slow down and experience the present. (...)

Michel Foucault coined the term “heterotopia” to describe what he called “placeless places,” autonomous zones where societal rules are reinterpreted — like the train. (...)

The physical qualities that help to facilitate this sense of connection are human-scale design, a clean and safe environment, and an aesthetic that is straightforward and not overly fanciful. The dimensions of the car make it (generally speaking) cozy and comfortable, but spacious enough that you aren’t on top of the person seated next to you. (When people are too physically close they tend to retreat emotionally and mentally, as anyone who has ever ridden the 1 train during rush hour in Manhattan can attest.)

That long-distance trains aren’t designed with one specific aesthetic, demographic or psychographic in mind means that the ride is more about what’s unfolding within the space rather than the materiality of the car. It also frames the passing landscape in a way that makes it easy to use as a conversation starter. This follows the concept of “triangulation,” which William H. Whyte, a famous public space researcher and advocate, coined to describe a third element that gives people something easy to talk about.

Another important element encouraging interaction is what I will call the “together alone” factor. Riders are in the same space — and apart from everything and everyone else — for an extended period of time. Being in a shared physical space that’s also outside of one’s normal environment for an extended duration facilitates a special sense of focus and an enhanced sense of accountability, which can lead to conversations we wouldn’t normally have with strangers. (Online, the standard for conversation is a different ball game but we aren’t talking about that here.) Democratic theorists since the Ancient Greeks have celebrated public discourse. But where in contemporary offline America does this occur? Housing policies have segregated us by race, class and political leanings. It is increasingly difficult to have open, face-to-face conversations about important topics. Yet on train rides I observed conversations between total strangers about race, religion, sexuality and other taboo topics. Unlike the flame wars of the Internet, these conversations were civil.

by Danya Sherman, Next City |  Read more:
Image: Nikki Yanofsky, YouTube

Why 40-Year-Old Tech Is Still Running America’s Air Traffic Control

On Friday, September 26, 2014, a telecommunications contractor named Brian Howard woke early and headed to Chicago Center, an air traffic control hub in Aurora, Illinois, where he had worked for eight years. He had decided to get stoned and kill himself, and as his final gesture he planned to take a chunk of the US air traffic control system with him.

Court records say Howard entered Chicago Center at 5:06 am and went to the basement, where he set a fire in the electronics bay, sliced cables beneath the floor, and cut his own throat. Paramedics saved Howard's life, but Chicago Center, which controls air traffic above 10,000 feet for 91,000 square miles of the Midwest, went dark. Airlines canceled 6,600 flights; air traffic was interrupted for 17 days. Howard had wanted to cause trouble, but he hadn't anticipated a disruption of this magnitude. He had posted a message to Facebook saying that the sabotage “should not take a large toll on the air space as all comms should be switched to the alt location.” It's not clear what alt location Howard was talking about, because there wasn't one. Howard had worked at the center for nearly a decade, and even he didn't know that.

At any given time, around 7,000 aircraft are flying over the United States. For the past 40 years, the same computer system has controlled all that high-altitude traffic—a relic of the 1970s known as Host. The core system predates the advent of the Global Positioning System, so Host uses point-to-point, ground-based radar. Every day, thousands of travelers switch their GPS-enabled smartphones to airplane mode while their flights are guided by technology that predates the Speak & Spell. If you're reading this at 30,000 feet, relax—Host is still safe, in terms of getting planes from point A to point B. But it's unbelievably inefficient. It can handle a limited amount of traffic, and controllers can't see anything outside of their own airspace—when they hand off a plane to a contiguous airspace, it vanishes from their radar.

The FAA knows all that. For 11 years the agency has been limping toward a collection of upgrades called NextGen. At its core is a new computer system that will replace Host and allow any controller, anywhere, to see any plane in US airspace. In theory, this would enable one air traffic control center to take over for another with the flip of a switch, as Howard seemed to believe was already possible. NextGen isn't vaporware; that core system was live in Chicago and the four adjacent centers when Howard attacked, and this spring it'll go online in all 20 US centers. But implementation has been a mess, with a cascade of delays, revisions, and unforeseen problems. Air traffic control can't do anything as sophisticated as Howard thought, and unless something changes about the way the FAA is managing NextGen, it probably never will.

This technology is complicated and novel, but that isn't the problem. The problem is that NextGen is a project of the FAA. The agency is primarily a regulatory body, responsible for keeping the national airspace safe, and yet it is also in charge of operating air traffic control, an inherent conflict that causes big issues when it comes to upgrades. Modernization, a struggle for any federal agency, is practically antithetical to the FAA's operational culture, which is risk-averse, methodical, and bureaucratic. Paired with this is the lack of anything approximating market pressure. The FAA is the sole consumer of the product; it's a closed loop.

The first phase of NextGen is to replace Host with the new computer system, the foundation for all future upgrades. The FAA will finish the job this spring, five years late and at least $500 million over budget. Lockheed Martin began developing the software for it in 2002, and the FAA projected that the transition from Host would be complete by late 2010. By 2007, the upgraded system was sailing through internal tests. But once installed, it was frighteningly buggy. It would link planes to flight data for the wrong aircraft, and sometimes planes disappeared from controllers' screens altogether. As timelines slipped and the project budget ballooned, Lockheed churned out new software builds, but unanticipated issues continued to pop up. As recently as April 2014, the system crashed at Los Angeles Center when a military U-2 jet entered its airspace—the spy plane cruises at 60,000 feet, twice the altitude of commercial airliners, and its flight plan caused a software glitch that overloaded the system.

Even when the software works, air traffic control infrastructure is not prepared to use it. Chicago Center and its four adjacent centers all had NextGen upgrades at the time of the fire, so nearby controllers could reconfigure their workstations to see Chicago airspace. But since those controllers weren't FAA-certified to work that airspace, they couldn't do anything. Chicago Center employees had to drive over to direct the planes. And when they arrived, there weren't enough workstations for them to use, so the Chicago controllers could pick up only a portion of the traffic. Meanwhile, the telecommunications systems were still a 1970s-era hardwired setup, so the FAA had to install new phone lines to transfer Chicago Center's workload. The agency doesn't anticipate switching to a digital system (based on the same voice over IP that became mainstream more than a decade ago) until 2018. Even in the best possible scenario, air traffic control will not be able to track every airplane with GPS before 2020. For the foreseeable future, if you purchase Wi-Fi in coach, you're pretty much better off than the pilot.

by Sara Breselor, Wired |  Read more:
Image: Valero Doval

'Pics or It Didn’t Happen'

Our social networks have a banality problem. The cultural premium now placed on recording and broadcasting one’s life and accomplishments means that Facebook timelines are suffused with postings about meals, workouts, the weather, recent purchases, funny advertisements, the milestones of people three degrees removed from you. On Instagram, one encounters a parade of the same carefully distressed portraits, well-plated dishes and sunsets gilded with smog. Nuance, difference, and complexity evaporate as one scrolls through these endless feeds, vaguely hoping to find something new or important but mostly resigned to variations on familiar themes.

In a digital landscape built on attention and visibility, what matters is not so much the content of your updates but their existing at all. They must be there. Social broadcasts are not communications; they are records of existence and accumulating metadata. Rob Horning, an editor at the New Inquiry, once put it in tautological terms: “The point of being on social media is to produce and amass evidence of being on social media.” This is further complicated by the fact that the feed is always refreshing. Someone is always updating more often or rising to the top by virtue of retweets, reshares, or some opaque algorithmic calculation. In the ever-cresting tsunami of data, you are always out to sea, looking at the waves washing ashore. As the artist Fatima Al Qadiri has said: “There’s no such thing as the most recent update. It immediately becomes obsolete.”

Why, then, do we do it? If it’s so easy to become cynical about social media, to see amid the occasionally illuminating exchanges or the harvesting of interesting links (which themselves come in bunches, in great indigestible numbers of browser tabs) that we are part of an unconquerable system, why go on? One answer is that it is a byproduct of the network effect: the more people who are part of a network, the more one’s experience can seem impoverished by being left out. Everyone else is doing it. A billion people on Facebook, hundreds of millions scattered between these other networks – who wants to be on the outside? Who wants to miss a birthday, a friend’s big news, a chance to sign up for Spotify, or the latest bit of juicy social intelligence? And once you’ve joined, the updates begin to flow, the small endorphin boosts of likes and re-pins becoming the meagre rewards for all that work. The feeling of disappointment embedded in each gesture, the sense of “Is this it?”, only advances the process, compelling us to continue sharing and participating.

The achievement of social-media evangelists is to make this urge – the urge to share simply so that others might know you are there, that you are doing this thing, that you are with this person – second nature. This is society’s great phenomenological shift, which, over the last decade, has occurred almost without notice. Now anyone who opts out, or who feels uncomfortable about their participation, begins to feel retrograde, Luddite, uncool. Interiority begins to feel like a prison. The very process of thinking takes on a kind of trajectory: how can this idea be projected outward, towards others? If I have a witty or profound thought and I don’t tweet or Facebook it, have I somehow failed? Is that bon mot now diminished, not quite as good or meaningful as it would be if laid bare for the public? And if people don’t respond – retweet, like, favourite – have I boomeranged back again, committing the greater failure of sharing something not worth sharing in the first place? After all, to be uninteresting is a cardinal sin in the social-media age. To say “He’s bad at Twitter” is like saying that someone fails to entertain; he won’t be invited back for dinner.

In this environment, interiority, privacy, reserve, introspection – all those inward-looking, quieter elements of consciousness – begin to seem insincere. Sharing is sincerity. Removing the mediating elements of thought becomes a mark of authenticity, because it allows you to be more uninhibited in your sharing. Don’t think, just post it. “Pics or it didn’t happen” – that is the populist mantra of the social networking age. Show us what you did, so that we may believe and validate it.

by Jacob Silverman, The Guardian | Read more:
Image: Peter Macdiarmid/PA

What It Means to be Made in Italy

My Italian has gotten good enough that I can understand pretty much everything the locals say to me. The only words I consistently miss are the English words that they insert into conversation like french fries stuck in a spaghetti carbonara. WTF is “Nike” when it rhymes with “hike”? “Levi’s” when it rhymes with “heavies”? “Ee Red Hot Keelee Pepper?” But one English phrase comes up so often in conversation, at least within the rag trade, that I can pick it up on the first take: “Made In Italy.”

Cosa Vuol Dire “Made In Italy”? (What Does “Made In Italy” Mean?)

To understand the meaning of “Made In Italy,” you have to go back to the genesis of the Italian nation, in the second half of the 19th century. Before that, Italy was a geographic concept, but not a political or cultural one. There was no real sense of an “Italian people” in the same way as there was already for the Germans, who formed a nation around the same time. Italy became one country not through collaboration, but through conquest by the Piedmont in the far north, which might as well have been Sweden as far as many Italians were concerned. If you think of Italy as a boot, the Piedmont would be the knee. A knee the rest of the peninsula would feel at their throats.

Citizens of the newly formed Italian state had little shared history, so newly-crowned propagandists created one, often relying on Roman iconography. Over the following decades, nationalistic myths hypertrophied into fascism - also largely a Northern phenomenon. Italy’s defeat in World War II broke this fever, but at a huge cost. The War was, for Italy, also a civil war, mostly pitting North against South, breaking open all the fissures that had been plastered over at the nation’s birth.

Two industries recreated Italian identity following the war - the film industry, and the fashion industry. Film helped the country understand its experience with the war and the poverty that followed. Fashion gave Italians a new nationalistic myth. Its appeals were more to the artistic achievements of the Italian Renaissance than the empire-building of the Roman era, and it helped that the industry’s first successes were in Tuscany, birthplace of Michelangelo. The Sala Bianca in the Pitti Palace hosted the first Italian fashion show in 1951, as well as Brioni’s men’s fashion show, famously the first of its kind, in 1952. Italian designers were able to capture something of the uniquely Italian approach to luxury and craft that had eluded the stuffy couturiers and tailors of Paris and Savile Row. As post-war realist film gave way to Fellini’s surrealist fantasies, Marcello Mastroianni became the guy everyone wanted to look, dress, and act like. And he wore Italian suits.

Allure, but Insecure

By 1980, the industry had grown tremendously, but had become something different. It had mostly moved to Milan, the industrial behemoth of the North. And it had begun to shift its focus from brands like Brioni to emerging giants like Armani and Ferré. It was at this point that the “Made In Italy” campaign began, with the ambitious goal of branding an entire country. As one politico at Pitti’s “Opening Ceremony” said this year, “‘Made In Italy’ is not just about selling fashion - it’s about selling Italian quality of life.” “Made In Italy” was intended to convey more than just the country of origin: elegance, sophistication, craftsmanship - as if Leonardo da Vinci himself had blessed every stitch.

The campaign has been a massive success. Armani remains one of the most valuable brands in all of fashion. Gucci, Prada, and Zegna aren’t far behind. The manufacturing infrastructure that supports these brands is now also used by brands from Huntsman to Tom Ford to Ralph Lauren Purple Label, all of which are Made In Italy.

But the future is uncertain. At the Pitti’s Opening Ceremony, politician after politician announced their full support for the Italian fashion industry, for Pitti as a trade show, and their belief in the enduring allure of Italian luxury. Each one pledged a re-investment in “Made In Italy”. Which is what you do when you’re worried that a good idea’s time is running out.

by David Isle, Styleforum |  Read more:
Image: uncredited

The Thrill of Defeat
[ed. Being 'scooped' in science discoveries]

Thursday, February 26, 2015

[ed. Love you, buddy.]

Regulators Approve Tougher Rules for Internet Providers

[ed. Well, until the next administration anyway. See also: Brief History of the Internet.]

Internet activists declared victory over the nation's big cable companies Thursday, after the Federal Communications Commission voted to impose the toughest rules yet on broadband service to prevent companies like Comcast, Verizon and AT&T from creating paid fast lanes and slowing or blocking web traffic.

The 3-2 vote ushered in a new era of government oversight for an industry that has seen relatively little. It represents the biggest regulatory shake-up to telecommunications providers in almost two decades.

The new rules require that any company providing a broadband connection to your home or phone must act in the "public interest" and refrain from using "unjust or unreasonable" business practices. The goal is to prevent providers from striking deals with content providers like Google, Netflix or Twitter to move their data faster.

"Today is a red-letter day for Internet freedom," said FCC Chairman Tom Wheeler, whose remarks at Thursday's meeting frequently prompted applause by Internet activists in the audience.

President Barack Obama, who had come out in favor of net neutrality in the fall, portrayed the decision as a victory for democracy in the digital age. In an online letter, he thanked the millions who wrote to the FCC and spoke out on social media in support of the change.

"Today's FCC decision will protect innovation and create a level playing field for the next generation of entrepreneurs - and it wouldn't have happened without Americans like you," he wrote.

Verizon saw it differently, using the Twitter hashtag #ThrowbackThursday to draw attention to the FCC's reliance on 1934 legislation to regulate the Internet. Likewise, AT&T suggested the FCC had damaged its reputation as an independent federal regulator by embracing such a liberal policy.

"Does anyone really think Washington needs yet another partisan fight? Particularly a fight around the Internet, one of the greatest engines of economic growth, investment and innovation in history?" said Jim Cicconi, AT&T's senior executive vice president for external and legislative affairs.

Net neutrality is the idea that websites or videos load at about the same speed. That means you won't be more inclined to watch a particular show on Amazon Prime instead of on Netflix because Amazon has struck a deal with your service provider to load its data faster.

For years, providers mostly agreed not to pick winners and losers among Web traffic because they didn't want to encourage regulators to step in and because they said consumers demanded it. But that started to change around 2005, when YouTube came online and Netflix became increasingly popular. On-demand video began hogging bandwidth, and evidence surfaced that some providers were manipulating traffic without telling consumers.

By 2010, the FCC enacted open Internet rules, but the agency's legal approach was eventually struck down in the courts. The vote Thursday was intended by Wheeler to erase any legal ambiguity by no longer classifying the Internet as an "information service" but a "telecommunications service" subject to Title II of the 1934 Communications Act.

That would dramatically expand regulators' power over the industry and hold broadband providers to the higher standard of operating in the public interest.

by Anne Flaherty, AP |  Read more:
Image: uncredited

Blogger Porn Ban – Google's Arbitrary Prudishness is Attacking the Integrity of the Web

[ed. This post brought to you by Blogger. See also: Silicon Valley's War on Sex Continues.]

Google has steadily been cutting down on adult-oriented material hosted on Blogger, its blogging platform, over the last few years. Previously, bloggers could freely post “images or videos that contain nudity or sexual activity,” albeit behind a warning screen that Blogger implemented in 2013.

At the time, Blogger said that “censoring this content is contrary to a service that bases itself on freedom of expression”, so bloggers rightly assumed that they would be free to continue to post adult content.

But in a huge U-turn, Google has changed its position and decided that as of 23 March, there will be no explicit material allowed on Blogger unless it offers “public benefit, for example in artistic, educational, documentary, or scientific contexts” – all which will be determined by Google. Quite how they will do that has not been made clear.

Anything else that does not fall into this category will be restricted to private-only viewing, where only people who have been invited by the blog’s creator will be able to see them; they won’t appear in search results.

This is like having a public library where all the shelves are empty and all the books imperceptible to readers, and authors are required to stand there in person, handing out copies of their work to those hoping to read it. What Google is doing, in reality, is making these blogs invisible. It effectively kills them off.

Some people might read this and think: “Well, Google just doesn’t want to host porn for free any more, that’s why it’s bringing in these restrictions, what’s wrong with that?” To some extent, they’d have a point, because other blog platforms are available and if a user’s sole intent is to make money, then they’re a business and should pay for hosting, not expect to get it for free.

But this new policy has more far-reaching and long-term implications than just censorship and a loss of profit for those posting explicit content, and here’s an example of why: it breaks the internet.

My own personal blog (no explicit images, but graphic descriptions of sex) has had more than 8m readers over 11 years of being hosted on Blogger. If I was forced to make it private and invitation-only, there is no conceivable way that I could contact every single one of those readers and send them a password link to access it.

When I joined Blogger in 2004, I did more than just sign up to publishing a sex blog, I joined a community of people: other erotic writers, non-erotic writers, sex educators, feminist porn-makers, memoirists, political activists, journalists, photographers, news-sharers, comedians, artists, comic creators and more. A disparate bunch of people joined together by one thing in common: we all posted stuff on the internet and then shared it.

This network – indeed the Internet itself – is made up of links. You find a link, click through, and expect to arrive at a page containing some form of content, whether that be text, images, video, or audio files. From its inception, blogging has been about people sharing links; indeed, one of the UK’s first well-known blogs back in 1999 was the link-sharing LinkMachineGo.

By forcing blogs – any blogs, regardless of their content – to become private, it means the link to that blog will no longer work: people clicking through without a password would arrive on a non-existent page. Thousands of other bloggers and websites may have shared that blog’s link over some years, and as a result of this policy change, that link would effectively be dead. In essence, what this means is that a long-standing, interactive, supportive community will be killed off overnight.

by Zoe Margolis, The Guardian |  Read more:
Image: Alamy

Wednesday, February 25, 2015

Jennifer Cantwell, Letter home, 2011

Leigh Smith, selective memory, 2012

Kurt Vonnegut on the Shapes of Stories

[ed. Kurt Vonnegut: A Man Without a Country]

“The truth is, we know so little about life, we don’t really know what the good news is and what the bad news is.”

"Now let me give you a marketing tip. The people who can afford to buy books and magazines and go to the movies don’t like to hear about people who are poor or sick, so start your story up here [indicates top of the G-I axis]. You will see this story over and over again. People love it, and it is not copyrighted. The story is ‘Man in Hole,’ but the story needn’t be about a man or a hole. It’s: somebody gets into trouble, gets out of it again [draws line A]. It is not accidental that the line ends up higher than where it began. This is encouraging to readers. (...)

Now there’s a Franz Kafka story [begins line D toward bottom of G-I axis]. A young man is rather unattractive and not very personable. He has disagreeable relatives and has had a lot of jobs with no chance of promotion. He doesn’t get paid enough to take his girl dancing or to go to the beer hall to have a beer with a friend. One morning he wakes up, it’s time to go to work again, and he has turned into a cockroach [draws line downward and then infinity symbol]. It’s a pessimistic story. (...)

The question is, does this system I’ve devised help us in the evaluation of literature? Perhaps a real masterpiece cannot be crucified on a cross of this design. How about Hamlet? It’s a pretty good piece of work I’d say. Is anybody going to argue that it isn’t? I don’t have to draw a new line, because Hamlet’s situation is the same as Cinderella’s, except that the sexes are reversed.

His father has just died. He’s despondent. And right away his mother went and married his uncle, who’s a bastard. So Hamlet is going along on the same level as Cinderella when his friend Horatio comes up to him and says, ‘Hamlet, listen, there’s this thing up in the parapet, I think maybe you’d better talk to it. It’s your dad.’ So Hamlet goes up and talks to this, you know, fairly substantial apparition there. And this thing says, ‘I’m your father, I was murdered, you gotta avenge me, it was your uncle did it, here’s how.’

Well, was this good news or bad news? To this day we don’t know if that ghost was really Hamlet’s father. If you have messed around with Ouija boards, you know there are malicious spirits floating around, liable to tell you anything, and you shouldn’t believe them. Madame Blavatsky, who knew more about the spirit world than anybody else, said you are a fool to take any apparition seriously, because they are often malicious and they are frequently the souls of people who were murdered, were suicides, or were terribly cheated in life in one way or another, and they are out for revenge.

So we don’t know whether this thing was really Hamlet’s father or if it was good news or bad news. And neither does Hamlet. But he says okay, I got a way to check this out. I’ll hire actors to act out the way the ghost said my father was murdered by my uncle, and I’ll put on this show and see what my uncle makes of it. So he puts on this show. And it’s not like Perry Mason. His uncle doesn’t go crazy and say, ‘I-I-you got me, you got me, I did it, I did it.’ It flops. Neither good news nor bad news. After this flop Hamlet ends up talking with his mother when the drapes move, so he thinks his uncle is back there and he says, ‘All right, I am so sick of being so damn indecisive,’ and he sticks his rapier through the drapery. Well, who falls out? This windbag, Polonius. This Rush Limbaugh. And Shakespeare regards him as a fool and quite disposable.

You know, dumb parents think that the advice that Polonius gave to his kids when they were going away was what parents should always tell their kids, and it’s the dumbest possible advice, and Shakespeare even thought it was hilarious.

‘Neither a borrower nor a lender be.’ But what else is life but endless lending and borrowing, give and take?

‘This above all, to thine own self be true.’ Be an egomaniac!

Neither good news nor bad news. Hamlet didn’t get arrested. He’s a prince. He can kill anybody he wants. So he goes along, and finally he gets in a duel, and he’s killed. Well, did he go to heaven or did he go to hell? Quite a difference. Cinderella or Kafka’s cockroach? I don’t think Shakespeare believed in a heaven or hell any more than I do. And so we don’t know whether it’s good news or bad news.

I have just demonstrated to you that Shakespeare was as poor a storyteller as any Arapaho.

But there’s a reason we recognize Hamlet as a masterpiece: it’s that Shakespeare told us the truth, and people so rarely tell us the truth in this rise and fall here [indicates blackboard]. The truth is, we know so little about life, we don’t really know what the good news is and what the bad news is.

And if I die — God forbid — I would like to go to heaven to ask somebody in charge up there, ‘Hey, what was the good news and what was the bad news?’"

by Maria Popova, Brain Pickings |  Read more:
Image: Kurt Vonnegut

How to Avoid Rape in Prison

The Marshall Project put together this short film where former inmates explain how to avoid being sexually assaulted while incarcerated.

Cities Don’t ♥ Us

[ed. See also: A Last Ditch Effort to Preserve the Heart of the Central District, and Fixing Pioneer Square.]

Each day in New York an army of street-sweeping trucks fans across the boroughs, purportedly inhaling the litter and waste that parks itself in curbside crevices along residential blocks. (Commercial districts are typically cleaned overnight.) If you’ve ever seen one of these massive contraptions you’ve probably wondered how much they truly clean—rather than just disperse the dirt and debris to another location for the next day’s job—and whether they do more environmental harm than good. And if you happen to be a car-owning New Yorker, the sound of a street sweeper even one or two blocks away can easily trigger a chain of panicked questions starting with “What time is it?” followed by “What day is it?” before landing on “What side of the street am I parked on?”

Alternate-side parking is a part of life in New York City. Both for New Yorkers and the city they live in, which relies on parking violation revenue to provide city services. Last year alone the city raked in $70 million from 1.2 million alternate-side parking violations at $55 a pop. Is it any wonder the Dept. of Sanitation fought a recent proposal that would allow car owners to re-park as soon as the street sweeper finished its work—rather than waste countless hours idling just to honor the official parking rules?

New York’s parking wars embody the modern city’s twisted relationship with its dwellers. Officials know street sweeping is largely ineffective and environmentally harmful. They know the fine bears no relation to the underlying offense (spare me the “social cost” argument) and targets working people living in the low-income outer-borough neighborhoods where parking is tight and cars are essential since mass transit is less available. They know that even if everyone earnestly tries to follow the law, there aren’t nearly enough spots for everyone during alternate-side parking times. They know the average urbanite has zero sympathy (disdain is more like it) for drivers even though the billions the city rakes in each year from bridge and tunnel tolls subsidize their train and bus commutes.

The suggestion that alternate-side parking fines exist for any reason other than revenue is vulgar and pretentious. Yet no official, elected or otherwise, will ever come out and admit alternate-side parking rules have been engineered to extract what amounts to a backdoor tax. Doing so would undermine the movement heralding the smart city as humanity’s redeemer. Cities, we are told again and again by the sustainability expert, are our destiny. (...)

Yet this is the crux of urbanism’s shell game. I don’t believe white urbanites are an inherently favored species. Stock images of attractive white couples may adorn the latest luxury condo, but not because urbanism has a special place in its heart for them. It’s economics, pure and simple. This, of course, contradicts the prevailing propaganda pumping out of government public relations offices across the country. The modern city cares about our health and wellness. It wants to be livable, sustainable, and walkable—vibrant. It wants to provide us with amenities and opportunities to experience culture, food, and community. According to the urbanist, the city wishes us to believe it can be both affordable and upscale. That it is invested in our children’s education, our safety, our careers. It is all things to all people. It has a heart.

And in fact, the city will sometimes tease us. The train you desperately need will arrive on time. There will be parking on the block, an open table at a new restaurant. Your favorite artist will be playing in the park, for free. In that moment, you will believe that things could not be any better than they are. You will feel the soothing satisfaction of having made the right choice in life. You will forget the infinite frustrations and heartaches you endure. You will rationalize your overpriced micro-dwelling as a social good. You will believe the life the city offers has been created to suit your unique and discriminating needs and tastes. And you will be wrong.

Here’s what really happens. First, a city hires a think tank to come up with a revitalization plan (pdf). That plan typically entails attracting young people with skills and education and retaining slightly older people with money and small children. Case in point: Washington, DC, in the early 2000s. As I’ve written elsewhere (pdf), in 2001 Brookings Institution economist Alice Rivlin published a report entitled “Envisioning a Future Washington” in which she mapped a revitalization plan that became a blueprint for gentrification. Urban planning and design firms are then hired to figure out how to make a city more desirable to these people. They conduct surveys, mine the data, and issue reports that award these people a flattering label like “creative class” and pronounce what they are looking for and how cities can attract/retain them. What we see happening in cities across America is the result: an unmitigated backlash against the era of sprawl and its accomplices—strip malls, subdivisions, and big-box chains—nothing more, nothing less.

Indeed, the true genius of urbanism is that the marketing campaigns promoting it have seized upon a search for meaning that traditional institutions can no longer satisfy, promising, if only implicitly, to fill the gap. Just look at the shimmering, stylized artist renditions accompanying every new upscale urban development. Rays of light from the heavens above shower the newly paved sidewalks, reflecting boundlessly off the glass buildings and brightening the lives of the multi-hued populace carrying fresh fruits and vegetables in their canvas tote bags.

Urbanism has become the secular religion of choice practiced with the enthusiasm of a Pentecostal tent revival, and the amenitized high-rise the new house of worship. It, after all, promises to fulfill or at least facilitate all of one’s needs while on Earth—with everything from rooftop community gathering space to sunlit Saturday morning yoga classes in the atrium.

This isn’t a new idea. In his celebrated and remarkably enduring 1949 essay, “Here is New York” (pdf), E.B. White addressed the spiritual life that a city offers:
Many people who have no real independence of spirit depend on the city’s tremendous variety and sources of excitement for spiritual sustenance and maintenance of morale … I think that although many persons are here from some excess of spirit (which caused them to break away from their small town), some, too, are here from a deficiency of spirit, who find in New York a protection, or an easy substitution.
White’s essay isolates the beauty of New York: It is a love letter. By all means, I invite you to be taken with it; I am. His city offers the range of rewards—sights and sounds and things to do. I marvel at the way White’s city operates, the way it manages to instill order and achieve artistry. In White’s capable hands, cities are humanity’s premier expression of civilization.

Urbanism, as well, has deftly aligned itself with human progress. It trumpets terms like “smart growth,” “sustainability,” “resilience,” and “scalability” to demonstrate both its concern with the quality of our lives and its progressive street cred. It champions urban “green space” as the solution for everything from obesity to asthma. But green spaces aren’t even parks. Often people can only use them during prescribed times and in particular ways—concerts, film screenings, seasonal outdoor markets. Moreover, they’re usually owned by a developer who likely built them as a concession for a sweet deal on the land. Yet this is what we celebrate? A paltry scrap of flora? Which raises a question Thomas Frank posed in his Baffler essay skewering the “vibrancy” movement so many cities have staked their futures on:
… [W]hy is it any better to pander to the “creative class” than it is to pander to the traditional business class? Yes, one strategy uses “incentives” and tax cuts to get companies to move from one state to another, while the other advises us to emphasize music festivals and art galleries when we make our appeal to that exalted cohort. But neither approach imagines a future arising from something other than government abasing itself before the wealthy.
To be fair, in as much as cities can be said to have a consciousness, they fully comprehend their vulnerability. Urban planners know perfectly well that if the delicate balance between safety and prosperity is lost, then disinvestment and abandonment can strike. But they have also learned that people can be manipulated to identify with the city and thereby tolerate just about anything it dishes.

by Dax-Devlon Ross, TMN | Read more:
Image: Steven Guerrisi

Tuesday, February 24, 2015

Whistlin' Dixie

Driving south from the North, we tried to spot exactly where the real South begins. We looked for the South in hand-scrawled signs on the roadside advertising ‘Boil Peanut’, in one-room corrugated tin Baptist churches that are little more than holy sheds, in the crumbling plantation homes with their rose gardens and secrets. In the real South, we thought, ships ought to turn to riverboats, cold Puritanism to swampy hellfire, coarse industrialists with a passion for hotels and steel to the genteel ease of the cotton planter.

Most of what we believe about the South, wrote W.J. Cash in the 1930s, exists in our imagination. But, he wrote, we shouldn’t take this to mean that the South is therefore unreal. The real South, wrote Cash in The Mind of the South, exists in unreality. It is the tendency toward unreality, toward romanticism, toward escape, that defines the mind of the South.

The unreality that shaped the South took many forms. In the South, wrote Cash (himself a Southern man), is “a mood in which the mind yields almost perforce to drift and in which the imagination holds unchecked sway, a mood in which nothing any more seems improbable save the puny inadequateness of fact, nothing incredible save the bareness of truth.” Most people still believe, wrote Cash — but no more than Southerners themselves — in a South built by European aristocrats who erected castles from scrub. This imaginary South, wrote Cash, was “a sort of stagepiece out of the eighteenth century,” where gentlemen planters and exquisite ladies in farthingales spoke softly on the steps of their stately mansions. But well-adjusted men of position and power, he wrote, “do not embark on frail ships for a dismal frontier… The laborer, faced with starvation; the debtor, anxious to get out of jail; the apprentice, eager for a fling at adventure; the small landowner and shopkeeper, faced with bankruptcy and hopeful of a fortune in tobacco; the neurotic, haunted by failure and despair” — only these would go.

The dominant trait of the mind of the South, wrote Cash, was an intense individualism — an individualism the likes of which the world hadn’t seen since Renaissance days. In the backcountry, the Southern man’s ambitions were unbounded. For each who stood on his own little property, his individual will was imperial law. In the South, wrote Cash, wealth and rank were not so important as they were in older societies. “Great personal courage, unusual physical powers, the ability to drink a quart of whiskey or to lose one’s whole capital on the turn of a card without the quiver of a muscle — these are at least as important as possessions, and infinitely more important than heraldic crests.”

The average white Southern man (for this man was Cash’s main focus) was a romantic, but it was a romance bordering on bedlam. Any ordinary man tends to be a hedonist and a romantic, but take that man away from Old World traditions, wrote Cash, and stick him in the frontier wilds. Take away the skepticism and realism necessary for ambition and he falls back on imagination. His world becomes rooted in the fantastic, the unbelievable, and his emotions lie close to the surface. Life on the Southern frontier was harsh but free — it could make a man’s ego feel large.

The Southern landscape, too, had an unreal quality, “itself,” wrote Cash, “a sort of cosmic conspiracy against reality in favor of romance.” In this country of “extravagant color, of proliferating foliage and bloom, of flooding yellow sunlight, and, above all, perhaps, of haze,” the “pale blue fogs [that] hang above the valleys in the morning,” the outlines of reality blur. The atmosphere smokes “rendering every object vague and problematical.” A soft languor creeps through the blood and into the brain, wrote Cash, and the mood of the South becomes like a drunken reverie, where facts drift far away. “But I must tell you also that the sequel to this mood,” wrote Cash, “is invariably a thunderstorm. For days — for weeks, it may be — the land lies thus in reverie and then …”

The romanticism of the South, wrote W.J. Cash, was one that tended toward violence. It was a violence the Southern man often turned toward himself as much as those around him. The reverie turns to sadness and the sadness to a sense of foreboding and the foreboding to despair. Nerves start to wilt under the terrifying sun, questions arise that have no answers, and “even the soundest grow a bit neurotic.” When the rains break, as they will, and the South becomes a land of fury, the descent into unreality takes hold. Pleasure becomes sin, and all are stripped naked before the terror of truth.

by Stefany Anne Golberg, The Smart Set |  Read more:
Image: uncredited

The Drug Technology Boom

Cannabis is joining the coca leaf in the ranks of drugs improved by technology.

Concentrates are made from the entire plant, while the smokable flower is only a part. To sell flowers, plants must be cut, trimmed, and dried. The process takes weeks and a lot of manpower. For refined product, the entire plant can be processed without having to wait. Solvents turn live plants into BHO (butane hash oil) on the spot. Those who produce concentrates have a sellable product within 24 hours, where it would take weeks to properly prepare buds.

Dabs are refined marijuana with highly concentrated doses of THC, which explains why, when I tried it, I could feel my thoughts. About 50 percent of the marijuana flower is vegetative plant matter and is weeded out in clarification. What’s left are the cannabinoids and terpenes (the good stuff with the medical applications and flavor).

Marijuana is oil- and fat-soluble, which is why edibles (like brownies) work. When the flowers are simmered in butter, the butter takes on the intoxicating components of the plant, which can then be strained out; boiling it in water would just get it hot and wet. Once ingested (however they’re ingested) and absorbed into the bloodstream, fat-soluble drugs collect in fatty tissue like the brain.

Whether a substance is fat- or water-soluble depends on polarity. Fat-soluble materials are nonpolar, meaning they lack electrical charge. Nonpolar molecules are held together by covalent bonds, in which electrons are shared between connected atoms. Because the cannabinoids are fat-soluble, concentrates are made with oil-based solvents (often butane) to suspend them. This separates the essential oils from the plant matter.

For those of us who weren’t in AP Chemistry, a solvent is a material that dissolves another, chemically different material (the solute). There are water-based solvent-extraction techniques, but they are less popular.

“Blasting” (extracting concentrated THC from the plant) is complicated, dangerous, and easy to find on YouTube.

The solvent is introduced to the solute (dank, loud, nugs, and other stupid names), resulting in a yellow liquid, as long as ventilation has been good enough to prevent suffocation by fire.

The product of the last chemical reaction becomes the reactant in the next one. The liquid is heated to evaporate the remaining harsh chemicals until it resembles delicious crème brûlée. The material is then moved into a vacuum chamber where the pump bubbles and fluffs it some more. (The strict need for a vacuum chamber is a point of some Reddit dispute, but it remains the favored process.)

When the fluff looks like gooey home insulation, it returns to the heat source to take its final form. Once isolated, the beneficial compounds can take a variety of structures. A few variables determine the consistency, but temperature is the big one.

“Shatter,” as it’s referred to in this state, looks like the meth seen in Breaking Bad, but yellow. It’s solid yet fragile and breaks apart easily. Shatter is subjected to an additional process to extract the lipids, fats, waxes, and terpenes. It’s the purest of the refined products. Good shatter can reach over 80 percent THC.

“Wax” looks like a mother extracted it from a 14-year-old boy’s ear. It still has the terpenes, which makes it more flavorful but less potent than shatter—usually 70 to 80 percent. Of the three most popular concentrates, it’s definitely the one I’d most like to get under a microscope.

“Honey oil” looks a bit like shatter and feels a lot like maple syrup. It’s the least refined and most flavorful of the three. (...)

Since concentrates are better vaped than smoked, their growing popularity is changing the apparatuses used to consume them. The first vaporizer I ever used was a direct-inhale box. If you tried to get someone stoned with that today, they’d laugh at you. They were analog, complicated, and hard to use. I never once felt that I got stoned, and oh lord did I try. (...)

The two technologies are evolving alongside one another. Since that pioneering stoner first invented the apple pipe, potheads have been seeking to optimize consumption.

“Some people lack the skills to become accomplished stoners,” Greene laughed. “If you hold the lighter the wrong way, you burn your thumb; everyone can push a button.”

I don’t know if I agree. Ritual is part of addiction. Don’t get me wrong: I think pot is great and everyone should get stoned, but I am, without a doubt, addicted. My dad hasn’t smoked since the ’80s, but he still prides himself on rolling great joints (and put to the test, he will). Similarly, I get a rush out of tearing open a new bag of Agent Orange (or Green Crack or another unfortunately named strain), ripping apart the bud, and packing the bowl. The texture of flowers crumbling between my fingers is an intoxicating part of the experience.

With my vape, I’m beginning to appreciate a new set of rituals. I play with the temperature settings, pack the oven with a particular gold pencil, and meticulously scrape the lid clean. These aren’t the same rituals that make me feel like I’m in my childhood bathroom blowing smoke into the ass end of a fan while listening to Belle and Sebastian’s Tigermilk, but there’s magic in the new as well.

by Cece Lederer, The Kernel | Read more:
Images: Imgur, Andres Rodriguez/Flickr (CC BY 2.0), Vjiced/Wikimedia Commons (CC BY-SA 3.0)


[ed. Full album here.]

The Glacier Priest

[ed. Amazing how everything looks much the same. The Wikipedia entry doesn't mention it, but I assume the Hubbard Glacier was named after the Rev. Bernard Hubbard, remembered by Newsweek after he died as: "the Glacier Priest, a tireless Jesuit who led 32 expeditions to Alaska and once listed the requisites of an explorer as 'a strong back, a strong stomach, a dumb head, and a guardian angel.'"]

Winter Forever

The New York Times has something to say about the season:
Long stretches of painfully frigid weather, brief respites, then more snow, ice and freezing rain. Freeze, thaw, repeat — the cycle expands in the mind to unbearable infinities, the unyielding sensation of being trapped. But check the calendar: This week means we are officially in late February, which means March. March means daffodils, which means this all must eventually end.
It’s a nice thought, but I am here to tell you that, like all hope, this optimism is deeply misplaced. Sure, the calendar will change. The birds will sing again. The scent of fresh urine will alight on the nose as you wander the streets. Perhaps it will even once more grow so warm that we shall shed the heavy layers of clothing with which we bundle ourselves before each undertaking outdoors. But mark my words: What you’ve seen this winter and the winter before can never be erased. There is no return from such raw horror. However hard the sun shines down on you in future days you will always carry this winter around in your cold, barren heart. The light that once danced and played behind your eyes has been permanently dimmed and replaced by a mean, dull glare that stares shivers into anything it surveys. If the human race lasts another hundred years each member of the species will carry within its shattered soul a darkness so intense that all the trees of the fields shall bend their branches away in fear from its frigid malevolence. Winter will never end, nor will you ever rid yourself of it. It is in you, and of you. It is you. There is no turning back, ever. Your stock of sorrows will freeze and, having frozen, crack into a thousand tiny icicles that stab sadness into any spare bit of joy that threatens to melt your bitter, broken spirit. When you walk you carry with you the icy frost of death which, with the wind chill factor, feels like negative five degrees icy frost of death. The only warmth left for you will come when you are lowered for the last time into the ground’s final embrace.

by Alex Balk, The Awl | Read more:
Image: via:

Monday, February 23, 2015

Donna Summer

[ed. Hey, didn't Lady Gaga kill it at the Oscars last night? Sound of Music covers. Huh. Who would have thought?]

How Pop Made a Revolution

Yeah! Yeah! Yeah!: The Story of Pop Music from Bill Haley to Beyoncé, Bob Stanley, W.W. Norton, 624 pages

I wish I could say that my love of pop music began when my middle school music teacher showed me a documentary called “The Compleat Beatles.” That would be the socially acceptable, hipster-sanctioned origin story. But truthfully, the affair began a couple years earlier in 1986, when I conspired with some friends to flood our local top 40 station with requests for the song “Rock Me Amadeus.” Dismayed that Falco’s masterwork had slipped in the charts, we resolved to do whatever we could to reverse its fate. This was either true love or something equally intense—a force that could drive a 12-year-old boy to cold-call a radio station and then sit next to the stereo for hours with finger poised over the tape-record button, enduring songs by Mister Mister and Starship, just waiting for that descending synth motif to issue forth from the speakers.

I’m not terribly surprised that “Rock Me Amadeus” receives no mention in Bob Stanley’s new book. While the song embodies the very essence of pop—it is quirky, flamboyant, goofily ambitious, yet so very of its moment—it was ultimately a failed experiment, a novelty hit. (Though, to be fair, it was no less kitschy than The Timelords’ “Doctorin’ The Tardis,” which does receive mention.) I listen to it now and wonder what the hell my 12-year-old self was thinking. But that’s love, right? It rarely makes sense after it has passed. Stanley clearly knows something about the fever dream of the besotted pop fan, and much of his book is written from that headspace.

What a joy it is to find a music writer who didn’t get the rock-critic memo—the one that says you’re supposed to worship at the altar of punk rock, praise Radiohead, and hate the Eagles. Stanley has plenty of nice things to say about the Eagles, the Bee Gees, Hall and Oates, and Abba. Conversely, he has nothing but contempt for The Clash, those self-anointed exemplars of punk rock. “The Clash set out parameters,” he writes, “and then squirmed like politicians when they were caught busting their own manifesto.” (Stanley prefers the more self-aware Sex Pistols.) Radiohead fare even worse; he describes these critical darlings as “dad rock.” Vocalist Thom Yorke sings “as if he was in the fetal position.”

Of Bob Dylan, a figure as close to a saint as we get in the annals of rock lit, Stanley writes: “along with the Stones he sealed the concept of snotty behavior as a lifestyle, snarled at the conventional with his pack of giggling lickspittle dogs, and extended Brando’s ‘What have you got?’ one-liner into a lifelong party of terse putdowns.” For those of us who grew up reading far too many issues of Rolling Stone for our own good, this is bracing tonic indeed.

What gives Stanley the edge over so many other music journalists is the fact that he is a songwriter himself, and a fairly successful one at that: his band Saint Etienne had a string of UK Top 20 hits in the 1990s. It is easier for musicians than for non-musician critics, I believe, to see beyond genre boundaries and appreciate tunefulness wherever it may reside. Stanley, who I’m pretty sure would rather be known as a “musician who writes” than a “writer who plays music,” takes a more expansive view of the term “pop” than a lot of other writers might do. In his view, pop simply means “popular.” It is not, as is typically imagined, a specific sound—say that of a Britney Spears or Katy Perry. Under Stanley’s definition, Nirvana qualifies as pop. So do Pink Floyd, Black Sabbath, and Glen Campbell.

At 624 pages, Yeah! Yeah! Yeah! is a doorstop of a book, but Stanley’s enthusiasm for the material keeps the narrative moving briskly. He can get inside a song and describe its magic to outsiders like no one else I have ever come across. Consider the following highlights: Of the “clattering, drum-heavy” mix of Bill Haley’s “Rock Around the Clock,” he writes, “It sounded like jump blues, only with someone dismantling scaffolding in the studio.” On Abba: “No one musician stands out on any of their hits because they don’t sound like anyone played an instrument on them; they all sound like a music box carved from ice.” On the Police: “Singer Sting had a high, mewling voice that, appropriately, sounded a little like the whine of a police siren.” And, as is probably apparent already, Stanley is very effective with the terse putdown. My favorite concerns The Cure—a band that has spawned an entire cottage industry of mopey imitators: “It was all somehow powdery and a little slight,” he writes. “The Cure were more about stubbing your toe than taking your life.” Ouch.

Stanley’s two preoccupations throughout the book are innovation and craft, in that order. He gives a lot of space to sonic pioneers like Joe Meek, Phil Spector, and later the architects of early hip-hop and house music, detailing how each wave of experimentation inevitably made its way into the heart of the mainstream sound, eventually becoming calcified until the next upheaval came along to shake things up.

by Robert Dean Lurie, The American Conservative | Read more:
Image: The Beatles / Wikimedia Commons